Envision computers

I’ve been challenged. I attended the PS Suite EMR User Conference, May 29-31, 2014, in Markham, Ontario, and shared some ANGST (http://mdangst.com/2014/06/ps-suite-emr-u…nference-angst/). The challenge: how can we envision computers working more like the human brain? Interestingly, most people agree that they feel jammed into a different way of thinking when using the EMR. I’ve got agreement there.


So I’ve won the argument. End of blog.

[Image: brain. Photo by Georg Holderied via Compfight]

I’m a family MD, not a neurologist. A bit of a weird FP, to be sure; you’re reading this blog, and how many FPs write one? An artist (www.ronireland.com), so a bit further out on the curve. Then that M.Ed thing, from an arts faculty (Concordia, Portland), which shoved me out a bit more. Oh. I do hypnosis, too. OK, I’m truly out there.


Envision computers like the human brain. Possible? NO. Let me count the ways. Forgive me if it isn’t a sonnet. The computer cannot see, hear, touch, feel, or emote. It cannot think creatively, perceive, or self-direct. It’s a tool, a computing tool, a “computer”. It’s not alive! It is utterly, completely unlike a brain, yet we think of it as one. When I discuss brain function with people, I use the computer as a comparison.


But it isn’t even close.


How can we nudge it further along the path? Or, at least, how can we evolve this tool enough so it can get out of our way, and truly help us? One of my mantras is, “Have patience, we’re the transition generation. Our kids won’t have to do this. Our efforts are making it better for them…”


I want it now. Out of my way, truly assisting me. Right now it feels like a car I’m pushing half the time. Sometimes out of snowbanks.


Let’s start with the easy ones. Envision computers… just better. Not human. But…


Touching. Recently, at that PS Suite User Conference, I encountered new technology from Welch Allyn that allowed measured vital signs, including temperature, oximetry, and HR/BP, to go right onto the EMR. Equipment exists to do the same with EKGs, Holters, and spirometry. That’s sort of feeling, touching. Sort of like the computer is reaching out to examine the patient! Scenes from Terminator! Envision computers with all this patient data, or a subset of it, available by the time you walk into the room.

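For the technically inclined, here’s a toy sketch of that hand-off. Everything in it is invented for illustration (the field names, the post_to_chart function); real device integrations typically speak a messaging standard like HL7. The point is just that the reading arrives timestamped, attached to the right patient, before you walk in.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class VitalSigns:
    """One set of readings pushed by a (hypothetical) exam-room device."""
    patient_id: str
    taken_at: datetime
    temp_c: float        # temperature
    spo2_pct: int        # oximetry
    heart_rate: int      # HR
    bp_systolic: int     # BP
    bp_diastolic: int

def post_to_chart(chart: dict, vitals: VitalSigns) -> None:
    """File the reading under the patient, so it's on screen before the
    physician enters the room. A stand-in for a real EMR API."""
    chart.setdefault(vitals.patient_id, []).append(vitals)

# The device fires this the moment the cuff deflates:
chart = {}
post_to_chart(chart, VitalSigns("pt-001", datetime.now(), 36.8, 98, 72, 118, 76))
```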

Seeing. I don’t know about you, but one prerequisite of taking on a resident is a ceiling-mounted camera. Do you tape some of your encounters? Residents tape themselves interviewing, or performing non-genital exams. The gadget picks up audio as well, of course. Hearing. What if all that information went onto the EMR? I know video takes substantial storage, but aren’t they always telling us that storage is mushrooming?


Then I got thinking about Google Glass. One of my kids was watching one of those real-life police shows, where there is a dash-mounted camera recording the whole event. Probably those things are there to protect everybody from frivolous lawsuits, to do checks on technique, etc., sort of like our exam-room cameras. What if the MD were wearing Google Glass, recording the whole interaction, video and audio? What if that went into the EMR, under the date seen, in a little icon one could click open if needed? Would that be too intrusive for the patient? What if we could turn the video off briefly for those private exams? Envision computers…on your head!


I started community practice in 1988, taking many patients over from a revered community family physician who wrote his charts on recipe cards. Years were documented on one side of one card. Entries were: date, diagnosis, drug. For example, Jan 14/70, URTI, Pen V. Usually no doses were documented. Occasionally an odd finding was jotted down, something like “HUGE tonsils”. Otherwise, it was pretty bare-bones. It was really pretty standard for that time period. The physician in those days would have to rely on his memory to flesh out details if needed later.


How long do you think those records took to make?


I’m suggesting those brief chart progress notes are soon to be seen again. Apart from those automated vitals, they’ll be date, diagnosis, intervention. One line. And an icon to click to see the video/audio loop if needed.

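As a sketch, assuming a made-up ProgressNote record: the whole entry is three fields, and the recording_ref stands in for that clickable icon pointing at the archived loop.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ProgressNote:
    """One encounter, recipe-card style: date, diagnosis, intervention."""
    seen_on: date
    diagnosis: str
    intervention: str
    recording_ref: Optional[str] = None  # the icon: a pointer to the
                                         # stored video/audio loop

# The 1970 recipe card, and its imagined descendant:
old = ProgressNote(date(1970, 1, 14), "URTI", "Pen V")
new = ProgressNote(date(2014, 6, 10), "URTI", "reassurance",
                   recording_ref="recordings/2014-06-10/pt-001.mp4")
```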

How do I foresee an interaction occurring? Why, just like today! Except that now, the physician will have to verbalize his physical findings during the exam, like, “no neck nodes palpable.” This will only help communication with the patient. The Glasses may even come with some reminders built in, or a tool to help measure angles of back flexion, etc. They could time the intervention, and remind the physician to consider billing a time-based code if the visit qualified. Leaving the room, the physician could verbalize the billing code and diagnosis, and get on to the next patient. Progress-note charting and billing would simply be done.

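Sketched out, with every name and threshold invented for illustration (the 20-minute cutoff and code labels are made up, not real billing rules), the walk-out-the-door moment might amount to no more than this:

```python
from datetime import datetime, timedelta

def suggest_billing_code(duration: timedelta) -> str:
    """Hypothetical rule: nudge toward a time-based code once the visit
    runs long enough to qualify."""
    if duration >= timedelta(minutes=20):
        return "time-based code"
    return "standard visit code"

def close_encounter(started: datetime, diagnosis: str,
                    findings: list[str], recording_ref: str) -> dict:
    """Assemble the one-line note and the billing suggestion as the
    physician verbalizes the diagnosis and leaves the room."""
    now = datetime.now()
    return {
        "note": f"{now.date()}, {diagnosis}, " + "; ".join(findings),
        "recording": recording_ref,
        "suggested_code": suggest_billing_code(now - started),
    }

print(close_encounter(datetime.now() - timedelta(minutes=25), "URTI",
                      ["no neck nodes palpable"], "recordings/pt-001.mp4"))
```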

Computers, for all their shortcomings, are great at remembering. The videos would be a great improvement over our remembered interventions. Sometimes I’m forced to document at the end of the day, after 40 sometimes emotionally taxing encounters! Don’t you? Ever have to stretch those memory muscles pretty hard?


How does the human brain sort through data during the day? The left brain simply files most of the stuff away. It quickly labels and brushes the desk clean. Now this whole process is ripe for distortion, but there’s something to be said for it. I spend about the same amount of time reading a consult letter that confirms my preconceived ideas of normalcy, or lack of pathology, as I do reading a letter that sets off alarm bells. One requires action, the other filing. There must be some way of quickly identifying the data that matters more, so that attention and time can be spent on those issues. Envision computers automatically listing and updating diagnoses!

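Here’s a deliberately naive sketch of that sorting, assuming incoming letters are plain text and that a few keywords can split “filing” from “alarm bells” (a real system would need something far smarter, and structured data from the consultant):

```python
NORMAL_MARKERS = {"unremarkable", "within normal limits", "no pathology"}
ALARM_MARKERS = {"malignant", "suspicious", "urgent", "abnormal"}

def triage(letter: str) -> str:
    """Naive keyword triage of an incoming consult letter."""
    text = letter.lower()
    if any(word in text for word in ALARM_MARKERS):
        return "ACTION"   # top of the pile; spend the time here
    if any(word in text for word in NORMAL_MARKERS):
        return "FILE"     # one glance, brush the desk clean
    return "REVIEW"       # the computer isn't sure; a human reads it

inbox = ["Colonoscopy unremarkable; repeat in 10 years.",
         "Biopsy suspicious for malignancy; urgent referral arranged."]
for letter in inbox:
    print(triage(letter), "-", letter)
```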

It will probably require consultants sending their consults electronically, in some standard fashion that lists diagnoses, perhaps numerically…

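If consultants did send consults that way, the receiving end could be as simple as the sketch below. The message shape is invented; the numeric code shown is ICD-10 style (J06.9 is the ICD-10 code for unspecified acute upper respiratory infection), purely as an example of “numerically.”

```python
from dataclasses import dataclass, field

@dataclass
class Consult:
    """A hypothetical structured consult message."""
    patient_id: str
    consultant: str
    diagnoses: dict = field(default_factory=dict)  # code -> label
    narrative: str = ""

def merge_diagnoses(problem_list: dict, consult: Consult) -> None:
    """The EMR updates the patient's diagnosis list automatically."""
    problem_list.update(consult.diagnoses)

problem_list = {}
merge_diagnoses(problem_list,
                Consult("pt-001", "Dr. Example, ENT",
                        diagnoses={"J06.9": "Acute URTI, unspecified"}))
print(problem_list)  # {'J06.9': 'Acute URTI, unspecified'}
```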

Now that’s a hurdle!


And it would help if I just called my computer something. Some name. Make it more human.


Maybe, “HAL.”

[Image: iRobot Eye v2.0. Photo by Tc Morgan via Compfight]
