Wednesday, January 14, 2015

mHealth Revisited

Picture it: about 300 years into the future.  Humans have finally left the confines of Earth and are roaming the galaxy.  Upon encountering a new planet, they decide to send an away team.  On the planet, unfortunately, one of the team members is exposed to a lethal dose of radiation.  The accompanying medical officer pulls out his medical tricorder.  Not only can this device scan the person to check vitals and the cause of illness or injury, it also sends that information back to the starship's central computer, which can offer therapeutic advice, pull up the individual's past medical records, and store the data for future care.


Ok, by now I am sure you can tell that I am a Trekkie.  A proud one!  But does Gene Roddenberry paint a picture that far off the mark, or could this be a possibility one day?

Let's bring our timeline back to January 13, 2015.  At work I stopped to talk to a coworker.  She proudly showed me a new device she had purchased that tracked her steps, calories burned, and other data.  OK, so this isn't as sophisticated as Star Trek's medical tricorder, but it is just the beginning of a new frontier called mobile health, or mHealth.


Mobile health is still in its infancy; however, it is growing swiftly.  With the proliferation of smartphones and tablet devices, we are now able to take more technology with us.  As a matter of fact, according to Howard Larkin (2011), app stores for many of the popular smart devices collectively held over 17,000 medical-related apps in 2011 (Larkin, 2011, p. 24).  Nowadays you can get apps that track your running times and speeds, apps that monitor the way you sleep, and even apps that act as your own personal trainer in the gym.  And these are just the consumer apps.  Milosevic, Milenkovic and Jovanov (2013) showed how UAH created apps, named sTUG and mWheelness, that allowed them to monitor the movement of people at fall risk (Milosevic et al., 2013, p. 47).  These apps were used in specific studies to gather information for the care of future patients.
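
For a rough sense of what a consumer step tracker might be doing under the hood, here is my own sketch of threshold-based step counting from accelerometer readings.  The method and the 11.0 m/s² threshold are illustrative assumptions on my part, not details from the cited papers or any real device:

```python
import math

def count_steps(samples, threshold=11.0):
    """Count steps as upward crossings of an acceleration-magnitude
    threshold.  `samples` are (x, y, z) accelerometer readings in m/s^2.
    The threshold of 11.0 m/s^2 is an illustrative guess, not a tuned value.
    """
    steps = 0
    above = False
    for x, y, z in samples:
        magnitude = math.sqrt(x * x + y * y + z * z)
        if magnitude > threshold and not above:
            steps += 1        # rising edge above the threshold = one step
            above = True
        elif magnitude <= threshold:
            above = False     # re-arm once the signal drops back down
    return steps
```

Real trackers are far more sophisticated (filtering, adaptive thresholds, gait models), but the core idea of counting peaks in a sensor stream is the same.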


This new way of gathering information and providing therapy is important because it is paving the way to collecting mountains of valuable data that can be used for future treatments and preventative care.  There are some cases where it is being used to help people in remote parts of the world, with little access to healthcare professionals.  It is helping to prevent sedentary lifestyles for some and becoming the first line of defense against illness.  What about the future, you ask?


The future looks bright for these technologies.  Just recently, with the release of the iPhone 5s in 2013, Apple included a new microchip specifically designed for "health applications."  There has also been a surge of wearable devices that can form a body area network (BAN) for those who want one.  What about Roddenberry's tricorder?  Well, according to Adam Pliskin (2014), Google, taking a cue from Star Trek, intended to create a system like the medical tricorder of that television series (Pliskin, 2014).


How is that for “going where no man has gone before?”

References:

Larkin, H. (2011). mHealth. Hospitals & Health Networks, 85(4), 22-6, 2. Retrieved from http://search.proquest.com/docview/865328282?accountid=14552


Milosevic, M., Milenkovic, A., & Jovanov, E.  (2013).  mHealth @ UAH: Computing infrastructure for mobile health and wellness monitoring.  XRDS, 20(2), pp.43-49.  DOI: 10.1145/2539269


Pliskin, A. (2014).  Google X Aims To Build ‘Star Trek’ Tricorder And Change Healthcare. Elite Daily. Retrieved from http://elitedaily.com/news/technology/google-x-building-tricorder/834907/

Wednesday, October 2, 2013

Barriers to Accountability in Health Information Technology

Let's go over the barriers from Nissenbaum (1996).  These are "the problem of many hands," "bugs," "the computer as a scapegoat," and "ownership without liability" (Nissenbaum, 1996).  Let us discuss each in detail, along with possible remedies.

Problem of many hands
Here is the issue and why it is a barrier.  According to Dennis Thompson (2011), when things go wrong we look for someone to hold responsible.  In the case of large organizations (e.g., healthcare vendors) it can be very difficult to pinpoint one person, because many people are involved, and singling out an individual could be an unfair judgment if that person could not have prevented the event (Thompson, 2011).  On the other hand, if we hold the entire organization accountable, this could unfairly affect people who are not responsible at all (Thompson, 2011).  Thompson proposes one way around this barrier: "prospective design responsibility" (Thompson, 2011).  This means making an independent body responsible for designing the processes the organization must follow in order to monitor production and avert malfunctions.  This body would then be held responsible if it failed to fix broken processes or ignored warnings of failure (Thompson, 2011).

Bugs
Bugs, or coding errors (or, as a company I worked for called them, "undocumented features"), are a natural part of coding computer applications.  If one were to count the lines of code in any large, complex application, one would most likely count up into the millions.  Inevitably, there will be errors in the code.  These errors can be simple mistakes, problems introduced by interactions with other features, or, on the more sinister side, errors that are known and ignored.  When these cause issues they can become a barrier to accountability, because one can say, "it's just a bug in the software that was overlooked."  So how can there be accountability for the bugs that were intentionally ignored or even created?  Most of the outside sources I found referred back to Nissenbaum, so here is what she has to say.  We know that bugs are a natural part of coding, and herein lies the problem: this often becomes an excuse for sloppiness and incompleteness (Nissenbaum, 1996).  People need to take more accountability in their coding and testing of software in order to keep bugs to a minimum.
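
To make the idea concrete, here is a toy example (my own, not from Nissenbaum) of the kind of subtle error that hides in large codebases: an off-by-one mistake that silently drops data rather than crashing, which makes it easy to shrug off later:

```python
# Buggy version: intended to sum every reading, but range(1, n)
# silently skips the first element, so totals come out low.
def total_buggy(readings):
    n = len(readings)
    total = 0
    for i in range(1, n):   # off-by-one: should be range(n)
        total += readings[i]
    return total

# Fixed version: no manual indexing, no opportunity for the mistake.
def total_fixed(readings):
    return sum(readings)
```

The buggy version passes a casual glance and never raises an error, which is exactly why "it's just a bug" can become a shield against accountability.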

The computer as the scapegoat
At times it may be easy for someone to blame the computer for faults.  This can be a plausible explanation, since we know that any system inherently has bugs.  Back in school I remember a saying popular with computer-minded individuals: "Garbage in, garbage out."  This means that in many cases, errors are attributable to the user and not the computer.  The information or commands given to the machine are responsible for the action it takes.  According to Friedman and Millett (1995), the reason some people blame computers is the perceived decision-making capabilities of these machines (Friedman & Millett, 1995).  They go on to say that "designers should communicate through the system that a (human) who -- and not a (computer) what -- is responsible for the consequences of the computer use" (Friedman & Millett, 1995).
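
One practical way designers can keep the responsibility with the human input is defensive validation at the point of entry.  This is a minimal sketch of my own, with an assumed plausibility range, not a clinical standard:

```python
def parse_heart_rate(raw):
    """Reject garbage input up front instead of letting it propagate.

    The 20-300 bpm window is an illustrative plausibility check,
    not a clinically validated range.
    """
    try:
        bpm = int(raw)
    except (TypeError, ValueError):
        raise ValueError(f"heart rate must be numeric, got {raw!r}")
    if not 20 <= bpm <= 300:
        raise ValueError(f"heart rate {bpm} is outside the plausible range")
    return bpm
```

Rejecting the garbage loudly at the door means that when bad data does cause harm, the trail points back to the input, not to a mysterious "computer error."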

Ownership without liability
Nissenbaum discusses this barrier and states, "along with privileges and profits of ownership comes responsibility" (Nissenbaum, 1996).  To me this means that if you are benefiting, you should be held responsible for any errors or mishaps resulting from use.  This applies to both the vendor and the purchaser.  Unfortunately, in the software industry it has become a trend to deny accountability for the software produced while retaining "maximal property protection" (Nissenbaum, 1996).  The way around this barrier is through contracts and particular attention to end-user agreements.


References:
Friedman, B., & Millett, L.  (1995).  "It's the Computer's Fault" -- Reasoning About Computers as Moral Agents.  Retrieved from http://www.sigchi.org/chi95/proceedings/shortppr/bf2_bdy.htm

Nissenbaum, H.  (1996).  Accountability in a Computerized Society.  Science and Engineering Ethics, 2(1), pp. 25-42.  DOI: 10.1007/BF02639315

Thompson, D.  (2011 January 28).  Designing Responsibility: The Problem of Many Hands in Complex Organizations.  Harvard University.  Retrieved from http://scholar.harvard.edu/files/dft/files/designing_responsibility_1-28-11.pdf

Tuesday, September 3, 2013

Conflicting Moral Priorities Resulting from Cultural and/or Religious Diversity

How does one decide on which moral norms should prevail in the clinical setting?

I actually found this topic very interesting.  Right from Chapter 1, Beauchamp and Childress (2009) state that "particular moralities…contain moral norms that are not shared by all cultures, groups and individuals" (p. 5).  To me this means that at times some may feel it is necessary to introduce their particular norms into the clinical setting, as held by their particular beliefs or culture.  This reminds me of a surgery that I underwent a few years back.  Without going into too many details, the condition was a result of certain life activities that I had been involved in.  One of the nurses took it upon herself to relay her Christian beliefs to me and to basically tell me how I needed to change my life based on her particular moral beliefs.  At the time, and because I was under duress, I did not pay much attention to her, but now I see that what she was doing had no place in the clinical setting.  You see, even though her beliefs were relevant to her, this could have caused tension between the medical facility and myself.  Richard Sloan wrote an article in the LA Times expressing this sentiment: "we all are free to practice our religion as we see fit, as long as we do not interfere with the well-being of others by imposing our religious views on them."  He went on to say, "Freedom of religion is a cherished value in American society. So is the right to be free of religious domination by others."

So the question is: which moral norms should this nurse have chosen?  Certainly, her particular moral norms have no place in the clinical setting.  I believe that in this setting, because of the wide diversity of persons coming and going, all professionals should adhere to the common morality, and if a particular morality is chosen, it should be the professional moralities that define the guidelines for healthcare professionals.

Beauchamp and Childress (2009) lay out 10 examples of moral character traits when it comes to the common morality.  These are non-malevolence, honesty, integrity, conscientiousness, trustworthiness, fidelity, gratitude, truthfulness, lovingness and kindness.  These are the types of traits that should be displayed first and foremost in the clinical setting.  When it comes to professional ethics, eHow.com contributor Stephanie Mitchell puts it this way, "Codes of ethics come into play when simply knowing the difference between right and wrong is not enough, and such situations arise around patients' rights, patients' dignity, equitable access to treatment and the development of new medical technologies. Medical codes of ethics help ensure that healthcare professionals make the best possible choices when faced with difficult decisions."


Beauchamp, T. L., & Childress, J. F.  (2009).  Principles of Biomedical Ethics (6th ed.).  New York, NY: Oxford University Press.

Mitchell, S. (n.d.).  The Purpose of Professional Ethics in Healthcare.  eHow.com.  Retrieved from http://www.ehow.com/info_8404364_purpose-professional-ethics-healthcare.html

Sloan, R. P.  (2008 August 23).  When religion and healthcare collide.  LA Times.  Retrieved from http://www.latimes.com/la-oe-sloan23-2008aug23,0,4637656.story

Saturday, January 26, 2013

Clinical Ancillary Applications Benefits and Limitations to Integration


Being new to healthcare, I had no clue what ancillary services were.  Your Dictionary, Medical (n.d.) defines these services as being "Relating to or being auxiliary or secondary."  So since these are secondary and supplementary to the primary medical functions, why would it be advantageous for them to have their own computer systems dedicated to their functions?  What benefits are there to using systems of this nature and what are the limitations?

First off, let us discuss why these services should be automated and computerized, and also the current state of technology in regard to clinical ancillary systems.  As part of an abstract for a paper, Michael Minear and Jeff Sutherland (2003) wrote the following:

Digital computers have been successfully incorporated into specialized clinical instruments to offer advanced digital devices such as fetal monitors, heart monitors, and imaging equipment. But these devices are often not fully integrated with clinical management and operational systems. Beyond ancillary department applications, the result of almost 30 years of trying to automate the clinical processes in health care is large investments in both computer systems and paper medical records that have resulted in paper-based, computer-assisted processes of care.  This expensive combination of partial clinical automation and archaic paper-based support processes is a major obstacle to improvements in care delivery and management. (Minear & Sutherland, 2003)

So, as you can see, not having these processes computerized is a disadvantage to the patient care process, its delivery, and its management.  As an example of how current systems are set up in some places, let us look at the operating room (OR).  The OR is an example of leading-edge technology advances, but in many HCOs the ancillary systems are not integrated with each other or into the core systems (Minear & Sutherland, 2003).  One of the benefits of having computerized, integrated ancillary systems is having all of the "players" in the patient care delivery process working as a synchronized unit (Minear & Sutherland, 2003).  If this is not the case, it is hard to stay on track and stay informed, and the process suffers.
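
For a concrete sense of what "integration" involves, ancillary and core systems commonly exchange pipe-delimited HL7 v2 messages.  Here is a deliberately simplified parse of a single result (OBX) segment, my own sketch rather than anything from the cited paper; real HL7 parsing must also handle escape sequences, component separators, and field repetitions:

```python
def parse_obx(segment):
    """Parse a pipe-delimited HL7 v2 OBX (observation/result) segment.

    Simplified for illustration: only the fields a core system would
    need to file a lab result are extracted.
    """
    fields = segment.split("|")
    if fields[0] != "OBX":
        raise ValueError("not an OBX segment")
    return {
        "set_id": fields[1],          # OBX-1: sequence number
        "value_type": fields[2],      # OBX-2: e.g. NM = numeric
        "observation_id": fields[3],  # OBX-3: coded test identifier
        "value": fields[5],           # OBX-5: the result itself
        "units": fields[6],           # OBX-6: units of measure
    }

# Example: a glucose result flowing from a lab system to the core EHR.
result = parse_obx("OBX|1|NM|GLU^Glucose||95|mg/dL")
```

Every ancillary system that cannot emit or consume messages like this one ends up bridged by paper, which is exactly the "computer-assisted paper process" Minear and Sutherland describe.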

Another advantage of ancillary applications is the creation of a knowledge base.  Minear and Sutherland (2003) said that "knowledge-enabled software is inherently flexible and delivers much more sophisticated support to clinicians."  A further advantage is the enforcement of standards throughout the organization (Minear & Sutherland, 2003).

One limitation is the "major amount of work to rewrite and test a new system that has no guarantee of satisfying all the needs of the ancillary services" (Andrews, n.d.).  Another is the coordination of so many dissimilar projects.  Other limitations include differing data definitions for storage, formal integration processes being taken over by committees, and formulating a process for managing the data.

As you can see, there are benefits (correlation and standardization of care and knowledge) to clinical ancillary applications.  There are also limitations, especially when it comes to integrating these applications into a core system.  It is up to the HCO to determine whether the benefits outweigh the limitations, and this can be based on many different factors.


References:

Andrews, R. D.  (n.d.).  Integration of Ancillary Data for Improved Clinical Use: A Prototype within the VA's DHCP.  Retrieved from http://pubmedcentralcanada.ca/pmcc/articles/PMC2245652/pdf/procascamc00017-0608.pdf

Minear, M. N., & Sutherland, J.  (2003 June).  Medical Informatics - A Catalyst for Operating Room Transformation.  Seminars in Laparoscopic Surgery, 10(2), pp. 71-78.  DOI: 10.1177/107155170301000203

Your Dictionary, Medical.  (n.d.).  ancillary medical definition.  Retrieved from http://medical.yourdictionary.com/ancillary



Tuesday, January 22, 2013

Dreaming of the (Medical Information Technology) Future




Have you watched Star Trek lately?  What is the future of healthcare?  Will it be "trekkie" in a few hundred years?  Mobile devices that scan you and send the information back to the central computer for AI to make decisions?  Hologram doctors?  What's in store?  We can only dream!

Friday, December 28, 2012

What's the Big Deal About Big Data?


My career has allowed me to learn quite a few things.  I have worked with software, testing it to ensure it met clients' requirements.  I have worked at a small cell phone company, trying to create a small and primitive data warehouse.  Now I work on helping an entire country's government adhere to ITIL methods.  These have all been great positions, but I have to say my first job out of college was by far my favourite.

At AIG, I was able to work with some very brilliant minds on the executive team, engage in some fraud reporting, and see my reports used throughout the United States.  I had a great manager and awesome co-workers.  However, by far the best part was working with data, reporting, and writing SQL.  I loved it.  If you go back into my past a little further, you would see me asking my professor, after my class was finished for the semester, if I could retain access to the database so that I could practice SQL more (can you say NERD!).

Nowadays my use of SQL is very limited and I miss it.  I miss reporting, seeing trends in data, and creating tools that guide decision making that can change the outcomes of the business and how everything interacts to create success.  So you should not be surprised that this term I have been hearing brings about a level of excitement for me.  It is Big Data.

OK, so I am sure you have heard of big data.  Maybe you know exactly what it is.  I am hoping to shed some light on this for people that have heard what it is, but have not had the time to look into it further.  Also I would like to convey why it will be important to healthcare.

What is Big Data?

When I was at AIG we had a data warehouse that most likely held millions and millions of records.  It was massive.  Big data is even bigger than that.  According to Wikipedia, big data refers to data sets so large that working with them using standard database tools or data processing applications becomes very difficult.  The reason data is growing so much is that it is "increasingly being gathered by ubiquitous information-sensing mobile devices, aerial sensory technologies (remote sensing), software logs, cameras, microphones, radio-frequency identification readers, and wireless sensor networks."  Wikipedia goes on to say that, "The world's technological per-capita capacity to store information has roughly doubled every 40 months since the 1980s; as of 2012, every day 2.5 quintillion (2.5×10^18) bytes of data is created."

Wow, that is big!

This data is being used for data analytics, business intelligence, and effective decision making.  Cliff Saran of ComputerWeekly.com makes the point that the bigger your data, the larger the competitive gap that can grow between you and your competitors.

With these advantages come some challenges.  The biggest, ironically, is the size of the data.  Saran says that "big data will cause traditional practices to fail, no matter how aggressively information managers address dimensions beyond volume."  Another challenge is trying to understand how to use unstructured formats, such as text and video (McDonnell, 2011).  According to McDonnell, other challenges include storing this data and getting the most important data to the right people at the right time.  And of course there will be immense challenges when it comes to security and privacy.

Big Data in Healthcare

Is big data important to healthcare?  Irfan Khan (2012) says that if we estimate that each of our cells contains 1.5 GB of data, each one of us is walking around with approximately 150 zettabytes of data.  That's 150 billion terabytes.  So, it seems, big data is already in healthcare; we just need to capture it.  By capturing this data there is great potential for improved healthcare, "including personalization of care, defining patient populations with a greater level of granularity, analysing unstructured data, mining claims data for insights that can improve wellness and patient compliance, advancing medical research, and helping governmental agencies detect fraud, identify best care delivery practices, and improve bio-surveillance" (Terry, 2012).
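
Khan's figure works out as back-of-the-envelope arithmetic if we assume roughly 100 trillion cells per human body (my assumption, a commonly quoted rough estimate, not a number from Khan's article):

```python
# Back-of-the-envelope check of the 150-zettabyte claim.
# Assumed figures: ~100 trillion cells per body, 1.5 GB of data per cell.
CELLS_PER_BODY = 100e12   # 100 trillion cells (assumed estimate)
GB_PER_CELL = 1.5         # per Khan (2012)

total_gb = CELLS_PER_BODY * GB_PER_CELL   # 1.5e14 GB
total_bytes = total_gb * 1e9              # 1 GB = 1e9 bytes -> 1.5e23 bytes
zettabytes = total_bytes / 1e21           # 1 ZB = 1e21 bytes -> 150 ZB
terabytes = total_gb / 1e3                # 1 TB = 1e3 GB -> 150 billion TB
```

So the two figures in the article, 150 zettabytes and 150 billion terabytes, are the same quantity expressed in different units.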

Big data has big possibilities, including better healthcare outcomes.  It also has big challenges, such as storage, privacy and how to work with new and unstructured data, to name a few.  The benefits outweigh the challenges, however.  mHealth, EHR and all of the technologies related to health informatics will continue to grow big data.  The big question is, are you ready?

----

Khan, I.  (2012 November 15).  Where's the big data in healthcare IT? Look in the mirror.  Retrieved from http://www.itworld.com/big-data/315298/where-s-big-data-healthcare-it-look-mirror

McDonnell, S.  (2011 June 21).  Big Data Challenges and Opportunities.  Retrieved from http://spotfire.tibco.com/blog/?p=6793

Saran, C. (n.d.).  What is big data and how can it be used to gain competitive advantage?  Retrieved from http://www.computerweekly.com/feature/What-is-big-data-and-how-can-it-be-used-to-gain-competitive-advantage

Terry, K.  (2012 October 15). Health IT Execs Urged To Promote Big Data.  Retrieved from  http://www.informationweek.com/healthcare/clinical-systems/health-it-execs-urged-to-promote-big-dat/240009034

Wikipedia.  (2012 December 23).  Big Data.  Retrieved from http://en.wikipedia.org/wiki/Big_data

Monday, December 24, 2012

Is cloud computing safe for healthcare?


So today I am learning one of the pitfalls of this new [old] technology called cloud computing.  AWS, the cloud service from Amazon, has gone down.  How did I notice?  It took Netflix down with it, meaning I am missing out on the Star Trek marathon I had planned for Christmas Eve.  So this has me thinking.  How can we trust cloud computing?  Can we trust it with healthcare, where so many life-or-death decisions are made every day?  What measures can be implemented to ensure uptime when using cloud computing in healthcare?  Any thoughts?
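
One common pattern for riding out a regional outage like this one is retry-with-backoff plus failover to a second region.  This is my own sketch of the idea; the endpoints here are plain callables standing in for regional service instances, not a real cloud SDK API:

```python
import time

def call_with_failover(endpoints, request, retries=3, backoff=0.5):
    """Try each endpoint in order, retrying transient failures with
    exponential backoff before failing over to the next region.

    `endpoints` is a list of callables, hypothetical stand-ins for
    regional service instances, used purely for illustration.
    """
    for endpoint in endpoints:
        delay = backoff
        for _ in range(retries):
            try:
                return endpoint(request)
            except ConnectionError:
                time.sleep(delay)   # wait before retrying this region
                delay *= 2          # exponential backoff
    raise RuntimeError("all regions unavailable")
```

A healthcare deployment would layer much more on top (health checks, replicated data, audited failover procedures), but multi-region redundancy of this general shape is the usual first answer to "what if one cloud region goes down?"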