Healthcare providers are often reluctant to adopt new technology, and not without good reason. But in the case of conversational AI and chatbots, the advantages certainly justify the cost and effort.
These five use cases illustrate how AI-powered chatbots can contribute to saving time and money while delivering a superior patient experience.
Like other healthcare technology, chatbots are subject to multiple regulations. Most importantly, the Health Insurance Portability and Accountability Act (HIPAA) sets rigorous standards for maintaining patient privacy. It specifically defines protected health information (PHI) and dictates how that information must be handled, secured, and accessed.
According to HIPAA Journal, PHI includes “individually identifiable information relating to the past, present, or future health status of an individual that is created, collected, or transmitted, or maintained by a HIPAA-covered entity in relation to the provision of healthcare, payment for healthcare services, or use in healthcare operations.”
That includes quite a lot of information that might be exchanged during a patient’s conversation with a chatbot. For example, even in making an appointment, a patient might share current symptoms or a past diagnosis. For that reason, it is critical that healthcare providers choose HIPAA-compliant chatbots.
Meanwhile, a chatbot that doesn’t comply with HIPAA can still be used for a few functions that never involve PHI.
Those options are quite limited, and they leave the true capabilities of a chatbot untapped. Between the inevitability of PHI being exchanged via chatbot, and the potential benefits a chatbot can bring, it is worth investing in a HIPAA-compliant chatbot.
One challenge of HIPAA is that it doesn’t actually specify how a chatbot or other conversational AI technology can achieve compliance. The law simply hasn’t kept up with the evolution of technology. That said, if your chatbot has these key features, it will also likely be HIPAA compliant.
HIPAA compliance doesn’t start with the chatbot design; it starts with your vendor.
A Business Associate Agreement (BAA) certifies that your vendor follows the same policies and procedures that govern your organization. In this case, the BAA should certify that the vendor follows all data privacy requirements as outlined in HIPAA, and it should specifically outline exactly how the vendor will handle and protect your patients’ PHI.
As conversational AI and chatbot technology have become more ubiquitous, the best providers have gained great experience in safeguarding sensitive medical data. Look for a chatbot provider that offers enhanced support for BAAs and doesn’t shy away from third-party security evaluations.
When we talk about HIPAA, we tend to focus on data security. That’s certainly critical, but another important aspect of the regulation is that patients’ data must be accessible to them. In the context of a chatbot, a chat between patient and bot qualifies as health data, so a transcript must be available to the patient after a conversation, though it should never be sent out automatically.
To ensure that only the intended patient receives this information, confirm that you can turn off the automated dissemination of chat transcripts. Choose a chatbot provider that offers multi-factor authentication, so that patients must verify their identity before receiving any of their data.
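The verification step described above can be sketched in a few lines. This is a minimal illustration, not a production authentication flow: the function names (`issue_verification_code`, `release_transcript`) are hypothetical, and a real deployment would deliver the code over a separate channel (SMS or email), expire it after a short window, and rate-limit attempts.

```python
import hmac
import secrets
from typing import Optional

def issue_verification_code() -> str:
    """Generate a six-digit one-time code to deliver to the patient out of band."""
    return f"{secrets.randbelow(10**6):06d}"

def release_transcript(submitted: str, issued: str, transcript: str) -> Optional[str]:
    """Release the transcript only when the patient's submitted code matches."""
    # compare_digest performs a constant-time comparison, avoiding timing leaks
    if hmac.compare_digest(submitted, issued):
        return transcript
    return None
```

The key design point is that the transcript is never the default response; it is released only after an explicit identity check succeeds.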
Transcripts are a fairly standard feature of chatbots and conversational AI technology, and they usually don’t present a compliance challenge. To ensure that they’re always available, opt for a provider that stores data in the cloud rather than on-premises.
Meanwhile, a sometimes-overlooked aspect of data availability is uptime. If your chatbot is offline, its data isn’t available to your patients. Be sure to ask any prospective chatbot provider about their uptime rates and whether they’ve ever suffered a data breach. Ideally, your SLA will include a guarantee of at least 99.5% uptime.
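To put that 99.5% figure in perspective, it helps to translate the percentage into permitted downtime. The short calculation below is just arithmetic on the SLA number:

```python
MINUTES_PER_DAY = 24 * 60

def allowed_downtime_minutes(uptime_pct: float, days: int) -> float:
    """Minutes of downtime an SLA permits over a period at a given uptime percentage."""
    return (1 - uptime_pct / 100) * days * MINUTES_PER_DAY

# A 99.5% SLA still allows roughly 216 minutes (3.6 hours) per 30-day month,
# or about 43.8 hours over a full year.
monthly = allowed_downtime_minutes(99.5, 30)
yearly = allowed_downtime_minutes(99.5, 365)
```

In other words, even a 99.5% guarantee tolerates several hours of outage per month, which is worth weighing when comparing providers.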
Patients should have easy access to their PHI. Your staff, on the other hand, should not. Patient data should be accessible on a “need to know” basis, meaning that only those with both permission and a reason to see a patient’s data should have access to it.
To accomplish this with your chatbot software, you’ll need to set up different user roles with the appropriate permissions. Furthermore, it’s important to protect access to the system itself. Security measures would ideally include some combination of multi-factor authentication; single sign-on (SSO); and IP whitelisting. These should be coupled with a robust password policy and thorough cybersecurity training.
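The role-and-permission model described above can be sketched as a simple lookup. This is a minimal illustration of the idea, assuming hypothetical role names and permissions; real chatbot platforms implement this through their own admin consoles.

```python
# Hypothetical role-to-permission mapping; the names are illustrative only.
ROLE_PERMISSIONS = {
    "physician": {"read_transcript", "read_phi"},
    "scheduler": {"read_appointments"},
    "admin": {"manage_users"},
}

def can_access(role: str, permission: str) -> bool:
    """Deny by default: grant access only when the role explicitly carries the permission."""
    return permission in ROLE_PERMISSIONS.get(role, set())
```

The important property is the deny-by-default stance: an unknown role, or a role without an explicit grant, sees nothing.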
In addition to access controls, your chatbot software should also offer audit controls. The most important is a complete log of every user action in the chatbot system: a “trail” of who accessed which chat, and when. Transcripts partially fulfill this requirement, since they tell you exactly who participated in the chat, but there should also be a means to track whenever a user subsequently accesses that transcript.
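The shape of such an audit trail can be sketched as follows. This is a toy illustration with hypothetical field names; in practice the log would live in append-only, tamper-evident storage rather than an in-memory list.

```python
import datetime

audit_log = []  # stand-in for durable, append-only audit storage

def log_access(user_id: str, chat_id: str, action: str) -> None:
    """Record who touched which chat, when, and what they did."""
    audit_log.append({
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "user": user_id,
        "chat": chat_id,
        "action": action,
    })

def accesses_for_chat(chat_id: str) -> list:
    """Reconstruct the access trail for a single conversation."""
    return [entry for entry in audit_log if entry["chat"] == chat_id]

log_access("nurse_7", "chat_42", "viewed_transcript")
log_access("admin_1", "chat_42", "exported_transcript")
```

Being able to answer “who has ever seen this conversation?” with a single query is exactly what an auditor will ask for.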
Data security is, of course, the linchpin of HIPAA compliance. Regulations require that data be secure at all times, both in transit and at rest (that is, while it’s simply being stored). The simplest aspect of this requirement is data sovereignty: US healthcare data must be stored in the US.
Next, data should be encrypted at every stage. Chatbot messages should be encrypted in transit, in both directions between the user and the chatbot, and again when they’re stored as transcripts. Ask prospective chatbot providers about their encryption practices to confirm they meet this HIPAA requirement.
And finally, HIPAA mandates that the facility where your chatbot data is stored should provide a “high level of physical security.” Major cloud providers like AWS and Azure will meet this requirement. If your chatbot partner stores data privately or uses a smaller provider, be sure to get documentation on their risk assessment process and system controls.