Angela Chen from The Verge offers a good analysis of Amazon's announcement that Alexa-enabled devices are now able to handle patient information.
“A lot of good could come out of this change. This could be really, really beneficial for consumers. Historically, the harder it is to see a doctor, the less likely someone is to get care. The less access someone has to things that could improve their health, the less likely they are to tweak it.”
That's great news, but what does this mean for privacy? How will Amazon use the data it now has access to?
“Now that Alexa is allowed to handle this data, what can it do with it? The answer depends on the specific agreements between Amazon and a given partner — which means that there’s a lot we don’t know yet.
It is possible that patient information could be shared and used to train one of Amazon’s artificial intelligence algorithms. The HIPAA Privacy Rule does require written authorization before someone’s health information can be used for marketing, but what constitutes “marketing” is not as straightforward as one might think.
A lot comes down to data use agreements between Alexa and its partners.”
The average user rarely – if ever – reads the end-user agreements that come with apps and cloud-based systems, let alone explores the B2B agreements between Amazon and its business partners. This is a matter of data transparency.
The risk is that the end-user will never know whether and how a very large organization like Amazon will sell and use their health information. To understand the detail and sensitivity of the data shared under such agreements, have a look at the video below to get a sense of the secrets hidden inside your voice – and how they could be used or misused in a medical, corporate, or governmental surveillance context.
Healthcare is now on Amazon's disruption map. Voice is part of the weaponry. Next will be virtual reality. No vertical is safe anymore.
Guest post by The Futures Agency content curator Petervan