Do you remember that Black Mirror episode where the protagonist forms an unhealthy attachment to a chat bot that was created using her dead husband’s data in order to mimic him? Well, just when you think the modern world couldn’t get any more like the dystopian sci-fi series, the world of technology proves you wrong.
People in the industry recently spotted that Microsoft filed a patent for similar technology back in December 2020. Specifically, the patent is for the creation of a “conversational chat bot of a specific person.”
The patent’s abstract further states that the chat bot may access social data, referring to a specific person’s “images, voice data, social media posts, electronic messages, written letters, etc.,” to “create or modify a special index in the theme of the specific person’s personality.” That special index may then be used to train the bot to converse based on that data.
Beyond the Grave?
Of course, going off the patent alone, Microsoft’s stated focus isn’t the virtual resurrection of those who have passed away. Understandably, though, it is one of the main possibilities being raised. It’s the scenario one would typically jump to when considering why you’d need a chat bot that recreates the personality, speech patterns, and other behavioral traits of a specific person.
The patent itself, after all, mentions that possibility when discussing the potential subjects whose data may be used to create chat bots. “[The specific person that the chat bot is to mimic] may correspond to a past or present entity (or a version thereof), such as a friend, a relative, an acquaintance, a celebrity, a fictional character, a historical figure, a random entity etc.”
Have we put so much into our online persona that the chat bots could actually accurately recreate how we interact? Are we willing to give access to our data for that purpose in the first place? And how much preemptive control do we have over that decision if this technology does become accessible?
The culture and practices this new technology could lead to are all hypothetical for now, but it raises real questions about what happens to our data, and what we want to happen to it, when the time comes.