A father finds out his daughter is pregnant after algorithms identify tell-tale patterns in the family’s store card data. Police charge suspects in two separate murder cases based on evidence taken from a Fitbit tracker and a smart water meter. A man sues Uber for revealing his affair to his wife.

Stories such as these have been appearing in growing numbers recently, as the technologies involved become ever more integrated into our lives. They form part of the Internet of Things (IoT), the embedding of sensors and internet connections into the fabric of the world around us. Over the last year, these technologies, led by Amazon’s Alexa and Google’s Home, have begun to make their presence felt in our domestic lives, in the form of smart home devices that allow us to control everything in the house just by speaking.

We might look at stories like those above as isolated technical errors, or fortuitous occurrences serving up justice. But behind them, something much bigger is going on: the development of an entire class of technologies seeking to remake the fundamentals of our everyday lives.

Breaking the social order

These technologies want to be ubiquitous, seamlessly spanning the physical and virtual worlds, and affording us frictionless control over all of it. The smart home promises a future in which largely hidden tech provides us with services before we’ve even realised we want them, using sensors to understand the world around us and navigate it on our behalf. It’s a promise of near limitless reach, and effortless convenience.

It’s also completely incompatible with social realities. The problem is, our lives are full of limits, and nowhere is this better demonstrated than in the family home, which many of these technologies target. From the inside, these places often feel all too chaotic, but they’re actually highly ordered. This is a world full of boundaries and hierarchies: who gets allowed into which rooms, who gets the TV remote, who secrets are shared with, who they are hidden from.

Much of this is mundane, but if you want to see how important these kinds of systems of order are to us, consider the “breaching experiments” of sociologist Harold Garfinkel in the 1960s. Garfinkel set out to deliberately break the rules behind social order in order to reveal them. Conducting the most humdrum interaction in the wrong way was shown to elicit reactions in others that ranged from distress to outright violence. You can try this yourself. When sat round the dinner table, try acting entirely normally save for humming loudly every time someone starts speaking, and see how long it is before someone loses their temper.

The technologies of the smart home challenge our orderings in countless small ways. A primary limitation is their inability to recognise boundaries we take for granted. I had my own such experience a week ago while sitting in my front room. With the accidental slip of a finger I streamed a (really rather sweary) YouTube video from my phone onto my neighbour’s TV, much to the surprise of their four-year-old daughter, who was in the middle of watching Paw Patrol.

A single press of a button that can’t be disabled was literally all it took. That, and the fact that I have their Wi-Fi password on my phone as I babysit for them from time to time. To current smart home technology, those who share Wi-Fi networks share everything.

Of course, we do still have passwords to at least offer some crude boundaries. And yet smart home technologies excel at creating data that doesn’t fit into the neat, personalised boxes offered by consumer technologies. This interpersonal data concerns groups, not individuals, and smart technologies are currently very stupid when it comes to managing it. Sometimes this manifests itself in humorous ways, like parents finding “big farts” added to their Alexa-generated shopping list. Other times it’s far more consequential, as in the pregnant daughter story above.

In our own research into this phenomenon, my colleagues and I have discovered an additional problem. Often, this tech makes mistakes, and if it does so with the wrong piece of data in the wrong context, the results could be disastrous. In one study we carried out, a wife ended up being informed by a digital assistant that her husband had spent his entire work day at a hotel in town. All that had really happened was that an algorithm had misinterpreted a dropped GPS signal, but in a relationship with low trust, a suggestion of this kind could be grounds for divorce.

Rejecting the recode

These technologies are, largely unwittingly, attempting to recode some of the most basic patterns of our everyday lives, namely how we live alongside those we are most intimate with. As such, their placement in our homes as consumer products constitutes a vast social experiment. If the experience of using them is too challenging to our existing orderings, the likelihood is we will simply come to reject them.

This is what happened with Google Glass, the smart glasses with a camera and heads-up display built in. The device was just too open to transgressions of our notions of proper behaviour. This discomfort even spawned the pejorative “Glasshole” to describe its users.

Undoubtedly, the tech giants selling these products will continue to tweak them in the hope of avoiding similar outcomes. Yet a fundamental challenge remains: how can technologies that sell themselves on convenience be taught the complexities and nuances of our private worlds? Or rather: how can they do so without needing us to constantly hand-hold them, entirely negating their aim of making our lives easier?

Their current approach of riding roughshod over the social terrain of the home is not sustainable. Unless and until the day we have AI systems capable of comprehending human social worlds, it may be that the smart home promised to us ends up being a lot more limited than its backers imagine.

Right now, if you’re taking part in this experiment, the advice must be to proceed with caution – because when it comes to social relationships, the smart home remains pretty dumb.

Oh, and be very careful not to stream things to your neighbour’s TV.

Murray Goulden is a research fellow at the University of Nottingham.

This article was originally published on The Conversation. Read the original article.