Our office neighborhood lost power for about an hour this morning; when it came back on and we'd restored internet connections and so on, about the first thing that came up was an article on enhancing the security of one's various interconnected electronic devices around the home. Good advice on avoiding Wi-Fi, updating devices often, generating one's own passwords, and so on.
I've not seen much evidence of such interconnectivity in the sort of client we serve - smaller commercial janitorial jobs in the metro Phoenix area. But, reading the trade press, I'm sure it's coming.
Going back to our short power outage: Whenever this happens, I'm reminded anew of how much I depend on the electronic gizmo at which I currently sit. There's very little I can accomplish around the office without it, except perhaps re-painting. The overriding problem with interconnectivity, as I see it, is that when one aspect malfunctions - through a glitch, a hacker (North Korea or the kid next door), or an electrical issue - everything is likely to go down. In an office building, if you tie together door access, the alarm system, security cameras, the web, and so on, you add convenience and capability to your system, but also complexity.
Folks who study disasters and accidents distinguish between complicated systems (those that have a lot of moving parts) and complex systems (those whose moving parts interconnect in ways whose results are often difficult to predict). For a complicated system, think pre-flighting an airliner. You have a whole long checklist, but should you miss something - say setting the brakes - you know pretty precisely what will happen. Should you miss several things, again, you know what will happen. The couple of nuclear power plant disasters over the years illustrate the issue with a complex system: A couple or three small things change or go wrong, and you've no idea what might happen. Kind of like the weather (the butterfly flapping its wings...).
One might go a step further. My (currently) favorite archaeologist argues that building a society is simply a matter of adding complexity (Tainter - "The Collapse of Complex Societies"). You have an issue - say a famine in a Roman province - and you add infrastructure (physical or bureaucratic) to fix it - say a system to ship food into the province. He argues that complexity adds overhead, so the society grows less and less productive, saves fewer resources over time, and eventually has no resilience to the next outrage, whether it's a plague or a crop failure or the barbarians at the gates. Thus in Tainter's magisterial survey, societies fail for a variety of proximate causes, but almost always for underlying economic weakness. I'd add to the economic analysis the unpredictability of outcomes that complexity causes. And we're certainly creating an ever more complex society.
In other words, "Keep it Simple, Stupid".