Technology: We can’t live without it. So we’d better learn how to live with it, particularly since the proliferation of “smart” devices is kick-starting yet another profound technological transformation.
The ongoing merger of the previously isolated computational and physical worlds, known as the Internet of Things in the U.S. and Industrie 4.0 in Europe, has begun impacting many aspects of society and virtually all sectors of industry.
“Today, there is a window of opportunity for us to come up with better ways to manage these new generations of ‘cyber-physical’ systems that must co-exist with the norms of different societies in ways that minimize the social disruption that they create,” said Janos Sztipanovits, E. Bronson Ingram Distinguished Professor of Engineering at Vanderbilt University.
Sztipanovits is the principal investigator on an ambitious new international, interdisciplinary project to develop and test the concept of incorporating social norms, policies and values into the basic architecture of these new generations of systems. Titled “Science of Design for Societal-Scale Cyber-Physical Systems,” the project has just received a $4 million, five-year grant from the National Science Foundation. Co-principal investigators for the project are Shankar Sastry at the University of California, Berkeley, Alexander Pretschner at the Technical University of Munich and Werner Damm at the University of Oldenburg.
“The fusion between people, computing and the physical environment is becoming so deep that it is getting harder and harder to tell them apart,” Sztipanovits observed. “The synergistic interaction among these components is changing how we live. For example, the future of mobility is being determined by companies like Uber and Lyft and will be transformed by self-driving cars. It is not surprising that societal tensions are developing because these new technologies are having massive social impacts with potentially conflicting social expectations and policies.”
The potential economic rewards are enormous. According to a 2015 study by the McKinsey Global Institute, the ongoing digitization of industry could add as much as $1.5 trillion to the U.S. Gross Domestic Product and €1 trillion to the GDP of Europe.
Conflicts arise because there will be winners and losers. An example is the route planning that GPS navigation systems in automobiles make possible. Individual drivers win because they cut their commute times. Society as a whole wins because the system helps balance overall traffic flows. But the losers are the people who live on quiet neighborhood streets that now carry more traffic as a result.
“Why does Google have the right to push more traffic through their neighborhoods?” Sztipanovits asked. “These people have two basic options: Shut up and tolerate the situation or require Google to adapt its algorithms to take their rights into account. How do we resolve conflicts like this?”
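One concrete reading of “adapt its algorithms” is that the routing objective could include a tunable penalty for pushing traffic onto residential streets, with the weight of that penalty set through some public process rather than by the company alone. The sketch below is purely illustrative and assumes an invented, simplified cost model; it does not reflect how Google Maps or any real navigation system works.

```python
# Hypothetical sketch: a route cost that weighs residential impact, not just travel time.
# The penalty weight is the kind of "knob" a community or regulator could influence.

from dataclasses import dataclass

@dataclass
class RoadSegment:
    travel_minutes: float
    is_residential: bool

def route_cost(route, residential_penalty_per_minute=0.0):
    """Total cost of a route: travel time plus a penalty for time spent
    on residential streets. With a zero penalty, the fastest route wins;
    raising the penalty steers traffic back onto arterial roads."""
    cost = 0.0
    for seg in route:
        cost += seg.travel_minutes
        if seg.is_residential:
            cost += residential_penalty_per_minute * seg.travel_minutes
    return cost

# Two candidate routes: a fast cut-through via quiet streets vs. a slower arterial route.
cut_through = [RoadSegment(4, False), RoadSegment(6, True), RoadSegment(3, False)]
arterial    = [RoadSegment(5, False), RoadSegment(9, False)]

for penalty in (0.0, 0.5, 1.0):
    best = min((cut_through, arterial), key=lambda r: route_cost(r, penalty))
    label = "cut-through" if best is cut_through else "arterial"
    print(f"penalty={penalty}: navigation prefers the {label} route")
```

With the penalty at zero the cut-through wins on raw travel time; once the penalty is raised, the same algorithm routes traffic back onto the arterial road.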
Historically, the way that societies have dealt with the problems caused by new technologies is through laws and regulations.
Take the case of the automobile. In 1865, Britain passed a law that limited steam carriages to two miles per hour in cities and required that they be preceded by a crewman carrying a red warning flag. In the U.S., problems with speeding, reckless driving, collisions and pedestrian casualties in the 1910s prompted officials to try to control driver behavior through laws, fines, traffic signals and drunk-driving arrests. In the 1920s, customer demand compelled manufacturers to begin introducing safety improvements such as all-steel bodies and hydraulic brakes. In the 1950s, university scientists invented crash testing, which led to the adoption of safety features such as seat belts and padded dashboards. Increased environmental awareness in the 1960s produced emissions regulations, and the energy crisis in the 1970s brought about fleet mileage requirements.
Today, the advent of autonomous automobiles is raising a fundamentally new issue: Should we allow computer systems to make the kind of life and death decisions that can arise when driving?
“In the US, this question is being openly debated and has not yet been decided. In Germany, however, it has been judged to be unconstitutional and will not be allowed,” Sztipanovits reported.
This illustrates the fact that different societies have very different social norms. Another example is how government and big business are viewed in the United States and Europe. In the US, people are much more suspicious of government than they are of big business. In Europe, it is just the reverse: people generally view big companies as evil and government as benign.
“Unless we adopt a new approach, these different social norms will be hardwired into the evolving systems,” said Sztipanovits. This balkanization will make it difficult, if not impossible, for the different systems to work together and it is likely to narrow the markets for related products and services.
According to Sztipanovits, creating a socially adaptive “internet of things” should yield significant benefits for consumers by providing them with levels of privacy, security and safety in line with their expectations. At the same time, it should allow large companies to build single systems that can be fine-tuned for individual cultures, rather than being forced to produce a number of customized versions, a prospect that has generated “very strong interest” from a number of large tech companies, he said.
Engineers and social scientists have paid considerable attention to the impact of new technologies on social policies, but they have given very little consideration to designing cyber-physical systems that can adapt to those policies.
“If we can model these social policies in mathematical terms, we can incorporate them into the new cyber-physical systems as parameters which will ensure that they will operate in a socially acceptable fashion in different countries,” Sztipanovits argued.
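What “policies as parameters” might look like in code is sketched below under invented assumptions: the same system logic runs everywhere, but it reads a per-jurisdiction policy profile instead of hardwiring one society’s norms. The field names, jurisdictions and values are illustrative only and are not drawn from the project.

```python
# Minimal, invented sketch of "social policies as parameters": identical system code,
# different behavior, driven by a jurisdiction-specific policy profile.

from dataclasses import dataclass

@dataclass(frozen=True)
class PolicyProfile:
    jurisdiction: str
    location_data_retention_days: int      # how long trip traces may be stored
    share_data_with_third_parties: bool    # default consent posture
    machine_may_decide_in_emergency: bool  # e.g., the autonomy question debated above

POLICIES = {
    "US-CA": PolicyProfile("US-CA", 90, True,  True),
    "DE":    PolicyProfile("DE",    30, False, False),
}

def plan_emergency_maneuver(jurisdiction: str) -> str:
    """Same control code everywhere; the policy parameter decides who acts."""
    policy = POLICIES[jurisdiction]
    if policy.machine_may_decide_in_emergency:
        return "autonomous system selects the maneuver"
    return "hand control to the human driver / remote operator"

print(plan_emergency_maneuver("US-CA"))
print(plan_emergency_maneuver("DE"))
```

The point of the sketch is the design pattern, not the particular values: a single product line could ship with different policy profiles rather than as a set of incompatible regional variants.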
The new project brings together experts in the fields of sociology, cognitive psychology, law, computer science and engineering to develop algorithms that represent a range of social factors, including market forces, privacy, security and safety, that can be incorporated into the underlying structure of the new systems.
The collaborators will be developing prototype policy-aware systems in three different domains: connected vehicles, including self-driving cars; smart grid energy systems; and unmanned aerial vehicles.
They have identified four technical approaches for achieving this goal:
- Incentive engineering: Using game theory to model the incentives of all the parties involved (end-users, service providers, third-party providers, regulators and adversarial agents) and to determine how those incentives can be shaped to support fairness, security, privacy and other social values, as sketched in the toy example after this list;
- Online conflict resolution: Exploring methods for dynamically shifting decision making from machines to humans when conflicts arise that require human judgment;
- Policy-aware system synthesis: Formalizing the guidelines needed to design systems that enforce end-to-end data security policies, safeguarding people’s privacy while allowing them to access and manage their own information;
- Policy auditing: Designing a monitoring system that can detect, observe, document and analyze safety and security failures so that legal liability can be determined and systems can be continuously strengthened and upgraded.
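To make the incentive-engineering approach a little more concrete, here is a toy, assumption-laden sketch in the route-planning setting from earlier: drivers choose between a residential shortcut and an arterial road, congestion is modeled with invented linear delay functions, and a regulator-set “toll” (standing in for any disincentive) shifts the equilibrium away from the quiet streets. This is not the project’s model, just an illustration of how game-theoretic incentives can be shaped.

```python
# Toy congestion game: drivers pick between a residential shortcut and an arterial road.
# A small "toll" (any regulator-imposed disincentive) shifts the equilibrium split.
# All numbers and the linear delay model are assumptions for illustration only.

def equilibrium_shortcut_share(toll_minutes: float, drivers: int = 100) -> int:
    """Find an equilibrium by best-response switching: one driver at a time moves to
    the cheaper route given everyone else's choice, until no one wants to switch."""
    def shortcut_cost(n_on_shortcut):  # delay grows quickly as the quiet street clogs up
        return 8 + 0.10 * n_on_shortcut + toll_minutes
    def arterial_cost(n_on_arterial):  # wide road, delay grows slowly
        return 14 + 0.02 * n_on_arterial

    n_short = drivers  # start with everyone taking the shortcut
    for _ in range(1000):
        # A shortcut driver switches if the arterial would be cheaper for them.
        if n_short > 0 and shortcut_cost(n_short) > arterial_cost(drivers - n_short + 1):
            n_short -= 1
        # An arterial driver switches if the shortcut would be cheaper for them.
        elif n_short < drivers and shortcut_cost(n_short + 1) < arterial_cost(drivers - n_short):
            n_short += 1
        else:
            break  # no profitable deviation: this split is an equilibrium
    return n_short

for toll in (0.0, 2.0, 6.0):
    print(f"toll={toll} min-equivalent -> {equilibrium_shortcut_share(toll)} of 100 drivers cut through")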
The project is funded by National Science Foundation grant NSF PIRE 1743772.