Alex ‘Sandy’ Pentland on new data sources and AI


Reading time: 5 mins

“Google doesn’t know anything”

MIT Professor Alex ‘Sandy’ Pentland talks about his pioneering work in wearables and data protection. He also debunks a few myths about AI and machine learning.

One of the most-cited computational scientists in the world, and recently declared by Forbes one of the “7 most powerful data scientists in the world”, Pentland is regarded as a leading pioneer in wearables. A serial entrepreneur, he is a member of, among others, World Economic Forum (WEF) councils and UN advisory boards, and has advised firms such as AT&T, Telefonica, Google and Nissan.

Alex “Sandy” Pentland, Professor at MIT

The author of Social Physics and Honest Signals, Pentland focuses his research on social physics, big data, and privacy. His work has involved large-scale experiments in which people were equipped with mobile devices, such as a ‘sociometer’, and other technologies that track behaviour. The results were then aggregated using big data techniques. He recalls how he first developed this approach. “We started 30 years ago. At the time there were no wearables, no cell phones, no wireless communication, nothing. But it was clear that all of that was coming. As we wanted to figure out what was going to happen, we built our own wearable technology from bits and pieces.”

He and his team equipped some 20 students with these revolutionary mobile devices, which they wore continuously for a year and a half. “We asked: what difference do these technologies make in somebody’s life? We experimented with our own local versions of instant messaging and search.” Suddenly there was data on human behaviour of a kind that had never existed before. This enabled the team to ask questions like: which groups are more creative than others? “We could now have a more predictive analysis of creativity and productivity, because we could prove how creativity spreads in a group. According to popular belief, influencers spread ideas whereupon others follow. But the data shows that’s simply not true. It also shows that advertisers and broadcasters don’t have as much influence as they think.”

So what are people influenced by? A key answer Pentland found was: by each other. “They’re not necessarily influenced just by people with similar demographics. Such categories have some influence, but not as much as the group of people that you spend time with. Understanding people starts with rough theories on how they make decisions, in order to be able to figure out where there ought to be big effects. And then we check it again with data. You make a hypothesis, you test it, then go back and test it again. And data is tricky because sometimes you may get a positive response, but you have to be careful because you get these false positives.”

Good gossip

Exploring an abundance of different ideas is key to increasing a group’s intelligence, according to Pentland’s research. “Diversity results in better ideas to choose from. Engagement within the group is also important to getting everybody’s input. But what’s really surprising is the extent to which that engagement and exploration cycle determines the outcome of the group. In many organisations the groups are so confrontational that they don’t actually get much input; there are only certain people that talk. And when that happens, worse decisions are made.” He found that this cycle is a better predictor of group output than factors such as the smartest people within the group, or the average intelligence. Our notion of talent, therefore, is really wrong. “Not that you want untalented people, but the process of aggregation is much more important.”
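His point that output suffers when “only certain people talk” suggests a simple quantitative proxy. The sketch below is purely illustrative and not Pentland’s own metric: it scores a discussion by how evenly speaking turns are spread across participants, using normalised entropy as an assumed measure of that balance.

```python
# Illustrative sketch, not Pentland's method: score a group discussion by how
# evenly speaking turns are distributed. Normalised entropy is an assumed proxy
# for the "engagement" described above (1.0 = everyone contributes equally).
import math
from collections import Counter

def engagement_score(turns: list[str]) -> float:
    """Return a 0..1 score for how evenly turns are spread across speakers."""
    counts = Counter(turns)
    n_speakers = len(counts)
    if n_speakers < 2:
        return 0.0
    total = sum(counts.values())
    entropy = -sum((c / total) * math.log(c / total) for c in counts.values())
    return entropy / math.log(n_speakers)  # divide by maximum possible entropy

# A meeting dominated by one speaker scores lower than a balanced one.
dominated = ["ann"] * 8 + ["bob", "cara"]
balanced = ["ann", "bob", "cara"] * 3
print(round(engagement_score(dominated), 2))  # ≈ 0.58
print(round(engagement_score(balanced), 2))   # 1.0
```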

The way ideas spread has in recent years been highlighted by the impact of misinformation, most notably fake news. Pentland explains why this spreads so rapidly. “It produces adrenaline. It grabs people’s attention, and it’s a kind of social contribution, like gossip. Now gossip can occasionally be very good. It can save you if there’s a tsunami and the news spreads quickly.” In order to squash bad gossip, Pentland argues, the origin of the information should be looked at carefully. “If it comes from just one group of people who share the same input, you have to be very sceptical, because then it could just be gossip. But if you see people endorsing it from many different communities, then it’s much more likely to be true. Similarly, if things are passed on very quickly, they’re less likely to be true. And if news appeals to a wider group, then it’s more likely to be true, because it doesn’t have to rely on just adrenaline material.”
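The three cues Pentland lists (diversity of the endorsing communities, speed of spread, and breadth of appeal) can be read as a rough scoring rule. The sketch below is a hypothetical illustration only; the field names, thresholds and weights are assumptions, not anything taken from his research.

```python
# Illustrative sketch only: combines the three credibility cues mentioned above.
# Field names, thresholds and weights are invented for illustration.
from dataclasses import dataclass

@dataclass
class Endorsement:
    community: str           # the social group the endorser belongs to
    hours_after_post: float  # how long after publication they passed it on

def credibility_score(endorsements: list[Endorsement]) -> float:
    """Return a rough 0..1 plausibility score for a piece of news."""
    if not endorsements:
        return 0.0
    diversity = min(len({e.community for e in endorsements}) / 5, 1.0)  # more distinct groups -> better
    avg_delay = sum(e.hours_after_post for e in endorsements) / len(endorsements)
    slowness = min(avg_delay / 24, 1.0)         # items passed on very quickly are suspect
    breadth = min(len(endorsements) / 50, 1.0)  # wider appeal -> better
    return 0.4 * diversity + 0.3 * slowness + 0.3 * breadth

# A rumour bouncing quickly around one tight-knit group scores low;
# an item endorsed slowly across several communities scores higher.
rumour = [Endorsement("group_a", 0.5) for _ in range(40)]
news = [Endorsement(c, 12.0) for c in "abcde" for _ in range(10)]
print(round(credibility_score(rumour), 2))  # ≈ 0.33
print(round(credibility_score(news), 2))    # ≈ 0.85
```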

Human element

In the market research and insights sector there are concerns that technological developments such as AI could lead to more spurious correlations and biased data. Pentland, however, is confident that the human element will maintain a strong and controlling presence. “Clearly, the human element and human governance are central. Any sort of rule-based system is not as flexible a sensor; you need to keep humans in the loop.” He observes that AI is most successful where it supports people in doing people-type activities, such as search or navigation. “Those are things where people are in charge. Most AI applications are not making decisions for people, they’re aggregating data from human behaviour. Google doesn’t know anything; it just adds up input from searches and connections.”

Indeed, Pentland stated earlier that without data AI is nothing. He feels that AI and related technologies, such as machine learning, are often misunderstood and that most fears are unwarranted. “It’s because the AI doesn’t know anything about the real world. It’s not qualitatively different from what we’ve had for many years. It is easier to use and more general, but it doesn’t have any human common sense. It’s a long way from anything you’d call intelligence. It just mimics it.” Because of the many misunderstandings, there is a danger that organisations hand over too much responsibility to AI; something he sees as a mistake. “Before we deploy these systems, we need to make sure they work the way we want them to. We must define them by repeated trial and experimentation.”

Saving lives

On a positive tech note, Pentland sees developments leading to better data control and transparency. He co-led the WEF discussion in Davos that led to the EU privacy regulation GDPR, and he was central in forging the transparency and accountability mechanisms in the UN’s Sustainable Development Goals. “The data will increasingly be controlled by individuals themselves. GDPR privacy and similar things are going to be used everywhere. You’re going to see parts of the current data ecology disappear. And there will be a lot more accountability for the use of AI and data. AI techniques are already used on digital money, for instance in fraud detection and auditing. And though it goes wrong sometimes, which people find scary, we at least track where things go. We’ll just have to do the same thing with data.”

Pentland’s research group and entrepreneurship programme have spun off dozens of companies to date – some with significant impact. “One of the companies that I helped start, Dimagi, is currently providing medical information and support to 10% of all human births in the entire world. It probably saves around 100,000 lives a year.”


