The challenges and opportunities of automation for the workforce

Dr Carl Frey and Dr Michael Osborne recently made headlines around the world with their Oxford Martin School study – The Future of Employment: How Susceptible Are Jobs to Computerisation? – which showed that nearly half of US jobs could be at risk of being replaced through automation.

Much is written about how robotics and automation are displacing jobs in the manufacturing industry. Indeed, the advanced manufacturing facilities of today and tomorrow are clean and awash with robots, computers, lasers, and other ultramodern machine technologies. The most common tool a production worker carries at the newest auto plants is not a wrench or a screwdriver. It’s an iPad.

However, little is discussed about the impact of robotics and automation on the financial sector.

The finance sector is now producing record annual profits despite significant staff reductions since 2008.

AIG, the insurance company, reduced its staff from 116,000 at the end of 2007 to 63,000 by the end of 2012, a reduction of 53,000 people. Yet profits recovered during the period of restructuring: despite the massive employee reduction, profits nudged up slightly from $6.2 billion at the end of 2007 to $6.6 billion at the end of 2012 ($9 billion at the end of 2013). Two other major insurance companies, AXA and Allianz, have seen their headcounts reduce by a combined 98,566 over the last few years, again despite rising profits. This may not be much of a surprise, as insurance is increasingly sold online. Consider that in the UK alone some 70% of car insurance is purchased over the Internet – a massive transition in just a few years, which has contributed in part to the loss of so many jobs among insurance sales forces and agents.

[Chart: Major insurance company headcount]

Insurance appears to be one sector that is using automated technology to improve productivity, reduce headcount and increase profits.

In the banking sector, Citibank, which had 357,000 employees before the global crisis of 2008, reduced its overall headcount by 98,000 to 259,000 by the end of 2012. There is a similar story at Bank of America (Merrill Lynch), with some 50,000 layoffs, whilst almost 25,000 people lost their jobs in the collapse of Lehman Brothers.

A couple of weeks ago J.P. Morgan announced a further round of 12,000–15,000 job cuts: “the bank is looking to find new savings, partly because of technology that allows greater automation of clerical functions in branches.” At the same time it is looking online for future growth: “the bank is now looking at revamping its existing branch network with smaller buildings that make better use of new technology and require fewer staff.”

It’s fair to say that perhaps millions of jobs have been lost globally in the financial sector, as automation drives efficiencies and both companies and their customers choose the improved services that online technologies offer for transacting financial business.

So it would seem that companies are becoming leaner, doing more with fewer people, whilst maintaining, and indeed increasing, profitability.

The following video, streamed live on March 13th, is a very interesting overview by Drs Frey and Osborne of the challenges and opportunities of the automation age.

PS. I’m not a fan of the “robots will take ALL our jobs” meme, although Bill Gates did say this week that within 20 years a lot of jobs will go away, replaced by software automation (“bots” in tech slang, though Gates used the term “software substitution”).

Tech companies’ competitive advantage – Bayes’ Rule and Behavioral Economics

The ‘system’ behind the Google robotic cars – which have driven themselves for hundreds of thousands of miles on the streets of several US states without being involved in an accident or violating any traffic law, all whilst analyzing enormous quantities of data fed to a central onboard computer from radar sensors, cameras and laser range-finders, and taking the most optimal, efficient and cost-effective route – is built upon the 18th-century math theorem known as Bayes’ Rule.
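
Google has not published the cars’ actual algorithms, but the recursive Bayesian update at the heart of any such sensor-fusion system can be sketched in a few lines of Python. Everything below – the road cells, the landmark positions and the sensor model – is purely illustrative:

```python
def bayes_filter_update(belief, likelihoods):
    """One Bayesian update: multiply the prior belief over positions by the
    likelihood of the latest sensor reading at each position, then normalize."""
    posterior = [b * l for b, l in zip(belief, likelihoods)]
    total = sum(posterior)
    return [p / total for p in posterior]

# Prior: the car is equally likely to be in any of five road cells.
belief = [0.2] * 5

# The sensor reports 'landmark ahead'; landmarks sit at cells 1 and 3, so the
# reading is far more likely to have come from those cells (hypothetical model).
likelihoods = [0.1, 0.6, 0.1, 0.6, 0.1]

belief = bayes_filter_update(belief, likelihoods)
print(belief)  # probability mass concentrates on the landmark cells 1 and 3
```

A real localization system interleaves this measurement update with a motion-prediction step and runs it continuously over radar, camera and laser readings, but the core arithmetic is exactly this.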

In 1996 Bill Gates described Microsoft’s competitive advantage as its ‘expertise in Bayesian networks’, and in 1998 the company patented a spam filter which relied on Bayes’ Theorem. Other tech companies quickly followed suit and adapted their systems and programming to include Bayes’ Theorem.
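
The patented filter is more elaborate, but the underlying idea – count how often each word appears in spam versus legitimate mail, then combine that evidence with Bayes’ Rule – can be sketched as a minimal naive Bayes classifier. This is an illustration of the technique, not Microsoft’s implementation:

```python
import math
from collections import Counter

class NaiveBayesSpamFilter:
    """Minimal naive Bayes text classifier in the spirit of early spam filters."""

    def __init__(self):
        self.words = {'spam': Counter(), 'ham': Counter()}
        self.messages = Counter()

    def train(self, tokens, label):
        self.messages[label] += 1
        self.words[label].update(tokens)

    def spam_probability(self, tokens):
        vocabulary = len(set(self.words['spam']) | set(self.words['ham']))
        log_score = {}
        for label in ('spam', 'ham'):
            # Log prior: the share of training messages with this label.
            score = math.log(self.messages[label] / sum(self.messages.values()))
            total = sum(self.words[label].values())
            for token in tokens:
                # Laplace smoothing gives unseen words a small, nonzero likelihood.
                score += math.log((self.words[label][token] + 1) / (total + vocabulary))
            log_score[label] = score
        # Normalize the two log scores into P(spam | tokens) via Bayes' Rule.
        top = max(log_score.values())
        odds = {label: math.exp(s - top) for label, s in log_score.items()}
        return odds['spam'] / (odds['spam'] + odds['ham'])

nb = NaiveBayesSpamFilter()
nb.train('win cash now'.split(), 'spam')
nb.train('meeting agenda attached'.split(), 'ham')
print(nb.spam_probability('win a prize now'.split()))  # 0.8
```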

During World War II Alan Turing used Bayes’ Theorem to help crack the Enigma code, potentially saving millions of lives, and is credited with helping the Allied forces to victory.

Artificial Intelligence was given a new lease of life when, in the early 1980s, Professor Judea Pearl of UCLA’s Computer Science Department and Cognitive Systems Lab introduced Bayesian networks as a representational device. Bayesian networks have since proved to be one of the most influential advances in Artificial Intelligence, with applications in a wide range of domains.

Bayes’ Theorem is based on the work of Thomas Bayes as a solution to a problem of inverse probability. It was presented in ‘An Essay towards solving a Problem in the Doctrine of Chances’, read to the Royal Society in 1763, two years after Bayes’ death in 1761. Put simply, Bayes’ Rule is a mathematical relationship between probabilities which allows them to be updated in the light of new information.
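
In modern notation the rule fits on one line. P(H) is the prior belief in a hypothesis, P(E | H) the likelihood of the evidence if the hypothesis were true, and P(H | E) the updated (posterior) belief once the evidence is in:

```latex
P(H \mid E) = \frac{P(E \mid H)\, P(H)}{P(E)}
```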

Before the advent of increased computing power, Bayes’ Theorem was overlooked by most statisticians, scientists and industries. Today, thanks to Professor Pearl, Bayes’ Theorem is used in robotics, artificial intelligence, machine learning, reinforcement learning and big-data mining. IBM’s Watson, perhaps the best-known AI system, in all its intricacies ultimately relies on the deceptively simple concept of Bayes’ Rule in negotiating the semantic complexities of natural language.

Bayes’ Theorem is frequently behind the technology of many of the multi-billion-dollar acquisitions we read about, and it is certainly a core piece of the technology behind the billions in profits at leading tech companies, from Google’s search to LinkedIn and the recommendation engines of Netflix and Amazon. It will play an even more important role in future developments in automation, robotics and big data.

Professor Pearl, through his work in the Cognitive Systems Lab, recognized the problems that human psychology poses for software development and representation. In 1984 he published a book simply called Heuristics (Intelligent Search Strategies for Computer Problem Solving).

Pearl’s book drew on research by the founders of Behavioral Economics, Daniel Kahneman and Amos Tversky, and particularly on their work with Paul Slovic, Judgment under Uncertainty: Heuristics and Biases (Cambridge University Press, 1982), in which they confirmed their own reliance on Bayes’ Theorem:

Ch. 25, ‘Conservatism in human information processing’: “Probabilities quantify uncertainty. A probability, according to Bayesians like ourselves, is simply a number between zero and one that represents the extent to which a somewhat idealized person believes a statement to be true…. Since such probabilities describe the person who holds the opinion more than the event the opinion is about, they are called personal probabilities.” (p. 359)

Kahneman (Nobel Prize in Economics) and Tversky showed that Bayesian methods closely reflect how humans perceive their environment, respond to new information, and make decisions. The theorem is a landmark of logical reasoning and the first serious triumph of statistical inference; Bayesian methods interpret probability as the degree of plausibility of a statement.

Kahneman and Tversky especially highlighted the heuristics and biases where Bayes’ Rule can overcome our irrational decision-making, which is why so many tech companies are seeking to train their engineers and programming staff in behavioral economics. We use the availability heuristic, rather than Bayesian equations, to assess probabilities. We all know that this gives way to all sorts of judgmental errors: a belief in the law of small numbers and a tendency towards hindsight bias. We know that we anchor on irrelevant information and that we take too much comfort in ever more information that seems to confirm our beliefs.

The representativeness heuristic

Heuristics are described as “judgmental shortcuts that generally get us where we need to go – and quickly – but at the cost of occasionally sending us off course.”

When people rely on representativeness to make judgments, they are likely to judge wrongly, because the fact that something is more representative does not make it more likely. This heuristic is used because it is an easy computation (think Zipf’s law and human behavior – the principle of least effort). The problem is that people overestimate their ability to accurately predict the likelihood of an event. The result can be neglect of relevant base rates (the base rate fallacy) and other cognitive biases, especially confirmation bias.

The base rate fallacy describes how people fail to take the base rate of an event into account when solving probability problems, and it is a frequent error in thinking.
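
A standard worked example makes the fallacy concrete – the numbers here are the textbook illustration, not figures from Kahneman and Tversky. Even a highly accurate test for a rare condition produces mostly false positives, and Bayes’ Rule shows exactly how far intuition overshoots:

```python
# A condition affects 1% of people; the test catches 99% of true cases
# but also wrongly flags 5% of healthy people.
prior = 0.01           # base rate: P(condition)
sensitivity = 0.99     # P(positive | condition)
false_positive = 0.05  # P(positive | healthy)

# Bayes' Rule: P(condition | positive) = P(positive | condition) * P(condition) / P(positive)
evidence = sensitivity * prior + false_positive * (1 - prior)
posterior = sensitivity * prior / evidence

print(f'{posterior:.1%}')  # 16.7% – far below the ~99% most people guess
```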

Confirmation bias

Confirmation bias is the tendency of people to favor information that confirms their beliefs or hypotheses. Essentially, people are prone to misperceive new incoming information as supporting their current beliefs.

It has been found that experts reassess data selectively over time, depending on their prior hypotheses. Bayesian statisticians argue that Bayes’ Theorem is a formally optimal rule for revising opinions in the light of evidence. Nevertheless, Bayesian techniques are so far rarely utilized by management researchers or business practitioners in the wider business world.

Eliezer Yudkowsky of the Machine Intelligence Research Institute has written a detailed introduction to Bayes’ Theorem using behavioral economics examples and machine learning, which I highly recommend.

Time to think Bayesian – and Behavioral Economics

As the major tech companies are showing, Bayesian and Behavioral Economics methods are well suited to the increasingly complex phenomena and problems faced by 21st-century researchers and organizations, where very complex data abound and the validity of knowledge and methods is often seen as contextually driven and constructed.

Bayesian methods that treat probability as a measure of uncertainty may be a more natural approach to some high-impact management decisions, such as strategy formation, portfolio management, and deciding whether or not to enter risky markets.
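
As a sketch of what that might look like in practice – with an entirely hypothetical prior and pilot study – a manager’s belief about a new market’s success rate can be updated with a few lines of arithmetic:

```python
# Prior belief: success rate around 20%, encoded as a Beta(2, 8) distribution.
prior_alpha, prior_beta = 2, 8

# Hypothetical pilot study: 14 of 40 trial customers converted.
successes, failures = 14, 26

# Bayesian update: a Beta prior plus Binomial data gives a Beta posterior.
post_alpha = prior_alpha + successes
post_beta = prior_beta + failures

mean = post_alpha / (post_alpha + post_beta)
print(f'posterior mean success rate: {mean:.0%}')  # 32%, up from the 20% prior
```

The point is not the arithmetic but the discipline: the prior is stated explicitly, and each new piece of evidence moves the estimate by exactly the amount the data justify.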

If you are not thinking like a Bayesian, perhaps you should be.