As part of our MBA course in business analytics we run a NewsHound forum where students post relevant stories about analytics that emerge while the course is running. The big story of 2017/18 has been artificial intelligence (AI). Some useful resources for managers were identified during the course:
- McKinsey have published An Executive’s Guide to AI, which provides history, definitions, and use cases for AI. The guide distinguishes between machine learning and deep learning and provides use cases that are insightful illustrations of how AI can be used. The guide is also packaged in an accessible interactive format.
- Davenport and Ronanki’s (2018) article, Artificial Intelligence for the Real World, in HBR is an excellent guide to the management challenges and opportunities of AI. The focus on business benefits and implementation strategies makes it highly relevant to managers.
- A longer read on AI is provided in McKinsey Global Institute’s (2017) report Artificial Intelligence: the next digital frontier. A good place to start with this report is with the five case studies, which cover retail, utilities, manufacturing, healthcare, and education.
It’s also worth looking at the April 2018 report “AI in the UK: ready, willing and able?” produced by the House of Lords Select Committee on Artificial Intelligence – this provides an in-depth look at the implications of AI for industry and society.
As part of our MBA course on business analytics we ran a weekend workshop investigating the use of analytics for GoGet, a car-sharing company. MBA students were organized into four teams of 5 or 6 members. The method draws on systems thinking, business model mapping, and design thinking. Outputs from the workshop were hand-drawn using a mix of flip-chart paper, Sharpies, and Post-it notes – as in the rich picture below.
GoGet car-sharing rich picture
The workshop was structured around the following steps:
- background investigation (e.g., Web site, news, social media) of the situation, which is represented using a rich picture
- business model mapping using the business model canvas
- analytics leverage analysis – analytics opportunities are identified and rated according to difficulty and value creation potential
- each team then selected a single analytics opportunity to develop and created an opportunity canvas to elaborate on the benefits of the proposed development
- the teams then formulated their opportunity as a design challenge and ideated possible solutions
- personas relevant to their analytics application were developed (for customers and internal users, as appropriate)
- a storyboard was created to show how the analytics application is blended with business processes to create value for the case company
- a prototype of the user interface might be incorporated into the storyboard or added as a further deliverable.
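The leverage-analysis step above can be sketched in code. This is an illustrative sketch only, not part of the workshop materials: the opportunity names and the 1–5 difficulty and value scores below are invented, and the value-to-difficulty ratio is one simple way a team might rank candidate opportunities before selecting one to develop.

```python
# Hypothetical analytics opportunities for a car-sharing company, each rated
# for difficulty and value-creation potential (1 = low, 5 = high).
# All names and scores are invented for illustration.
opportunities = [
    {"name": "risky-driver prediction", "difficulty": 3, "value": 5},
    {"name": "dynamic pricing", "difficulty": 4, "value": 4},
    {"name": "fleet placement optimization", "difficulty": 5, "value": 4},
    {"name": "churn early-warning", "difficulty": 2, "value": 3},
]

def leverage(opp):
    """Simple leverage score: value-creation potential relative to difficulty."""
    return opp["value"] / opp["difficulty"]

# Rank opportunities from highest to lowest leverage.
for opp in sorted(opportunities, key=leverage, reverse=True):
    print(f'{opp["name"]}: leverage = {leverage(opp):.2f}')
```

A single ratio is of course a crude summary; in the workshop the rating and selection were a matter of team discussion rather than calculation.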
In a day and a half each group generated a comprehensive analysis targeting different business analytics applications. The key to this approach is to focus on problems/opportunities and on how business value is to be created. The same model (e.g., to predict which customers might have risky driving behaviours) can be blended with different business initiatives (e.g., increase collision excess, vary rates, implement telematics, request that a customer undergo training) to create a business analytics value innovation project.
Drawing on a diverse and rich set of outputs, each team made a five-minute pitch to GoGet in which they were able to tell a compelling business value creation story. To do this they focused on the opportunity canvas, the personas, and the storyboard. The video shows the abundance, richness, and depth of thinking produced in the workshop.
Ever wondered how well your organization is doing at business analytics across the board? The business analytics capability assessment (BACA) instrument was developed as part of the research into the management challenges of business analytics (Vidgen et al., 2017).
BACA is available in Qualtrics, an online survey package. You can view and complete the survey here. On completion of the survey you will see a report of all the responses received to date. No identifying data is collected and thus both the organizations surveyed and the respondents are fully anonymous.
We use the survey to identify areas for managers to focus their attention on in developing a data and evidence-based culture.
The survey results can be summarized in a radar chart:
The computational literature review package (CLR) is an open-source offering, developed in the statistical programming language R. The code has been redeveloped and is now easier to use and much more efficient.
The CLR automates analysis of Scopus research articles with analyses of impact (citations), structure (co-authorship networks) and content (topic modeling of abstracts). The CLR software can be used to support three use cases: (1) analysis of the literature for a research area, (2) analysis and ranking of journals, and (3) analysis and ranking of individual scholars and research teams. We are working on adding Web of Science data.
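The structure analysis builds a co-authorship network from the author lists of the articles. The sketch below illustrates the underlying idea in Python rather than the CLR's actual R implementation: each unordered pair of authors on a paper is one co-authorship tie, and the number of papers a pair shares is the edge weight. The article records are invented for illustration; the real CLR derives them from Scopus data.

```python
from itertools import combinations
from collections import Counter

# Invented author lists, standing in for Scopus article records.
articles = [
    ["Mortenson", "Vidgen"],
    ["Vidgen", "Shaw", "Grant"],
    ["Mortenson", "Vidgen"],
]

# Count each unordered author pair per article; the count is the
# edge weight in the co-authorship network.
edges = Counter()
for authors in articles:
    for pair in combinations(sorted(authors), 2):
        edges[pair] += 1

print(edges[("Mortenson", "Vidgen")])  # → 2 (two co-authored papers)
```

From the weighted edge list one can then compute standard social network measures (e.g., degree or centrality) to identify influential scholars and research teams.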
To install the CLR package, go to GitHub and download vignette.R, which provides instructions for installing and running the CLR.
For further details of the CLR approach see:
Mortenson, M., and Vidgen, R. (2016). A computational literature review of the technology acceptance model. International Journal of Information Management, 36: 1248–1259.
This article was published in the Observer on 26 February 2017:
Robert Mercer: the big data billionaire waging war on mainstream media
The article shows how big data and AI/machine learning can be used not just to track sentiment on social media but also how it might have been used to shape sentiment in the Trump and Brexit campaigns. I’ve taken three quotes from what is a long article (I hope I haven’t done it too much violence) to show big data analytics at work. The steps are the same as usual – there’s just more data, smarter AI, and more at stake:
First, get lots of data
“On its website, Cambridge Analytica makes the astonishing boast that it has psychological profiles based on 5,000 separate pieces of data on 220 million American voters – its USP is to use this data to understand people’s deepest emotions and then target them accordingly.”
Second, do some predictive modelling
“These Facebook profiles – especially people’s “likes” – could be correlated across millions of others to produce uncannily accurate results. Michal Kosinski, the centre’s lead scientist, found that with knowledge of 150 likes, their model could predict someone’s personality better than their spouse. With 300, it understood you better than yourself. “Computers see us in a more robust way than we see ourselves,” says Kosinski.”
Third, get actionable insight
“He suspects that Mercer is bringing the brilliant computational skills he brought to finance to bear on another very different sphere. “We make mathematical models of the financial markets which are probability models, and from those we try and make predictions. What I suspect Cambridge Analytica do is that they build probability models of how people vote. And then they look at what they can do to influence that.””
A literature review is a central part of any research project, allowing the existing research to be mapped and new research questions to be asked. However, due to the limitations of human data processing, the literature review can suffer from an inability to handle large volumes of research articles. The computational literature review (CLR) automates the analysis of research articles with analyses of:
- impact (citation analysis, e.g., H-index)
- structure (co-authorship social network analysis)
- content (topic modeling of article abstracts)
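The impact analysis centres on citation metrics such as the H-index. As a minimal sketch (in Python rather than the CLR's R implementation, and with invented citation counts), the H-index is the largest h such that at least h articles have at least h citations each:

```python
def h_index(citations):
    """Return the H-index: the largest h such that at least h
    articles have at least h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank  # the top `rank` articles all have >= rank citations
        else:
            break
    return h

# Hypothetical citation counts for eight articles.
print(h_index([45, 30, 12, 7, 6, 3, 1, 0]))  # → 5
```

The same function applies whether the unit of analysis is a research area, a journal, or an individual scholar; only the set of articles changes.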
The CLR software can be used to support three use cases: (1) analysis of the literature for a research area, (2) analysis and ranking of journals, and (3) analysis and ranking of individual scholars and research teams.
The CLR is explained and illustrated using a set of 3,386 articles related to the technology acceptance model (TAM) in:
Mortenson, M., and Vidgen, R. (2016). A computational literature review of the technology acceptance model. International Journal of Information Management, 36: 1248–1259.
The CLR is an open-source offering, developed in the statistical programming language R, and made freely available to researchers to use and develop further. The code for the CLR is available from GitHub.
All organizations have limited resources and have to be mindful of where their time, money, people, and attention are focused. Without a clear business analytics strategy – which must be aligned with the organization’s business strategy and business model – it is unlikely that the potential of business analytics will be achieved (indeed, much time and money are likely to be wasted).
We have been working on an approach, AnVIM, for developing a business analytics strategy that is aligned with the business strategy and business model, i.e., the creation of a portfolio of analytics developments that will add value to an organization or focal business unit.
AnVIM uses a combination of the business model canvas (BMC), developed by Osterwalder and Pigneur, and systems thinking to provide context and depth to the business model. This analysis is followed by a mapping from business model to analytics opportunities:
AnVIM has been presented and workshopped at Operational Research conferences over the last two years and we are looking for collaborators who would like to experiment with the approach and work with us to develop it further.
Find out more about AnVIM in our white paper.