The textbook ‘Business Analytics: A Management Approach’ by Vidgen, Kirshner and Tan has been published.
The book offers an accessible, business-focused overview of the key theoretical concepts underpinning modern data analytics. It provides engaging and practical advice on using the key software tools, including SAS Visual Analytics, R and DataRobot, that organisations use to make effective data-driven decisions. Combining theory with hands-on practical examples, this essential text includes cutting-edge coverage of new areas of interest, including social media analytics, design thinking and the ethical implications of using big data. A wealth of learning features, including exercises, cases, online resources and data sets, helps students to develop analytic problem-solving skills.
The publisher’s companion site contains materials for instructors (e.g., lecture slides), and the support site maintained by the authors has resources for everyone (e.g., datasets used in the book, a chapter on Python, and a blog for the latest news).
To order a hard copy or an e-copy of the book go to the Macmillan site (also available on VitalSource, AmazonUK, AmazonUS, and AmazonAus).
Business analytics is playing an ever greater role in our daily lives, affecting job applications, medical treatment, parole eligibility, and loans and financial services. There are undoubted benefits to algorithmic decision-making in general, and artificial intelligence (AI) in particular. However, the potential for harm, intended or unintended, arising from algorithmic decision-making means that an ethical dimension is needed when we engage in analytics projects.
Much of the guidance on ethical investigations for practitioners has been rather abstract. In our paper in the European Journal of Operational Research (2019), “Exploring the ethical implications of business analytics with a business ethics canvas”, we propose a practical way of unpacking and thinking about ethical issues in business analytics. Our aim in the paper is to develop an ethical framework in the form of a business canvas. To do this we draw on the Markkula Center for Applied Ethics and its five ethical dimensions (Utilitarian, Rights, Justice, Common Good, Virtue).
Using these five ethical dimensions we created a business ethics canvas (a PowerPoint template is here). The canvas elements, which are addressed in clockwise order around the canvas, are:
- A proposed analytics solution that addresses the needs of specific customers (we recommend that these are generated in an opportunity canvas)
- Identification of stakeholders that can affect or are affected by the proposed analytics solution
- An assessment of stakeholder utility of the analytics solution
- An assessment of the rights of stakeholders
- An assessment of the fairness (justice) of the solution
- Implications for the common good
- Reflections on the virtue of the proposed analytics solution
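The canvas elements above can also be thought of as a simple data structure. As a minimal sketch (the field names are our own shorthand for the canvas sections, and the example entries are illustrative placeholders, not taken from the paper):

```python
# A minimal template for the business ethics canvas (BEC).
# Field names paraphrase the canvas sections; the stakeholder
# entries and notes below are invented for illustration.

def new_canvas(solution):
    """Create an empty business ethics canvas for a proposed analytics solution."""
    return {
        "proposed_solution": solution,
        "stakeholders": [],     # who can affect / is affected by the solution
        "utility": {},          # stakeholder -> notes on costs and benefits
        "rights": {},           # stakeholder -> rights at risk
        "justice": [],          # fairness concerns
        "common_good": [],      # implications for the common good
        "virtue": [],           # reflections on organisational virtue
    }

canvas = new_canvas("Target customers with 'days out' offers")
canvas["stakeholders"] = ["customers", "travel company", "regulator"]
canvas["utility"]["customers"] = "relevant offers vs. intrusive profiling"
```

Working around the structure in order mirrors working clockwise around the physical canvas.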
In the paper we describe how the business ethics canvas was developed through application in an online travel organization that is considering using analytics to target its customers with ‘days out’ offers. Post-it notes allow the canvas to be iterated and refined, while colour can be used to highlight areas of concern and areas of opportunity.
Our research shows that ethical analysis should not be seen as a constraint on, or overhead to, analytics development: exploring the ethical dimension and including multiple stakeholders provides richer insight into business value creation, as well as greater confidence about emerging ethical implications and project risk. Our research proposes that we should position the business ethics canvas (BEC) and the opportunity canvas (OC) as counterparts, where each shapes and informs the other in a creative tension.
See Ian Randolph at the Operational Research Society’s 2018 Annual Analytics Summit presenting the Business Ethics Canvas.
As part of my MBA course in business analytics we run a NewsHound forum where students post relevant stories about analytics as the course is running. The big story of 2017/18 has been artificial intelligence (AI). Some useful resources for managers were identified during the course:
- McKinsey have published An Executive’s Guide to AI, which provides history, definitions, and use cases for AI. The guide distinguishes between machine learning and deep learning and provides use cases that are insightful illustrations of how AI can be used. The guide is also packaged in an accessible interactive format.
- Davenport and Ronanki’s (2018) article, Artificial Intelligence for the Real World, in HBR is an excellent guide to the management challenges and opportunities of AI. The focus on business benefits and implementation strategies makes it highly relevant to managers.
- A longer read on AI is provided in the McKinsey Global Institute’s (2017) report Artificial Intelligence: The Next Digital Frontier. A good place to start is the five case studies, which cover retail, utilities, manufacturing, healthcare, and education.
It’s also worth a look at the April 2018 report “AI in the UK: ready, willing and able?” produced by the House of Lords Select Committee on Artificial Intelligence – this provides an in-depth look at the implications of AI for industry and society.
As part of our MBA course on business analytics we ran a weekend workshop investigating the use of analytics for GoGet, a car-sharing company. MBA students were organized into four teams of 5 or 6 members. The method draws on systems thinking, business model mapping, and design thinking. Outputs from the workshop were hand-drawn using a mix of flip-chart paper, Sharpies, and post-it notes, as in the rich picture below.
GoGet car-sharing rich picture
The workshop was structured around the following steps:
- background investigation (e.g., Web site, news, social media) of the situation, which is represented using a rich picture
- business model mapping using the business model canvas
- analytics leverage analysis – analytics opportunities are identified and rated according to difficulty and value creation potential
- each team then selected a single analytics opportunity to develop and created an opportunity canvas to elaborate on the benefits of the proposed development
- the teams then formulated their opportunity as a design challenge and ideated possible solutions
- personas relevant to their analytics application were developed (for customers and internal users, as appropriate)
- a storyboard was created to show how the analytics application is blended with business processes to create value for the case company
- a prototype of the user interface might be incorporated into the storyboard or added as a further deliverable.
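The analytics leverage analysis step above can be sketched as a simple prioritisation exercise: rate each candidate opportunity for value-creation potential and difficulty, then rank them. The opportunities and ratings in this Python sketch are invented for illustration, not taken from the GoGet workshop:

```python
# Toy sketch of analytics leverage analysis: each opportunity is
# rated 1-5 for value-creation potential and difficulty, then ranked
# by a simple leverage score (value relative to difficulty).
# Names and ratings are illustrative, not workshop outputs.

opportunities = [
    {"name": "predict risky driving", "value": 5, "difficulty": 3},
    {"name": "optimise fleet locations", "value": 4, "difficulty": 4},
    {"name": "churn prediction", "value": 3, "difficulty": 2},
]

for opp in opportunities:
    opp["leverage"] = opp["value"] / opp["difficulty"]

ranked = sorted(opportunities, key=lambda o: o["leverage"], reverse=True)
print([o["name"] for o in ranked])
```

In the workshop this rating was done by discussion on flip-chart paper rather than by formula; the score simply makes the trade-off explicit.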
In a day and a half each group generated a comprehensive analysis targeting different business analytics applications. The key to this approach is to focus on problems/opportunities and how business value is to be created. The same model (e.g., to predict which customers might have risky driving behaviours) can be blended with different business initiatives (e.g., increase collision excess, vary rates, implement telematics, request that a customer undergo training) to create a business analytics value innovation project.
Drawing on a diverse and rich set of outputs each team made a five minute pitch to GoGet in which they were able to tell a compelling business value creation story. To do this they focused on the opportunity canvas, the personas, and the storyboard. The video shows the abundance, richness, and depth of thinking produced in the workshop.
Ever wondered how well your organization is doing at business analytics across the board? The business analytics capability assessment (BACA) instrument was developed as part of the research into the management challenges of business analytics (Vidgen et al., 2017).
BACA is available in Qualtrics, an online survey package. You can view and complete the survey here. On completion of the survey you will see a report of all the responses received to date. No identifying data is collected, so both the organizations surveyed and the respondents are fully anonymous.
We use the survey to identify areas for managers to focus their attention on in developing a data and evidence-based culture.
The survey results can be summarized in a radar chart:
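As an illustration of the data behind such a chart, the responses can be averaged per capability dimension, giving one spoke of the radar per dimension. The dimension names and scores in this Python sketch are invented for illustration and are not taken from the BACA instrument:

```python
# Sketch of summarising survey responses for a radar chart: average
# the ratings for each capability dimension across respondents.
# Dimension names and scores are illustrative, not from BACA.

responses = [
    {"data": 5, "people": 4, "technology": 6, "process": 3, "organisation": 4},
    {"data": 6, "people": 3, "technology": 5, "process": 4, "organisation": 5},
]

dimensions = list(responses[0])
means = {d: sum(r[d] for r in responses) / len(responses) for d in dimensions}
print(means)  # one value per spoke of the radar chart
```

Plotting these means on a polar axis then gives the radar profile managers can compare against the aggregate results.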
The computational literature review package (clR) is an open-source offering, developed in the statistical programming language R. The code has been redeveloped and is now easier to use and much more efficient.
The CLR automates analysis of Scopus research articles with analyses of impact (citations), structure (co-authorship networks) and content (topic modeling of abstracts). The CLR software can be used to support three use cases: (1) analysis of the literature for a research area, (2) analysis and ranking of journals, and (3) analysis and ranking of individual scholars and research teams. We are working on adding Web of Science data.
To install the clR package, go to GitHub and download vignette.R, where instructions on installation and execution of clR are provided.
For further details of the CLR approach see:
Mortenson, M., and Vidgen, R. (2016). A computational literature review of the technology acceptance model. International Journal of Information Management, 36: 1248–1259.
This article was published in the Observer 26 February 2017:
Robert Mercer: the big data billionaire waging war on mainstream media
The article shows how big data and AI/machine learning can be used not just to track sentiment on social media but also how it might have been used to shape sentiment in the Trump and Brexit campaigns. I’ve taken three quotes from what is a long article (I hope I haven’t done it too much violence) to show big data analytics at work. The steps are the same as usual; there’s just more data, smarter AI, and more at stake:
First, get lots of data
“On its website, Cambridge Analytica makes the astonishing boast that it has psychological profiles based on 5,000 separate pieces of data on 220 million American voters – its USP is to use this data to understand people’s deepest emotions and then target them accordingly.”
Second, do some predictive modelling
“These Facebook profiles – especially people’s “likes” – could be correlated across millions of others to produce uncannily accurate results. Michal Kosinski, the centre’s lead scientist, found that with knowledge of 150 likes, their model could predict someone’s personality better than their spouse. With 300, it understood you better than yourself. “Computers see us in a more robust way than we see ourselves,” says Kosinski.”
Third, get actionable insight
“He suspects that Mercer is bringing the brilliant computational skills he brought to finance to bear on another very different sphere. “We make mathematical models of the financial markets which are probability models, and from those we try and make predictions. What I suspect Cambridge Analytica do is that they build probability models of how people vote. And then they look at what they can do to influence that.””
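The three steps above can be cartooned in a few lines of code. Everything in this Python sketch is invented for illustration: the "likes", the weights, and the logistic scoring model are a toy, not Cambridge Analytica's actual approach:

```python
# Toy illustration of the three steps above. All data and weights
# are invented; this is a cartoon of the approach, not a real model.
import math

# Step 1: lots of data -- which pages each person has liked (1 = liked).
people = {
    "alice": {"hiking": 1, "opera": 1, "poker": 0},
    "bob":   {"hiking": 0, "opera": 0, "poker": 1},
}

# Step 2: predictive modelling -- a logistic model scoring one trait.
weights = {"hiking": 0.8, "opera": 0.6, "poker": -0.9}

def trait_score(likes):
    z = sum(weights[k] * v for k, v in likes.items())
    return 1 / (1 + math.exp(-z))  # probability-like score in (0, 1)

# Step 3: actionable insight -- target the people scoring above a threshold.
targets = [name for name, likes in people.items() if trait_score(likes) > 0.5]
print(targets)
```

With real systems the likes number in the hundreds per person and the profiles in the millions, but the shape of the pipeline is the same.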
A literature review is a central part of any research project, allowing the existing research to be mapped and new research questions to be asked. However, due to the limitations of human data processing, the literature review can suffer from an inability to handle large volumes of research articles. The computational literature review (CLR) automates the analysis of research articles with analyses of:
- impact (citation analysis, e.g., H-index)
- structure (co-authorship social network analysis)
- content (topic modeling of article abstracts)
The CLR software can be used to support three use cases: (1) analysis of the literature for a research area, (2) analysis and ranking of journals, and (3) analysis and ranking of individual scholars and research teams.
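The impact analysis above uses the H-index: the largest h such that h articles each have at least h citations. As a concrete illustration (the citation counts in this Python sketch are invented; the CLR itself is implemented in R):

```python
# The H-index: the largest h such that h articles each have at
# least h citations. Citation counts below are illustrative.

def h_index(citations):
    ranked = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(ranked, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # -> 4
print(h_index([25, 8, 5, 3, 3]))  # -> 3
```

The same measure can be applied to a research area, a journal, or an individual scholar, which is what links it to the three use cases above.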
The CLR is explained and illustrated using a set of 3,386 articles related to the technology acceptance model (TAM) in:
Mortenson, M., and Vidgen, R. (2016). A computational literature review of the technology acceptance model. International Journal of Information Management, 36: 1248–1259.
The CLR is an open-source offering, developed in the statistical programming language R, and made freely available to researchers to use and develop further. The code for the CLR is available from GitHub.
All organizations have limited resources and have to be mindful of where their time, money, people, and attention are focused. Without a clear business analytics strategy – which must be aligned with the organization’s business strategy and business model – it is unlikely that the potential of business analytics will be achieved (indeed, much time and money are likely to be wasted).
We have been working on a way of developing a business analytics strategy that is aligned with the business strategy and business model, i.e., the creation of a portfolio of analytics developments that will add value to an organization or focal business unit. We call this approach AnVIM.
AnVIM uses a combination of the business model canvas (BMC), developed by Osterwalder and Pigneur, and systems thinking to provide context and depth to the business model. This analysis is followed by a mapping from the business model to analytics opportunities:
AnVIM has been presented and workshopped at Operational Research conferences over the last two years and we are looking for collaborators who would like to experiment with the approach and work with us to develop it further.
Find out more about AnVIM in our white paper.
On Tuesday 21 June 2016 the Operational Research Society’s Annual Analytics Summit takes place with morning presentations from Marks and Spencer, Movement Strategies, the Department for Education, and the Trussell Trust. The plenary talk is by Megan Lucero, Data Journalism Editor at The Times & Sunday Times. In the afternoon we are holding workshops to go deeper into the technology solutions reported in the morning sessions. We will be presenting the geospatial app built for the Trussell Trust.