Part 3 of this 4-part series describes the importance of Intelligent Workflows in The Cognitive Enterprise.
- The Cognitive Enterprise (1 of 4)
- Market Making Business Platforms (2 of 4)
- Intelligent Workflows (3 of 4)
- Enterprise Experience and Humanity (4 of 4)
The success of a business platform rests upon the intelligence and innovation of the workflows that support it. With the power of exponential technologies (automation, blockchain, IoT, 5G, AI) and the richer data they deliver, the higher expectations of today's customers can be met. Building the skills needed to create these Intelligent Workflows will become a core focus of the agile teams in organisations.
In fact, it is not only the customer who stands to gain: entire ecosystems (the organisation's business partners, suppliers, and its own employees) value the seamless flow of real-time data. This can lead to new business models, as just-in-time and predictive modelling open new possibilities.
Groupama Assicurazioni worked with IBM to produce an Internet of Things (IoT) telematics solution. The solution collects data on driving behaviour and combines it with other data such as weather conditions, traffic data, and a customer's own claims history.
Intelligent workflows improve the speed and quality of decision making to support Straight Through Processing (STP). As well as improvements to customer care, costs have been reduced by as much as 45%.
Building on the concept of iteration, customers are being changed from payers into partners: safer driving habits can be coached. Smart pricing is also something that more insurers will introduce in order to price according to actual risk.
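As a toy illustration of the smart-pricing idea above, a usage-based premium might combine telematics signals with claims history. The risk factors, weights, and base premium below are invented for demonstration, not Groupama's or any insurer's actual model:

```python
# Illustrative usage-based pricing sketch. All field names, weights,
# and thresholds are hypothetical assumptions for this example.

def risk_score(harsh_braking_per_100km: float,
               night_driving_share: float,
               prior_claims: int) -> float:
    """Combine telematics signals and claims history into a 0..1 risk score."""
    score = 0.0
    score += min(harsh_braking_per_100km / 10.0, 1.0) * 0.4  # driving style
    score += night_driving_share * 0.3                        # exposure
    score += min(prior_claims / 3.0, 1.0) * 0.3               # claims history
    return score

def premium(base: float, score: float) -> float:
    """Price according to actual risk: scale the base premium by the score."""
    return round(base * (0.85 + 0.6 * score), 2)

careful = risk_score(1.0, 0.05, 0)   # low-risk driver
risky = risk_score(8.0, 0.40, 2)     # higher-risk driver
print(premium(500.0, careful), premium(500.0, risky))
```

The point is not the specific formula but that continuously collected behavioural data lets the price track actual risk rather than a static annual estimate.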
There are three Action Areas which are seen as critical to becoming cognitive:
- Embed exponential technologies - Use exponential technologies to build dynamic, intelligent workflows that change working practices and create new experiences. Teams can work with greater autonomy and productivity.
- Drive value from data - Mine the most important data and gain insights of ever greater value. Robust governance should engender trust in the data and AI models so that decision making can be more distributed.
- Deploy through hybrid multi-cloud - Adopt hybrid cloud to access data and leverage it for intelligent workflows in a de-risked way.
1) Embed exponential technologies
Workflow as the sensory system
Cognitive enterprises seeking to exploit market-making Business Platforms use intelligent workflows that transform how work is done. Intelligent workflows span the organisation and can even cross organisations. To create new value, organisations should first identify and target the workflows that will underpin platform success.
Once the value-creating workflows are identified, processes should be redesigned, ideally from a clean sheet. With the use of exponential technologies, three things can change:
- People: employees, ecosystem partners, and customers
- Processes: cross silo and create new value
- Data: data driven mentality to drive decisions, learning, and automation
To re-imagine the workflows, organisations might consider how processes can be linked, from starting point to outcome, and then add intelligence.
What data is needed to provide guidance for decision making and predictive capability? Can it come from partners in the ecosystem? In this way, workflows act like a sensory system rather than a mechanical ‘do process A followed by process B’.
The move from process redesign to intelligent workflows
Starting small, the building of intelligent workflows starts with single processes and builds to multi-process, multi-functional workflows.
- Technology enabled process improvement - A tentative approach may begin with automating insurance claims: identifying anomalies and driving next actions.
- Multi-process or single function workflows - By linking the claims process to the payment process, an end-to-end data flow is created.
- Multi-function or platform workflows - By extending the workflow, for example to underwriting and customer profiling, bundling opportunities may arise.
2) Drive value from data
New avenues of opportunity
It may be hard to put a number on the value of data but without it, intelligent workflows and the value that they create will not be possible.
The creation of data is increasing at a phenomenal rate, particularly as more organisations adopt IoT. The conversion of this into insights is what is driving innovation.
However, given the volume of data created and the variety of sources it comes from, the data must:
- Have focus / make a difference
- Be readily available and accessible
- Be trustworthy (be a single source of truth)
The big incumbents, already awash in data, are combining this with external data sources to form new pools of value. Data from car telemetry, trains, aircraft, and the like can anticipate organisation and customer needs, and intelligent workflows can determine next steps.
This is the automation of whole workflows, as opposed to the automation of repetitive tasks, as with Robotic Process Automation (RPA).
The Gestalt of Platform and data
The value of data multiplies when it is combined with more and more sets of data. One question that organisations have is how to get this additional data.
Data can be shared with partners or made freely available but giving away one’s own proprietary data comes with a potential cost. Organisations need to determine whether the data that they have can be shared and need to consider the following:
- Does keeping the data private create long-term, superior advantage?
- Does keeping the data private create only a short-term advantage?
- Would sharing the data lead to new combinations of data that can lead to superior long-term advantage?
Some supermarkets in the UK have even established open APIs to product data. Governments make available crime and other national data. All of this can be combined and result in value that is greater than the sum of its data parts.
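As a minimal sketch of how combined data can be worth more than the sum of its parts, consider joining proprietary data with an open data set. The stores, postcodes, and crime figures below are entirely made up for illustration, not taken from any real API:

```python
# Hypothetical example: joining proprietary store data with open
# government data to answer a question neither data set answers alone.

stores = [  # proprietary data
    {"store": "S1", "postcode": "AB1", "weekly_sales": 120_000},
    {"store": "S2", "postcode": "CD2", "weekly_sales": 80_000},
]
crime_rates = {"AB1": 4.2, "CD2": 9.7}  # open data: incidents per 1,000 people

# Combine the two, then flag stores where extra security spend may pay off.
combined = [{**s, "crime_rate": crime_rates[s["postcode"]]} for s in stores]
flagged = [s["store"] for s in combined if s["crime_rate"] > 8.0]
print(flagged)  # → ['S2']
```

Neither the sales data nor the crime data alone answers "where should we invest in security?"; the join does.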
Consumer protection and data scarcity
Imagine combining proprietary data with shared or publicly available data, creating value that gives competitive advantage and then having the externally sourced data made unavailable.
That is exactly the scenario which is facing organisations. Digital trails are disappearing as customers refuse consent for cookies. Data already held by organisations is susceptible to being deleted on demand. Regulation and greater personal awareness of data may result in externally (and internally) held data being denied.
While organisations have limited influence on the regulatory landscape, they have far more control over how they engender customer trust, and the more trust customers have, the less likely they are to ask for their data to be purged.
Data readiness, availability, and accessibility
To build intelligent workflows, and for employees, the “citizen data scientists”, to derive value, data must be accurate and clean (i.e. acceptable formats, characters, data types, etc.). Data scientists spend most of their time, 80% by some estimates, making data available and correctly formatted before they can do the ‘interesting’ modelling.
Organisations recognise that it is no longer sufficient simply to dump all their data into an unregulated data lake. Instead, cleansing, validation, and selection must take place before data is put into the lake and made readily accessible, with appropriate permissions, to the right people.
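A minimal sketch of the cleanse-and-validate gate described above, with field names, formats, and rules that are assumptions for this example rather than any standard schema:

```python
# Illustrative validation gate applied before records enter a data lake.
# The fields and rules (customer_id, claim_amount, ISO claim_date) are
# assumptions made up for this sketch.

from datetime import datetime

def validate(record: dict) -> bool:
    """Accept only records with the expected fields, types, and formats."""
    try:
        assert isinstance(record["customer_id"], str) and record["customer_id"]
        float(record["claim_amount"])                        # numeric check
        datetime.strptime(record["claim_date"], "%Y-%m-%d")  # ISO date check
        return True
    except (AssertionError, KeyError, ValueError):
        return False

raw = [
    {"customer_id": "C001", "claim_amount": "250.00", "claim_date": "2020-03-01"},
    {"customer_id": "", "claim_amount": "n/a", "claim_date": "01/03/2020"},
]
clean, rejected = [], []
for r in raw:
    (clean if validate(r) else rejected).append(r)
print(len(clean), len(rejected))  # → 1 1
```

Rejected records would be routed to a remediation queue rather than silently dropped, preserving the "single source of truth" property of the lake.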
Updating skills and tools
To enable a culture of data, people will need to be comfortable using data. Training can be given, but modern tools will also need to be provided. Real-time data visualisation tools give the “citizen data scientist” the ability to dig into data and derive their own insights. This removes the bottleneck created when organisations channel new report or data requests through a few highly knowledgeable or skilled people, and cuts the turnaround time for questions from weeks to hours.
Ethics and trusting AI
Decisions derived by AI must be transparent and traceable back to the logic that delivered them. Answers delivered by AI are only as good as the data used. Bias in the data must be constantly fought against, and privacy must always be respected; for example, a device that responds to voice must continually delete its data so that conversations are not recorded. A data code of ethics is needed.
Data is a two way deal
Customers are far more data savvy than they used to be. They are far less willing to allow their information to be used, and when it is, they have very high expectations of what it can be used for, and by whom. At the same time, they demand far more data in return. There are three guiding principles for earning trust and access to data:
- Transparency - Data influences purchasing decisions and needs to be available. Partners in a supply chain rely upon it, especially for advanced capabilities like automatic Just-In-Time (JIT) replenishment. Blockchains confer transparency and can trace products from source; olive oil, for example, is a commodity vulnerable to mis-labelling, and blockchain can be used to ensure its purity as it moves through the supply chain.
- Reciprocity - If organisations want to secure access to particular data, they need to be able to provide something useful in return.
- Accountability - Commitments to data security and brand trust are key to accessing more data.
3) Deploy through hybrid multi-cloud
Organisations that operate on multiple clouds may find it becomes harder to move data from one to the next as new business partners and services are added, or changes are made. To circumvent this, a hybrid multi-cloud infrastructure links on-premises IT with public and private clouds so that data and workflows can move effortlessly between them, all aligned under common policies covering security, compliance, and governance. Without this, the intelligent workflows we seek would stall.
The platforms that organisations use will likely be a mixture of their own which they operate, and others that they participate in. Workflows will need to span ecosystems and not just a single enterprise. Clouds will have to be interoperable with the clouds of other organisations.
Legacy and new applications will need to be evaluated on whether they should be housed in on-premises, private cloud, or public cloud environments. Going into those choices, organisations should ensure that their architectural decisions support open standards, which in turn support portability, interoperability, and scalability.
Container technologies package applications so that they can run, with their dependencies, isolated from other processes; the container can then be moved as a unit. Such technologies are the ‘middleware of change’ and allow applications to be built in a flexible way. Microservices break applications down into single-purpose services, meaning they can be deployed to any cloud without modification.
Don’t just migrate. Reinvent.
Moving to cloud entails strict controls for privacy and security. Because of this, many applications dependent on customer databases, transaction processing, finance, and supply chain manufacturing do not yet use cloud. It is hard, though, not to see the potential for re-engineering these areas with intelligent workflows.
Becoming a cloud service provider could be a new way for an organisation to monetise its data, as customer data held in private cloud could be curated and provided to selected partners. Of course, all this will be subject to data privacy policies and AI ethics.
Enterprise Experience and Humanity (4 of 4)