AI Servitisation

Products are less important than outcomes. We’ve seen that in the rise of music and video streaming services, which have eclipsed the market for physical media. It’s equally true in the workplace, where boxed software has been usurped by SaaS, and it’s evidenced by the rise of the ride-sharing economy pioneered by the likes of Uber and Lyft. Customers are less interested in owning a car and more interested in the outcome: being delivered to a given destination, a repeatable service for which they’re willing to pay many times over.

“The way organisations interact with customers is changing,” explains Accenture. “Traditional episodic encounters are being replaced by continuous personalised interactions.”

This creates previously unseen opportunities for ongoing revenue generation. So long as organisations know enough about their customers to keep satisfying their needs, they can shift from selling single products periodically to maintaining and developing an ongoing relationship that delivers smaller but more frequent transactions over time. With every interaction, the relationship deepens, and the manufacturer’s knowledge of its customer – and their needs – becomes more valuable and effective.

Servitisation and manufacturing 

Manufacturing organisations are therefore being forced to rethink their current operations as they look for opportunities to implement a servitisation model. Frequently, they can only do so by building an effective data pipeline. We can already see this in action.

High-value, high-tech products, like motor vehicles and jet engines, frequently send manufacturers a stream of data, which allows them to monitor fleets in real time, shorten the iteration cycle and provide supplementary, charged-for services. This is the basis of a classic servitisation model in which “manufacturing businesses can offer additional services to supplement their traditional products such as maintenance, keeping a fleet of vehicles on the road as a service,” explains Ruth Raistrick.

“Servitization is usually a subscription model and can be applied to most industries in one way or another; be that £xx/month for music, £xx/month to keep a fleet of vehicles on the road, or even £xx/month for the fleet – all in!” 
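To make this concrete, the sketch below shows what the receiving end of such a data stream might look like. It is a minimal illustration only: the Kafka topic name, broker address, message schema and temperature threshold are all assumptions, not details of any real fleet platform.

```python
import json

from kafka import KafkaConsumer  # pip install kafka-python

# Subscribe to a hypothetical telemetry topic; the broker address and
# JSON message schema are illustrative assumptions.
consumer = KafkaConsumer(
    "vehicle-telemetry",
    bootstrap_servers="broker.example.com:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

ENGINE_TEMP_LIMIT_C = 110  # hypothetical maintenance threshold

for message in consumer:
    reading = message.value  # e.g. {"vehicle_id": "VAN-042", "engine_temp_c": 97.4}
    if reading["engine_temp_c"] > ENGINE_TEMP_LIMIT_C:
        # In a live servitisation offering this would raise a maintenance
        # ticket, keeping the vehicle on the road as a paid-for service.
        print(f"Maintenance alert for vehicle {reading['vehicle_id']}")
```

The point is not the plumbing but the business model it enables: each alert is an opportunity to deliver a chargeable outcome, rather than waiting for an annual service booking.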

Not all businesses have the resources of a tier-one manufacturer like Volkswagen or Rolls-Royce and, fortunately, such resources aren’t always required. Where a business lacks the ability to analyse data at scale, it can instead rely on the ongoing democratisation of AI.

The democratisation of AI

While servitisation recognises that delivering outcomes matters more than delivering products, the same is true of the technology that underpins it: the ability to deliver a business outcome should always be the sole reason for implementing AI. It is the result, rather than the process or ownership of the tools, that counts.

The widespread availability of affordable cloud infrastructure, paired with pre-built intelligence, is opening up AI to a larger number of organisations, even if they lack the in-house expertise to develop their own algorithms. In doing so, it allows those who know a business best – its own managers – to self-serve, rather than commissioning purely technical departments to deliver reports on their behalf. 
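As one hedged illustration of that pre-built intelligence, the snippet below calls a pre-trained image-labelling API (Google’s Cloud Vision, here) instead of building a model in-house. The input filename is a placeholder, and the call assumes cloud credentials are already configured.

```python
from google.cloud import vision  # pip install google-cloud-vision

# A pre-trained model consumed over an API: no in-house data science
# team or custom algorithm development is required.
client = vision.ImageAnnotatorClient()

with open("product_photo.jpg", "rb") as f:  # placeholder input image
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    print(label.description, round(label.score, 2))
```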

“Democratisation makes AI available and accessible to the breadth of talent in an enterprise. Business users know the business in and out. Enabling them to build AI-powered applications using visual application development platforms, including those with drag-and-drop functionalities, can close the gap in data science talent,” says IDC’s Ritu Jyoti, speaking to IBM. She continues, “Most people think AI is all about technology, but it’s really more about business outcomes. Algorithms are important, but aligning to business goals brings greater relevance and competitive edge.”

Merit’s own Senior Service Delivery Manager, who has helped several businesses across the UK to implement such AI technologies, says “The nanosecond processing algorithms are no longer limited to just Wall Street Trading. Algorithms have come a long way in creating market leaders.”  

The requirements of democratisation 

Full democratisation requires work in two key areas, according to Joe Hellerstein, co-founder and CSO of Trifacta: human/AI interfaces, so we can “focus on AI as an augmentation of human work, not a replacement”, and bringing people together across skill sets, so workers can go about their business as they see fit, while sharing successes. 

Low-code and no-code tools, like Google’s Vertex AI and Amazon’s SageMaker, have a key role to play here, giving business leaders the opportunity to develop their own AI applications at speed.

“Low-code and no-code AI tools provide organisations the opportunity to close the gap with the help of citizen data scientists that won’t need an AI expert to build custom AI solutions for many scenarios,” says Rene Schulte at Valorem Reply.
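As a rough sketch of what that looks like in practice, the snippet below uses Vertex AI’s AutoML training flow: the business user supplies a dataset and a target column, and the platform selects and tunes the model. The project, region, bucket path and column names here are hypothetical.

```python
from google.cloud import aiplatform  # pip install google-cloud-aiplatform

# Project, region and the Cloud Storage path are assumptions for illustration.
aiplatform.init(project="my-project", location="europe-west2")

dataset = aiplatform.TabularDataset.create(
    display_name="customer-churn",
    gcs_source="gs://my-bucket/churn.csv",
)

# AutoML chooses and tunes the model; the "citizen data scientist"
# specifies only the data, the task type and the target column.
job = aiplatform.AutoMLTabularTrainingJob(
    display_name="churn-automl",
    optimization_prediction_type="classification",
)
model = job.run(dataset=dataset, target_column="churned")
```

The drag-and-drop consoles that Jyoti and Schulte describe sit a level above even this: the same workflow with no code at all.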

Democratisation through cloud implementation 

Cloud is central to the democratisation of AI. Not only does it allow business managers to draw on the experience of experts in infrastructure and intelligence development, but it also makes it possible to scale up and down on the fly.

When a new product is introduced, they can spin up additional capacity to cope with an initial rush; when a legacy service is retired, they can easily close off that capacity or direct it elsewhere. Pay-per-use and responsive scaling likewise mean that costs directly follow revenue opportunities, allowing for more predictable, sustainable servitisation business models.
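A minimal sketch of what that responsive scaling can look like, using AWS Application Auto Scaling as one example; the cluster and service names are hypothetical, and equivalent mechanisms exist on every major cloud.

```python
import boto3  # pip install boto3

autoscaling = boto3.client("application-autoscaling")

# Register a (hypothetical) container service as a scalable target.
autoscaling.register_scalable_target(
    ServiceNamespace="ecs",
    ResourceId="service/prod-cluster/recommendation-api",
    ScalableDimension="ecs:service:DesiredCount",
    MinCapacity=2,
    MaxCapacity=50,
)

# Track CPU utilisation so capacity, and therefore cost, follows demand.
autoscaling.put_scaling_policy(
    PolicyName="track-cpu-70",
    ServiceNamespace="ecs",
    ResourceId="service/prod-cluster/recommendation-api",
    ScalableDimension="ecs:service:DesiredCount",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 70.0,
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ECSServiceAverageCPUUtilization"
        },
    },
)
```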

This lowers the barriers to entry for organisations that lack the resources to implement their own on-premises or bespoke AI platform, and allows them to compete with larger players in a way that has never previously been possible.

Responsible servitisation 

Yet just because it’s possible for smaller organisations to implement AI doesn’t mean it should be done without due care and consideration. Done well, it can reinvent their business; implemented poorly, it can do more harm than good.

“As companies move toward democratisation, a cautionary tale is emerging,” writes PwC Global AI Lead, Anand Rao. “Even the most sophisticated AI systems, designed by highly qualified engineers, can fall victim to bias and can be difficult to explain.

“An AI system built by someone without proper training or that is operated without appropriate controls could create something outright dangerous, introducing discrimination or serious errors. Worse, the problems might not become evident until after a system has been implemented, leaving companies scrambling to reassure stakeholders, undo the damage, and fix the tech.”
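One of the simplest of those controls can be sketched in a few lines: monitoring outcomes per group and flagging a disparate impact ratio below the widely used four-fifths threshold. The data here is a toy example, and a real deployment would need far more than this single check.

```python
import pandas as pd

# Toy model outputs: one row per applicant, recording the decision
# and a protected attribute collected for monitoring purposes.
results = pd.DataFrame({
    "group":    ["A", "A", "A", "A", "B", "B", "B", "B"],
    "approved": [1,   1,   1,   0,   1,   0,   0,   0],
})

# Approval rate per group; the "four-fifths rule" flags cases where the
# lowest group's rate falls below 80% of the highest group's rate.
rates = results.groupby("group")["approved"].mean()
disparate_impact = rates.min() / rates.max()

if disparate_impact < 0.8:
    print(f"Possible bias: disparate impact ratio is {disparate_impact:.2f}")
```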

Merit Group’s expertise in AI 

At Merit Group, we work with some of the world’s leading B2B intelligence companies like Wilmington, Dow Jones, Glenigan, and Haymarket. Our data and engineering teams work closely with our clients to build data products and business intelligence tools. Our work directly impacts business growth by helping our clients to identify high-growth opportunities.   

Our specific services include high-volume data collection, data transformation using AI and ML, web watching, BI, and customised application development.   

The Merit team also brings deep expertise in building real-time data streaming and data processing applications. Our data engineering team has specific expertise in a wide range of data tools, including Airflow, Kafka, Python, PostgreSQL, MongoDB, Apache Spark, Snowflake, Tableau, Redshift, Athena, Looker, and BigQuery.

If you’d like to learn more about our service offerings, please contact us here: https://www.meritdata-tech.com/contact-us 

Related Case Studies

  • Enhancing News Relevance Classification Using NLP

    A leading global B2B sports intelligence company that delivers a competitive advantage to businesses in the sporting industry by providing commercial strategies and business-critical data had a specific challenge.

  • High-Speed Machine Learning Image Processing and Attribute Extraction for Fashion Retail Trends

    A world-leading authority on forecasting consumer and design trends had the challenge of collecting, aggregating and reporting on millions of fashion products spanning multiple categories and sub-categories within 24 hours of them being published online.