“Better use of data can help organizations of every kind succeed – across the public, private and third sectors. It can support the delivery of existing services, from manufacturing to logistics, and it can be used to create entirely new products,” wrote UK digital secretary Oliver Dowden. “Data is now the driving force of the world’s modern economies. It fuels innovation in businesses large and small and has been a lifeline during the global Coronavirus pandemic.”

It’s easy to see why no business can survive in isolation. From the smallest enterprise, through organizations working across the European market, to the largest multinational, each relies on deep connections with suppliers, customers, and intermediaries. These connections can only be managed effectively with the free flow of data throughout the business ecosystem.

This data may be particular to the organization itself, like customer lists and records of past sales, or it may be public or bought in. As long as it’s accurate, its origin doesn’t matter. What’s important is the insight it can deliver.

The business ecosystem

Each business sits at the centre of a unique ecosystem that stretches further than most recognize. It is both a supplier and a consumer, relying on the expertise, services, and products of others while providing the same in return.

However, the physical assets that form the visible surface of this ecosystem are less valuable than the data that underpins it. Organizations must ensure these data foundations are stable so that they can contribute to the ecosystem’s ongoing success.

Controlling the flow of data with APIs

APIs have long been important conduits for the data on which organizations rely, allowing them to share a subset of their digital assets with external service providers. As such, they sit at the heart of the data-driven business ecosystem. They give the business the space it needs to concentrate on revenue-driving activities while logistics, accounting and cloud productivity providers – to name just three – perform ancillary functions that don’t require in-house expertise.

The benefits of using APIs

Using an API, the business controls how much of its data it exposes, and in which format, without having to open up its internal systems, build an external portal, or work through an ETL stage. An API also provides an effective means of terminating a provider’s access at the end of the contracted period.
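As a minimal sketch of this pattern, the Python example below uses Flask (an illustrative choice, not a prescription) to expose an agreed subset of customer fields and to reject a provider’s key once its contract has lapsed. The field names, token store, and dates are all hypothetical.

```python
# Minimal sketch of a data-sharing API, assuming Flask is installed.
# The field names, token store, and expiry dates are hypothetical.
from datetime import date
from flask import Flask, abort, jsonify, request

app = Flask(__name__)

# Hypothetical provider tokens mapped to contract end dates.
PROVIDER_TOKENS = {"logistics-partner-token": date(2025, 12, 31)}

# Internal records hold more fields than partners ever see.
CUSTOMERS = [
    {"id": 1, "name": "Acme Ltd", "city": "Leeds", "credit_limit": 50_000},
]

EXPOSED_FIELDS = ("id", "name", "city")  # the agreed, shared subset

@app.route("/api/customers")
def customers():
    token = request.headers.get("X-API-Key", "")
    expiry = PROVIDER_TOKENS.get(token)
    if expiry is None or date.today() > expiry:
        abort(403)  # access ends with the contract
    # Expose only the agreed subset, in a provider-friendly format.
    return jsonify([{f: c[f] for f in EXPOSED_FIELDS} for c in CUSTOMERS])

if __name__ == "__main__":
    app.run()
```

The point of the sketch is the shape of the control: the data owner decides which fields leave the building, and revocation is a one-line change to the token store rather than a systems project.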

“Outsourcing non-core activities can improve efficiency and productivity because another entity performs these smaller tasks better than the firm itself,” notes Investopedia. “This strategy may also lead to faster turnaround times, increased competitiveness within an industry and the cutting of overall operational costs.”

Where outsourcing delivers these benefits, it is a sign that the shared data is making a positive contribution to overall value within the enterprise.

Consulting is key

Data can generate profit by delivering the insight required to formulate new products and services. It can also reduce costs by projecting into the future to warn of upcoming problems, or of apparent opportunities that, without accurate data in play, would ultimately fail to deliver on their promise.

This is most effectively done by combining owned data with alternative data, which gives it greater context. Alternative data, like foot traffic, weather forecasts and demographics, can be mined for insights that enrich those already derived through existing BI (business intelligence) activities.

It is vital, therefore, that organizations consult with experts in their own field, and with partners in the wider business ecosystem, to enrich the data they already hold, drive insight, and push the business forward. They should be looking for non-core sources, like pricing adjustments and interest rates, which could impact demand for products and services; currency fluctuations when planning logistics and shipping; and changing tastes as they ideate for the future.
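As an illustration of how owned and alternative data can be combined, the following Python sketch (assuming pandas is available; the figures are invented) joins daily sales with weather data and checks how closely the two move together.

```python
# Sketch: enriching owned sales data with alternative (weather) data.
# Assumes pandas; column names and figures are illustrative only.
import pandas as pd

sales = pd.DataFrame({
    "date": pd.to_datetime(["2021-06-01", "2021-06-02", "2021-06-03"]),
    "units_sold": [120, 95, 180],
})

weather = pd.DataFrame({
    "date": pd.to_datetime(["2021-06-01", "2021-06-02", "2021-06-03"]),
    "max_temp_c": [18.0, 14.5, 26.0],
})

# Join on date so each day's sales carry their weather context.
enriched = sales.merge(weather, on="date")

# A simple correlation hints at how external conditions drive demand.
print(enriched["units_sold"].corr(enriched["max_temp_c"]))
```

Real enrichment pipelines are, of course, larger, but the principle is the same: the alternative data adds a column of context that the owned data could never supply on its own.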

Latency and its impact on success

Data must be timely if the insight it delivers is to have any value. However, the exact definition of ‘timely’ will depend on context.

When analysing financial markets, fractions of a second make a difference; when planning for the launch of a new car, longer-term trends are more important. In either instance, though, the latency, or lag between the data being generated and its readiness for use, must be kept to a minimum.

Latency is important on two levels. Traditionally, it is discussed in the context of network infrastructure, and the rate at which it can deliver a response. Minimizing latency at this level remains a key consideration for financial trading organizations and other businesses for which split-second timing can impact profits and viability.

Methods for reducing latency

Solutions can include moving the data centre closer to the origin of external data, or moving owned data to the edge. When operating in a region with broadly compatible data handling and protection regulations, such as the European market, relocating data in this way is significantly less complex than it could be in more fragmented territories.
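For a rough sense of whether such a relocation is paying off, round-trip latency can be sampled before and after the move. The Python sketch below (assuming the requests library; the URL is a placeholder) measures the median round trip to a data source.

```python
# Sketch: sampling end-to-end latency to a data source, e.g. before and
# after relocating it. Assumes the `requests` library; the URL is a
# placeholder, not a real endpoint.
import time
import requests

def median_latency_ms(url: str, samples: int = 20) -> float:
    """Return the median round-trip time for a simple GET, in milliseconds."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        requests.get(url, timeout=5)
        timings.append((time.perf_counter() - start) * 1000)
    timings.sort()
    return timings[len(timings) // 2]

print(median_latency_ms("https://data.example.com/feed"))
```

The median is used rather than the mean so that a single slow outlier doesn’t distort the picture of typical performance.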

In the broader context, latency isn’t something any business can afford to overlook as, where data is concerned, ‘better late than never’ doesn’t always apply.

To thrive within a business ecosystem, organizations must ensure that the insights on which they base decisions are relevant and timely. Insights derived from stale data are less valuable at best, and at worst can feed decisions that fail to maximize profit.

Solutions that deliver value

Data should always be used with the aim of delivering value, although this doesn’t always equate to a monetary return in the short term. Value could represent an investment for the future, an improvement in ongoing practices or an efficiency in one area that will deliver value in a seemingly unconnected department.

More efficient use of personnel data may not only reduce recruitment costs, for example, but could also enable revenue-generating departments to re-staff more quickly.

Data-backed business intelligence

Data-backed business intelligence helps organizations understand their operating environment, the outcomes of past decisions, and opportunities for the future. It relies on a stream of timely data, such as a 360-degree view of customers, suppliers, and the larger market, much of which is generated and iterated in-house.

Sales data can be used to enrich customer records and recommend future purchases. This kind of recommendation engine has helped Amazon become one of the world’s biggest retailers. “Once it recognizes patterns in what customers buy, which customers have similar buying patterns, and what items are bought interchangeably or paired together, it can curate the perfect product pitch for any situation,” explains MDM, which also highlights Amazon’s low bounce rate and longer dwell time. Clearly, the data is delivering immediate value not only to Amazon, but to its customers, too.
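A recommendation engine of this general kind can be sketched from simple co-occurrence counts over past baskets, as in the Python example below. The basket data is invented, and Amazon’s production systems are of course far more sophisticated.

```python
# Sketch of "bought together" recommendations from past baskets, using
# co-occurrence counts. The basket data is illustrative only.
from collections import Counter
from itertools import combinations

baskets = [
    {"camera", "sd_card", "tripod"},
    {"camera", "sd_card"},
    {"tripod", "lens"},
    {"camera", "lens", "sd_card"},
]

# Count how often each pair of items appears in the same basket.
pair_counts = Counter()
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        pair_counts[(a, b)] += 1

def recommend(item: str, top_n: int = 3) -> list[str]:
    """Items most often bought alongside `item`."""
    scores = Counter()
    for (a, b), n in pair_counts.items():
        if a == item:
            scores[b] += n
        elif b == item:
            scores[a] += n
    return [other for other, _ in scores.most_common(top_n)]

print(recommend("camera"))  # ['sd_card', 'tripod', 'lens']
```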

Analysing aggregate data from a large user base

Many SaaS (Software as a Service) providers analyse aggregate data from their large user bases to better understand their own products: how they are used in the wild, and how they can be expanded through the addition of relevant features.

More immediately, it also provides the underlying data required by machine learning and artificial intelligence to proactively forecast and remedy potential issues before they become costly points of failure. This reduces maintenance costs, minimises user churn and, in both of these ways, derives value from the gathered metrics.
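As a simplified illustration of that idea, the following Python sketch flags a metric reading that strays well outside its recent range, the kind of early warning a production system would derive with far richer models. The data and thresholds are invented.

```python
# Sketch: flagging anomalous usage metrics before they become failures,
# using a rolling mean and standard deviation. Data and thresholds are
# illustrative; production systems would use richer ML models.
import statistics

def flag_anomalies(metrics: list[float], window: int = 5, threshold: float = 3.0):
    """Yield indices where a reading sits more than `threshold` standard
    deviations from the mean of the preceding `window` readings."""
    for i in range(window, len(metrics)):
        recent = metrics[i - window:i]
        mean = statistics.mean(recent)
        stdev = statistics.stdev(recent)
        if stdev and abs(metrics[i] - mean) > threshold * stdev:
            yield i

error_rates = [0.9, 1.1, 1.0, 0.8, 1.2, 1.0, 0.9, 6.5, 1.1]
print(list(flag_anomalies(error_rates)))  # [7], the spike at index 7
```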

Data-driven agile work processes

Consulting data as part of an agile working pattern allows teams to monitor progress and the impact of past actions in real time. Working in ‘sprints’ of four weeks or less, with ongoing collection and regular analysis of metrics, helps identify potential missteps before they impact future work blocks, potentially avoiding the cost of remedial action.
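A minimal illustration: the Python sketch below (with invented velocity figures) raises a warning when the latest sprint’s velocity drops sharply against the running baseline, the sort of signal this kind of regular analysis is meant to surface.

```python
# Sketch: watching sprint metrics so a slipping trend surfaces before
# the next work block. The velocity figures are illustrative only.
def velocity_warning(velocities: list[float], drop_pct: float = 20.0) -> bool:
    """True if the latest sprint's velocity fell more than `drop_pct`
    percent below the average of the previous sprints."""
    *previous, latest = velocities
    baseline = sum(previous) / len(previous)
    return latest < baseline * (1 - drop_pct / 100)

sprint_velocities = [32, 30, 34, 31, 22]  # story points per sprint
if velocity_warning(sprint_velocities):
    print("Velocity dropped sharply; investigate before the next sprint.")
```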

Each of these use cases is different, and they can be deployed individually or in combination, depending on the business model. While they’re not directly comparable, it is nonetheless possible to analyse how, and to what extent, they deliver value for the business.

“The most important metric to track for value delivery is time to value,” says Retina AI, which recommends that “this should take precedence over magnitude of value. End-users would prefer to receive value faster even if it’s small than to wait a long time and receive some magnificent cathedral of value.”

Infrastructure and expertise are evolving

To maximize its value, it is essential that data remains portable and is not locked inside silos. This has long been recognized, but it is becoming more important as SaaS and PaaS (Platform as a Service) emerge as the dominant models for responsive IT. These allow organizations to spin up and decommission resources on the fly in response to changing business needs.
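As a concrete, hedged example of on-the-fly provisioning, the Python sketch below uses AWS’s boto3 SDK to create and then terminate a compute instance; the image id and instance type are placeholders, and any cloud provider’s SDK would follow a similar shape.

```python
# Sketch: spinning up and decommissioning a compute resource on the fly,
# via AWS's boto3 SDK. The AMI id and instance type are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")

# Provision a resource in response to demand...
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder image id
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]

# ...and decommission it when the need passes, so costs track usage.
ec2.terminate_instances(InstanceIds=[instance_id])
```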

In 2020, spending on cloud infrastructure services outstripped that on traditional data centre hardware and software for the first time. Portable data frees the organization to invest in ‘soft’ infrastructure like this rather than physical hardware, allowing for more efficient, focused management and, often, improved ‘value’ through reduced costs.

Well-known names, including Google, Amazon, Microsoft and ServiceNow, are heavily invested in providing PaaS to clients of all sizes, while the likes of Salesforce, Zendesk and Cisco Webex deliver more focused SaaS offerings that satisfy core business needs. In each instance, a large proportion of the configuration and maintenance tasks are managed by the service provider itself.

ServiceNow and the future of data

Santa Clara-based ServiceNow provides PaaS infrastructure and digital workflows, which allow its clients to use their own data to optimize productivity, cost, and resilience. It plugs the expertise gaps in its clients’ organizations, allowing them to maximize the value of their data by making ServiceNow – a third party – an integral part of their business ecosystem.

In September 2020, it released a significant update, which further expanded its services across a range of industries, including “applications to help banks issue credit cards and to enable telcos to manage services and network performance, showing how serious the company is about organizing workflows across the enterprise, not just in the IT department”.

But ServiceNow isn’t as prescriptive as that might make it sound: with low-code and no-code tools rolled out in March 2021, the platform also allows customers to build their own processes, which run in the cloud and use their own data.

Could this be the future? If ServiceNow’s phenomenal growth and market value are anything to go by, it certainly looks that way. In July 2021, the company beat analysts’ profit forecasts by close to 190% and, in less than 20 years, it has gone from start-up status to revenues in excess of $3bn. ServiceNow’s growth neatly demonstrates how it’s possible to build a business around extracting value from data.
