Big data: the big challenge

Big data analysis holds out the promise of big improvements in supply chain performance. But, asks Malory Davies, is your organisation ready to take on the challenge?


First published in Supply Chain Standard, March 2015.

Big data has become a major issue in supply chain – we now have the ability to collect and store vast amounts of information and it would be incredibly valuable, if only we could actually process it fast enough. That’s the theory – achieving it is something else.

The scale of the data mountain we are now creating was highlighted in EMC Corporation’s seventh EMC Digital Universe study released last year.

This highlighted the fact that the digital universe is doubling in size every two years and will multiply ten-fold between 2013 and 2020 – from 4.4 trillion gigabytes to 44 trillion gigabytes. It puts this down to the “Internet of Things”, the emergence of wireless technologies, smart products and software-defined businesses.

The EMC study, entitled “The Digital Universe of Opportunities: Rich Data and the Increasing Value of the Internet of Things,” with research and analysis by IDC, suggests that today the average household creates enough data to fill 65 iPhones (32GB) per year. By 2020, this will grow to 318 iPhones.

But one of the problems is that there is no common view of what “big data” actually is. Lance Mercereau, chief marketing officer at Rosslyn Analytics, says: “Big data means different things to different people. To some, it’s a technology and to others it is a type of data. At Rosslyn Analytics, big data is merely a definition of complexity. Data is data. It’s also highly valuable. It’s why data should be viewed as a strategic asset in every organisation.”

Unfortunately, most teams don’t have the data they require to make decisions, says Mercereau. “Forrester Research estimates that 90 per cent of data in an organisation is not accessible by the business. Of the remaining ten per cent, half is of such poor quality it can’t be used. This leaves people making decisions using just five per cent of an organisation’s data. Something has to change.

“When data is in the hands of decision-makers, it’s transformational for the entire organisation. Data can reveal which of your suppliers may go bankrupt tomorrow, potentially disrupting your supply chain.”

There are new tools coming to market to analyse all this data, and Dawn Howarth of Oliver Wight recognises the value that big data analytics can add. But she sounds a note of warning – many organisations are not ready for big data. In fact, she says, some struggle to manage “little data” effectively. She describes “little data” as the boring day-to-day detail that nobody likes and hardly anybody looks after.

“You have got to have a level of maturity in the business before you can make the most of big data – you need underlying solid business processes.”

Most companies don’t understand the true potential of big data, she says – and are three to five years away from being able to exploit it.

She points out that companies that don’t have an effective planning process in place risk becoming distracted by the “here and now” – and there is a clear danger in focusing on that when the business needs to be planning longer term.

Howarth argues that big data analysis should be seen as part of an integrated business planning process, looking some 24 months ahead.

More data doesn’t necessarily equal better data, so you need to know what you want to achieve at the start.


Collaboration and market segmentation are where big data comes into its own, says Howarth. The retail sector is probably the most advanced in its use of big data, as it relates directly to the customer. Big data is probably less important in organisations further back along the supply chain, as there is simply less data available.

The benefits of effective use of big data were highlighted by an Accenture study last year (see table: Accenture study, below).

The Accenture study found that an enterprise-wide strategy enabling a company to use big data to drive business value is most strongly correlated with big data analytics success. A supply chain-specific big data strategy is the next best choice: while not as strongly correlated with success as an enterprise-wide strategy, it is still better than an ambiguous big data strategy focused on a few specific processes.

Accenture surveyed more than 1,000 senior executives primarily at large global companies for its report: Big Data Analytics in Supply Chain: Hype or Here to Stay?

A second key to generating more substantial returns is to embed analytics in day-to-day supply chain operations. This was found to generate more significant and far-reaching benefits than using big data analytics on an ad hoc basis in limited areas of focus.

Companies that employ a dedicated team of data scientists are far more likely to generate a range of important supply chain benefits from their use of big data analytics.

The report also highlights the fact that big data analytics is a sizeable investment and must be thoroughly thought through in concert with the company’s overall data and analytics strategy, and with the outcome and supporting business case fully understood.

Mercereau points out that the benefits of big data are greater when it is viewed not as a project but as a continuous analytical exercise that delivers a constant stream of information. “The benefits are also greater when it’s a journey, starting with short-term wins before moving on to other value-added initiatives. The journey typically starts by ensuring your organisation is exploiting its internal supply chain data before adding external sources for a bigger, holistic view.

“Manufacturers with complex supply chains need to understand not just the direct suppliers they buy from, but also those who indirectly contribute components or services across the extended supply chain. Start by mapping your entire supply chain, assessing the reliability of key suppliers and identifying secondary suppliers in case a factory, for example, is closed by a tsunami or industrial action.



“One data solution that is increasingly being used by organisations is geo-location data. This involves placing suppliers on a map so the entire supply chain is visible. This is the baseline. Tagging suppliers with other information such as credit scores will allow procurement and supply chain teams to monitor in real-time the financial health of suppliers, informing them when suppliers fall below a certain pre-set ‘safe level’. The key is adding as much relevant information to the suppliers to paint a complete and rich picture of the entire interconnected supply chain. Rosslyn has developed technologies designed specifically for business users to access and turn complex data into meaningful information via its RAPid cloud data platform.”
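The monitoring Mercereau describes – tagging mapped suppliers with credit scores and alerting when one falls below a pre-set ‘safe level’ – can be sketched in a few lines. This is a minimal illustration only, not Rosslyn’s RAPid platform: the supplier names, scores and the threshold of 40 are all made up.

```python
from dataclasses import dataclass

@dataclass
class Supplier:
    name: str
    lat: float          # geo-location, so the supplier can be placed on a map
    lon: float
    credit_score: int   # 0-100, assumed to come from an external credit-data feed

SAFE_LEVEL = 40  # illustrative pre-set 'safe level'

def at_risk(suppliers):
    """Flag suppliers whose credit score has fallen below the safe level."""
    return [s.name for s in suppliers if s.credit_score < SAFE_LEVEL]

suppliers = [
    Supplier("Acme Components", 51.5, -0.12, 72),
    Supplier("Globex Plastics", 35.7, 139.7, 31),
]
print(at_risk(suppliers))  # → ['Globex Plastics']
```

In a real deployment the score would be refreshed continuously from the credit feed, so the alert list reflects the current financial health of the supply base rather than a one-off snapshot.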

Mercereau highlights three main challenges to implementing a data project. “These are, in sequence: one, the inability of organisations to access data that resides in so many disparate IT and reporting systems; two, transforming poor and incomplete data into actionable information so it can be analysed; and three, enabling decision-makers not just to analyse the newly improved data but to continuously change and enhance it on the fly so it is always relevant and up to date.

“Fortunately, technologies have rapidly evolved and come onto the market that solve these three traditional challenges to implementing data projects.”
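Mercereau’s three steps amount to a small extract-cleanse-enrich pipeline. The sketch below is purely illustrative – the field names, records and enrichment data are invented, and real tooling of the kind he describes would handle far messier sources.

```python
def extract(sources):
    """Step one: pull records out of disparate systems into one list."""
    return [row for src in sources for row in src]

def cleanse(rows):
    """Step two: turn poor and incomplete data into usable records."""
    cleaned = []
    for row in rows:
        if not row.get("supplier"):      # drop rows missing the key field
            continue
        row["supplier"] = row["supplier"].strip().title()
        cleaned.append(row)
    return cleaned

def enrich(rows, extra):
    """Step three: let users enhance records on the fly with new attributes."""
    for row in rows:
        row.update(extra.get(row["supplier"], {}))
    return rows

# Two hypothetical source systems with inconsistent, partly empty data:
erp = [{"supplier": " acme corp ", "spend": 120}]
crm = [{"supplier": "", "spend": 5}, {"supplier": "globex", "spend": 80}]
data = enrich(cleanse(extract([erp, crm])),
              {"Acme Corp": {"credit_score": 72}})
print(data)
```

Each function maps onto one of the three challenges; in practice the “enrich” step would run continuously rather than once, which is what keeps the data relevant and up to date.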

Ultimately, says Mercereau, big data can be transformational for an organisation. “Data is increasingly viewed as a strategic asset by organisations. It’s not surprising that a number of the largest companies are hiring chief data officers who are responsible for ensuring data is properly managed and taken care of for use by the business.”
Accenture study: Results of big data use

Improvement in customer service and demand fulfilment of 10% or greater: 46%
Faster and more effective reaction time to supply chain issues: 41%
Increase in supply chain efficiency of 10% or greater: 36%
Greater integration across the supply chain: 36%
Optimisation of inventory and asset productivity: 33%
More effective S&OP process and decision making: 32%
Improved cost to serve: 28%

(Source: Accenture report, Big Data Analytics in Supply Chain: Hype or Here to Stay?)
SCS Survey: industry leaders give their view

SCS surveyed more than 540 senior supply chain professionals to get their views on the impact of Big Data.

It found that just over a third of companies are actually making use of big data at the moment. However, one in five were not sure.

At the same time, about 40 per cent said they thought their competitors were making use of big data. And the thought of ceding a competitive advantage to a rival has certainly had an effect: some 20 per cent said they had made deliberate investments or changes, while 40 per cent said they had been encouraged to look at it. However, a few (one per cent) thought the whole big data issue was just hype.

Does the ability to use big data give a significant competitive advantage? Three out of ten said yes. But 40 per cent thought any advantage would probably be marginal.

And it was clear from the survey that most people thought retail was leading the way in the use of big data – the only industry sector to score over 50 per cent. It was followed by hi-tech (43 per cent) and automotive (40 per cent). However, it was seen as significantly less useful in manufacturing and third party logistics – just 15 per cent each.


Technology: Internet of Things builds data mountain

The EMC Digital Universe study looks at the impact of the growth of the Internet of Things – billions of everyday objects that are equipped with unique identifiers and the ability to automatically record, report and receive data.

According to IDC, which was responsible for the research, the number of devices or things that can be connected to the internet is approaching 200 billion today, with seven per cent (or 14 billion) already connected to and communicating over the internet. The data from these connected devices represents two per cent of the world’s data today. IDC now forecasts that, by 2020, the number of connected devices will grow to 32 billion – representing ten per cent of the world’s data.

The Internet of Things will also influence the massive amounts of “useful data” – data that could be analysed – in the digital universe. In 2013, only 22 per cent of the information in the digital universe was considered useful data, but less than five per cent of the useful data was actually analysed – leaving a massive amount of data lost as dark matter in the digital universe. By 2020, more than 35 per cent of all data could be considered useful data, thanks to the growth of data from the Internet of Things.

This phenomenon will present radical new ways of interacting with customers, streamlining business cycles, and reducing operational costs, stimulating trillions of dollars in opportunity for businesses. Conversely, it presents significant challenges as businesses look to manage, store and protect the sheer volume and diversity of this data.

Currently, 60 per cent of data in the digital universe is attributed to mature markets such as Germany, Japan, and the United States, but by 2020, the percentage will flip, and emerging markets including Brazil, China, India, Mexico and Russia will account for the majority of data.


Case study: Big data under scrutiny

Leanne Lynch, head of operations planning at British American Tobacco, will discuss the use of technology and big data to drive supply chain improvements at the Logistics and Supply Chain Conference in London later this month.

Lynch will look at how BAT uses analytics to gain insight into true performance, how to define the need for change through accurate project scoping and requirements, and how the firm ensures delivery and on-going improvements through benefits tracking.

The Logistics and Supply Chain Conference takes place at Dexter House in central London from Wednesday 18th to Thursday 19th March 2015.


Case study: John Lewis picks Stibo system for PIM

John Lewis has selected Stibo Systems’ STEP product information management system as a key part of its omni-channel retail strategy.

The aim is to deliver accurate and specific product information consistently across all of the retailer’s channels. In addition, John Lewis expects to reduce the time it takes to bring products to market and drive operational improvements in handling customer queries and processing product returns.

“Presenting customers with compelling product content is a critical part of being a leading omni-channel retailer. Our investment in Stibo Systems STEP will significantly enhance our capability to deliver great content in a timely manner,” says Dave Suddock, head of buying & brand operations at John Lewis.

The STEP platform is designed to help companies extract and consolidate product, customer and supplier information from a variety of systems. Once the information is rationalised, STEP captures and maintains all product attributes and relationships via a rich data model, while cleansing the data to prevent inaccurate information from being loaded into the system and maintaining data integrity. STEP also shares operational information with a company’s applications and analytical systems based on predetermined distribution mechanisms.

