Roberto Maranca1 and Michele Staiano2, 1Data Excellence-Schneider Electric, UK, 2University of Napoli Federico II, Italy
“Data supply chains” (DSCs), which connect the point where physical information is digitized to the point where the data is consumed, are growing longer and more convoluted. Although plenty of frameworks have emerged in the recent past, none of them, in the authors’ opinion, has so far provided a robust set of formalised “how-tos” that would link a well-built DSC to a higher likelihood of achieving the expected value. This paper aims to present: (i) a generalized model of the DSC in its constituent parts (source, target, process, controls), and (ii) a quantification methodology that links the underlying current quality, as well as the legacy “bad data”, to the cost or effort of attaining the desired value. Such an approach offers a practical and scalable model that enables restructuring some data management practices at their foundation, priming them for the digital challenges of the future.
Data Management, Data Supply Chain, Quality, Complexity, Value.