Supply chain management has undergone a paradigm shift over the years, evident in the move from a traditional focus on cost reduction and efficiently balancing demand and supply (Lambert, Cooper and Pagh, 1998) to a modern-day focus on cost effectiveness, reliability, predictability and quality leadership (Stock, 2013). With recent advancements in technology and the increasing pressure of globalization acting as antecedents, companies are increasingly reliant on agile, reliable supply chains which can effectively ensure “a bidirectional flow of information, products and money between the initial suppliers and final customers through different organizations” (Nurmilaakso, 2008). Consequently, supply chains are now considered a key determinant of a company’s competitive advantage, such that competition is no longer between individual companies but between their supply chains (Gunasekaran and Ngai, 2004; Ketchen and Hult, 2007).
The change in the supply chain paradigm has also coincided with a shift in customer preferences, with customers increasingly seeking higher quality products as opposed to mass-produced generic products (Deveshwar and Rathee, 2010). A typical illustration of this can be seen in Toyota’s rise to dominance in the automobile sector: using its lean manufacturing principle, which is focused on quality leadership, Toyota overtook industry mainstays such as Ford to become the largest car manufacturer in the world (Clark, 2006). Evidently, quality assurance has positive impacts on the profitability of a business, while quality challenges negatively impact its competitive advantage and sustainability. Consider the following supplier quality issues which adversely impacted the market shares and brand image of the businesses involved:
- The Samsung Galaxy Note 7 explosions caused by faulty batteries supplied by battery manufacturers (Shamsi, Ali and Kazmi, 2017)
- The horsemeat scandal, in which supplied products labelled as beef actually contained horsemeat (Crane and Brown, 2013)
- The Kobe Steel scandal, in which the steel manufacturer falsified reports stating that products supplied to customers such as Toyota and Boeing met the required quality specifications (Shane, 2017).
Various authors (Gimenez, Sancha and Mendoza, 2008; Groznika and Trkman, 2012; Ambe, 2014; Bala, 2014; Brosch, 2015) cite quality inconsistencies as one of the significant challenges in supply chain management, with specific emphasis on quality management and the inability to accurately evaluate the quality of materials supplied to companies which depend heavily on the efficiency of their supply chains. Although the quality of products provided by suppliers has a direct impact on the quality of the final product, both academic literature and industry practitioners lack a comprehensive tool which can wholly address quality issues in the supply chain. This dovetails into the importance of analytics, and the role Big Data could play in ensuring quality in supply chain management.
1.2 Big Data’s Prospective Role
Business data analytics allows businesses to revolutionize their supply chains to be more customer-focused, agile and responsive, with significant advantages in forecasting, predictability, visibility and quality assurance (Khan, 2013; Wamba and Akter, 2015; Biswas and Sen, 2016; Harrison and Rowe, 2017). Moreover, with the emergence of big data and the ubiquitous nature of structured and unstructured data across the supply chain, the implementation of big data analytics, despite being sparse in real-world application, presents significant prospects for improving quality assurance in supply chains and enriching the information available for decision making.
Big data has been described (Fosso Wamba et al., 2015; Akter et al., 2016) as “a holistic approach to manage, process and analyze the 5 data-related dimensions, volume, variety, velocity, veracity and value, in order to create actionable insights for sustained value delivery, measuring performance and establishing competitive advantages”. Essentially, big data analytics presents an opportunity to capitalize on the plethora of data available from channels such as social media, physical sensors, cloud computing and mobile technology to provide insights into product, consumer or business trends and characteristics, thus delivering extensive information for strategic planning, management and decision-making.
Furthermore, authors (Biswas and Sen, 2016; Moyne and Iskandar, 2017) have elaborated the prospects of this technology with particular reference to the industrial shift towards Industrie 4.0 – a global manufacturing technology paradigm which utilizes various innovative technologies, including cognitive computing and the internet of things, to facilitate smart manufacturing in which all objects across the supply chain act as sources of structured and unstructured data. Consequently, the prospects for big data’s role in quality assurance within the supply chain, such as fault detection and fault classification (Moyne and Iskandar, 2017), require extensive theoretical conceptualization and research, which is currently lacking in existing literature. Accordingly, the premise for this study is based on the prospect of big data in ensuring quality in supply chain management and on addressing the existing gap in academic literature, which can form the theoretical backdrop for real-world applications.
1.3 Total Quality Management in Supply Chain Management
The Total Quality Management (TQM) framework identifies quality as a key component of business principles and, as asserted by Gunasekaran and Ngai (2004), plays a key role in improving the overall efficiency of supply chains. Over the years, researchers such as Juran and Gryna (1993), Deming (1986), Crosby (1979), Feigenbaum (1991) and Ishikawa (1985) have postulated various definitions of and approaches to TQM. Despite a lack of literature on the relationship between TQM and Big Data in supply chain management, TQM’s core principles highlight the prospects for, and pertinence of, Big Data in supply chain management. The TQM framework relies on eight core principles, which Chang (2009) outlines in relation to supply chain management:
- Customer focus: This relates to identifying the needs and expectations of end users and all stakeholders within the supply chain and ensuring that meeting these expectations is a priority.
- Leadership: This emphasizes the role of leadership’s buy-in into the quality assurance philosophy and their commitment in establishing corporate cultures, policies and strategic plans that ensure quality products and services.
- Stakeholder/employee involvement: This emphasizes an organizational culture which encourages the contribution, creativity and development of employees, resulting in increased employee efficacy, enthusiasm and job performance.
- Process management: This involves the evaluation and improvement of all activities within the supply chain, such as procurement, logistics, production, inventory, selling and service, and managing these processes efficiently to ensure synergy and consistent quality assurance.
- System management: This involves a holistic perception of the supply chain, encouraging integration and synergy across various systems and sub-systems.
- Continual improvement: This principle focuses on consistent efforts in improving processes and maintaining the quality standards of products and services.
- Mutually beneficial supplier relationships: This principle requires organizations to establish transparent and collaborative relationships with suppliers to ensure that the raw materials supplied meet the quality standards required to produce high-quality products and services.
- Factual approach to decision-making: This principle focuses on the generation and utilization of adequate and extensive data and information to inform decision making across the supply chain. Moreover, organizations must be able to capture data by utilizing the surfeit of existing technologies with the aim of identifying and rectifying faults in a timely fashion and establishing end-to-end quality assurance within the supply chain.
Considering the background discourse on big data presented hitherto, it can be argued that big data, which essentially utilizes a vast array of data to inform decision-making, can play a significant role in ensuring quality in the supply chain by supplementing the TQM framework. Research by Akter et al. (2016) provides a theoretical appraisal of existing literature on the application of Big Data to supply chain management, and Biswas and Sen (2016) conceptualized an architecture for Big Data’s application in supply chain management. Although these studies lend relevance and expand the literature on Big Data’s importance in facilitating various benefits for supply chains, none of them proposes a well-defined framework for incorporating Big Data into the total quality management framework or utilizing it for quality assurance within the supply chain. This assertion dovetails with the aims and objectives of this study and lends justification for it.
1.4 Aims and Objectives
Quality assurance plays an increasingly significant role in determining an enterprise’s competitive advantage and sustainability (Flynn and Flynn, 2005; Robinson and Malhotra, 2005; Matthews, 2006). Moreover, products with inherent quality issues tend to negatively impact the profitability of businesses through product recalls, redesigns and ensuing legal issues. However, existing studies on Big Data (Flynn and Flynn, 2005; Robinson and Malhotra, 2005; Matthews, 2006) have focused on logistics, product tracking and customer profiling, and few have investigated quality assurance and how big data, and its associated technologies, can ensure quality.
Considering the prospects of big data in quality assurance within the supply chain, the purpose of this study is to consolidate existing knowledge on big data’s application and present comprehensive insight into the prospective impact big data can have on supply chain quality assurance. Furthermore, this study intends to proffer a feasible framework for its application in real-world scenarios. This culminates in the main research question of this study: “What prospects does Big Data have in supply chain management to ensure quality assurance of products?” Consequently, the following objectives need to be addressed to meet the purpose of this study:
- Elucidate the current state of big data application in Quality Assurance in Supply chain management.
- Identify the areas of quality risk which big data can mitigate.
- Develop a framework for the application of Big data in quality assurance within the supply chain.
- Provide insight into the impact big data could have on quality assurance in the supply chain.
Considering the research aims and objectives of this study enumerated above, as well as the shortage of extensive literature on the implementation of Big Data in quality assurance frameworks in supply chain management, this study aims to utilize primary data analysis to address this purpose.
The role of quality assurance in supply chain management is a growing area of research, with studies highlighting the importance of maintaining satisfactory levels of quality across the value chain of any business (Mohammed and Al-musleh, no date; Noori, 2004; Lin et al., 2005; Zhang, Chang and Yu, 2006). In light of this, theories such as TQM and various works on big data form the crux of the appraised literature for this study and will be used to create a suitable theoretical framework, thereby addressing the third research objective.
Ultimately, this study aims to utilize and consolidate existing literature to establish a suitable framework and investigate current implementations of big data in TQM which can constitute the basis of future theoretical and empirical investigations for supply chain management.
Chapter 2 : LITERATURE REVIEW
To carry out thorough research on big data analytics, the fundamental role of data analytics in quality assurance needs to be understood clearly before studying the added value of big data analytics. This chapter presents a detailed literature review which expands the theoretical foundation relevant to the research background presented in the earlier chapter by establishing the essential knowledge for meeting the aim and objectives of this research.
This research aims to study the prospects of big data analytics for quality assurance in supply chains. The literature review was conducted to first study the application of data analytics in quality assurance, with special emphasis on total quality management and supply chain performance, and then to study the added value of big data over traditional data analytics. In this process, the key variables related to the value of big data analytics (the independent variables) and the key variables of the application of data analytics in quality assurance (the dependent variables) were derived. These directions are illustrated in Figure 2.1.
The initial structural construct is a draft model for further research that shows the independent and dependent variables, the moderators and mediators if any, and their initial theoretical interrelationships (Hair et al., 2016). After completing the data collection process, the construct can be analyzed through multivariable statistical methods to reduce it to a level of optimum validity at which, it can be accepted as the outcome of the research (Hair et al., 2016).
Figure 2.1 Directions for Literature Review and finalization of the Initial Structural Construct
2.2.1 Supply Chain Management (SCM)
A supply chain is a value network comprising the production plants, suppliers for the production plants, storage and warehousing partners, freight and forwarding partners, distribution partners, and retail/marketing/sales partners (Jacobs and Chase, 2014). Supply Chain Management (SCM) is a management discipline for managing the relationships, collaboration, communication, integration, processes, and tasks associated with all the partners contributing to the operations of an organisation’s logistics network, in order to achieve smooth operations, cost effectiveness, and high-quality customer deliveries. The key variables synchronised by a supply chain manager are quality, quantity, time, place, and money, the main goal being to have the right product, in the right quantity, at the right time, in the right place (Jacobs and Chase, 2014). The supply chain has three flows going through it: the product flow, which moves downstream to the customer; the monetary flow, which moves upstream towards the suppliers; and the information flow, which moves in various directions and is essential for integration (Christopher, 2011). In the modern world, SCM is a way of achieving competitive advantage over competitors by ensuring value for customers and managing the streams of value flowing to them. This is a major shift from a firm-centred setting, as it forms the value chain of an extended enterprise in which all partners are active players through collaborative relationships and agreements, with the main focus on aligning the value chain towards customer needs and demands (Christopher, 2011). Such an arrangement is made possible through strategic partnerships with suppliers and customers and appropriate collaboration and communication with them (Reid and Sanders, 2016).
A value chain with strategic partnerships ensures benefits for customers, manufacturers, and their suppliers in the form of reduction of costs, reduction of time-to-market, reduction of unnecessary inventory, better visibility into the demands, new innovative solutions suitable to customer demands, increasing sales, and increasing quality (Reid and Sanders, 2016). A supply chain designed as a value chain can have a long-term orientation with market dynamics and changing customer demands, and can align the visions of customers, manufacturers, and their suppliers (Reid and Sanders, 2016). Such a supply chain is totally driven by what customers want; now and in the future (Reid and Sanders, 2016).
The concept of the supply chain as a value chain has evolved into the model of the virtual supply chain and customer-driven virtual manufacturing, in contrast to the traditional concepts of push (mass production) and pull (demand pull) manufacturing (Márquez, 2010). Through this concept, manufacturers and their suppliers benefit from strategic integration with customers in such a way that demand forecasting is replaced by real-time demand visualisation (Márquez, 2010). Customer orders are not processed in batch mode; rather, individual customer orders are fed to the transactional databases as and when they happen (Márquez, 2010). Big data analytics, discussed later in this chapter and the focus of this research, is based on this concept. The emerging concept and technology of real-time data collection and analytics from the customers’ end and the suppliers’ end has the potential to change the way supply networks have traditionally been managed. This research has found significant relationships between total quality variables and systems of real-time data analytics from running processes at the suppliers’ end, with real-time visibility into supplies and their costs. The concept of total quality management is introduced in the next section.
Total quality management (TQM) is about an unbiased commitment to quality in everything that an organisation does (Johnson, Foley and Britain, 2002). TQM cannot be managed as a department, a function, or a project. It requires the absorption of a quality culture into an organisation as a way of life; it is a philosophy. The core concepts of TQM are the following:
(a) Continuous identification of quality gaps and making continuous improvements in every process, task, and activity in the organisation
(b) Empowering people to take quality-centric decisions and not getting carried away by the bureaucracies of power at the cost of quality assurance to customers
(c) Mutual trust and teamwork for identifying quality issues and making improvements
(d) Alignment of everything to the voice of the customer (needs, wants, and complaints)
The focus of TQM is on customer-focussed quality orientation, achieved by building quality as a parameter into every process, product, and service, through continuous improvements and strategic quality leadership, and by making quality the responsibility of every employee and not just of a quality management function (QMF) (Reed, Lemak and Mero, 2000; Sadikoglu, 2008; Reid and Sanders, 2016). Having said this, it needs to be emphasised that the role of a centralised QMF is not diluted by TQM. The QMF comprises specialised skills in translating customers’ quality requirements into design specifications, structuring the design cycles, optimising the design cycles in industrial and process engineering, ensuring sound quality control of products and services, conducting audits, managing compliance with standards such as ISO 9001, and documentation and records management (Condrea, Stanciu and Aivaz, 2012).
Data on customers’ requirements is the key input for TQM (Kuo, Wu and Deng, 2009). The perceived quality of products and services from the customers’ perspective drives customer satisfaction (Kuo, Wu and Deng, 2009). Hence, an organisation needs to identify the key quality parameters in every product design, process, and service that may influence customers’ quality perceptions (Sadikoglu, 2008; Kuo, Wu and Deng, 2009). To achieve TQM with customer orientation, an organisation needs the right tools and methods for making quality-driven decisions (Reid and Sanders, 2016). Data collection and statistical methods help in gaining deep insight into deviations from quality measures (Reid and Sanders, 2016). The key statistical techniques used are analysis of means, analysis of variances, probability distributions, statistical significance testing, and statistical inference testing (Montgomery, 2008). Statistical methods help in implementing the DMAIC process in TQM, defined as follows (Montgomery, 2008): defining the quality objectives, the key customer requirement specifications, and the quality performance measures that can reflect the quality of the requirement specifications; measuring by collecting data from the processes and systems and statistically evaluating them against the performance measures; analysing by exploring the key opportunities for improvement reflected in the outcomes; improving by implementing the identified opportunities for improvement; and, lastly, controlling by implementing controls for sustaining and improving the quality performance achieved.
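As a concrete illustration of the measuring and analysing steps described above, the control limits used in statistical process control can be computed with elementary statistics. The sketch below is a minimal illustration on hypothetical measurement data, not a full implementation of any of the cited methods:

```python
import statistics

def control_limits(baseline, sigma_level=3):
    """Centre line and control limits estimated from in-control baseline data."""
    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    return mean, mean - sigma_level * sd, mean + sigma_level * sd

# Hypothetical in-control measurements of one quality characteristic
baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 10.0, 9.9, 10.1, 10.0]
mean, lower, upper = control_limits(baseline)

# New measurements are flagged when they fall outside the limits (the
# "analyse" step): only the third reading here signals a deviation
new_batch = [10.0, 10.2, 12.5, 9.9]
flags = [x < lower or x > upper for x in new_batch]
```

In a TQM setting, flagged readings would then trigger the improving and controlling steps of DMAIC.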
The real-time data collection and analysis for TQM through big data analytics has been studied in this research. As the research is focussed on TQM in SCM, a brief introduction of TQM in SCM is presented in the next section followed by a detailed review and analysis of application of data analytics in the quality assurance of a supply chain.
2.3 Total Quality Management as a Tool in Supply Chain Management
In supply chains, the concept of total quality management (TQM) incorporates multiple domains: customer focus and relationships management, process management (both upstream and downstream), logistical systems management, inventory and procurement management, strategic supplier relationships, cost and revenue management, resources management, services quality management, environment protection, and quality leadership (Lin et al., 2005; Christopher, 2011; Foster, Wallin and Ogden, 2011; Talib, Rahman and Qureshi, 2011; Dellana and Kros, 2014).
TQM can ensure quality assurance in meeting the SCM objectives through internal and external process integration, internal and external information systems integration, data/information analysis, knowledge discovery and management, lean strategy in resources management, and continuous improvements through Six Sigma (Lin et al., 2005; Pepper and Spedding, 2010; Vanichchinchai and Igel, 2011). A generic framework of TQM in SCM is presented in Figure 2.2. The principles of lean and six-sigma are the backbone of Total Quality Management in SCM: they can be applied in all these areas, ensuring optimal utilisation of all available resources through the lean philosophy and continuous improvements in everything done within the SCM scope through the six-sigma philosophy (Pepper and Spedding, 2010).
Both lean and six-sigma require a measurement system to be in place for running pre-designed performance tests on sample data collected from the transaction systems operating the processes and management tasks, such as the online transaction processing databases (Cherrafi et al., 2016). The tests may be conducted using statistical analysis and the plotting of probability distribution curves to judge the confidence level in the test results (C. and H., 2008; Cherrafi et al., 2016).
Figure 2.2 Generic Framework of TQM in SCM (Lin et al., 2005; Christopher and Holweg, 2011; Foster, Wallin and Ogden, 2011; Talib, Rahman and Qureshi, 2011; Vanichchinchai and Igel, 2011; Dellana and Kros, 2014)
The six-sigma philosophy is used by companies to detect and eliminate defects at large scales in the running processes and tasks (Kumar et al., 2008). This philosophy can be used optimally by implementing automated data collection and continuous analysis systems for quality and performance measurements (Snee, 2010). The result is improvement of processes and tasks through reduction of defects and wastes at every level of sigma whereby sigma is the standard deviation from the measurement mean on a normal distribution curve (C. and H., 2008). The scope of TQM in SCM presented in Figure 2.2 requires these systems for continuous data collection, continuous measurements, and continuous improvements in every area. The TQM framework implemented through the path of lean six-sigma involves the stages of defining, measuring, analyzing, improving, and controlling, as presented in Figure 2.3:
Figure 2.3 Stages of Lean Six-Sigma (Thomas, Barton and Chuke‐Okafor, 2008).
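The relationship between sigma levels and defect rates described above can be made concrete. The sketch below computes the two-sided tail probability of a normal distribution beyond ±k standard deviations; note that the conventional six-sigma figure of 3.4 defects per million assumes a 1.5-sigma process shift, which this unshifted illustration deliberately omits:

```python
import math

def defect_rate_ppm(sigma_level):
    """Parts per million falling outside +/- sigma_level standard deviations
    of an unshifted normal distribution (erfc gives the two-sided tail)."""
    return math.erfc(sigma_level / math.sqrt(2)) * 1_000_000

# At 3 sigma roughly 2,700 ppm fall outside the limits; at 6 sigma almost none
for k in (1, 2, 3, 6):
    print(f"{k} sigma: {defect_rate_ppm(k):.4f} ppm")
```

Each additional sigma of process capability thus reduces defects by orders of magnitude, which is the quantitative motivation behind the continuous-improvement stages in Figure 2.3.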
Lean six-sigma requires first defining the value stream and then collecting data through a continuous-flow system. The key quality and performance objectives for SCM achievable through this system of defining the value stream, automated data collection, continuous analysis, and perfection through continuous improvements are discussed below.
(a) Assurance of lead times for supply (Christopher and Holweg, 2011; Cho et al., 2012; Martin, 2014): Lead times for supplies should be managed in such a way that there is never a stock-out situation at the demand-fulfilment end. Further, lead times need to be managed so that there is minimal inventory wastage whenever demand fluctuates, meaning that lead times should be flexible in line with variations in demand. Hence, there should be a continuous flow of demand-related data upstream to make demand forecasting highly accurate. Six-sigma can be a very useful tool for improving lead-time assurance in small steps.
(b) Quality assurance of the customers’ deliveries (Christopher and Holweg, 2011; Cho et al., 2012; Martin, 2014): Quality assurance in deliveries made to customers requires maturity in the processes interfacing with the customers. Examples are: order entry, order lead time, order processing path, capacity and its utilization, inventory control processes, customer relationship management, delivery flexibility and costing management, accuracy of forecasting, and IT systems enablement and management. There needs to be a continuous flow of data from the customer-facing processes to the analytics engines. Six-sigma can be used to improve quality assurance to the customers in small enhancement steps.
(c) Reduction of process inefficiencies (Arnheiter and Maleyeff, 2005; Christopher and Holweg, 2011; Cho et al., 2012; Martin, 2014): A process should efficiently meet its objectives and goals. Process inefficiency is a percentage measure of how much a process has missed its objectives and goals over a period. A continuous flow of data from the running processes to the analytics engines is needed, and six-sigma can be used to reduce process inefficiencies in small enhancement steps. Combined with the lean philosophy, inefficiencies can be reduced through optimal utilization of existing resources and capacities via cross-skilling and cross-purpose facilities, working with small batch sizes, just-in-time supplies, automated inspections, reduced work-in-progress inventory, and highly efficient production systems, such as low setup times, faster and automated processing, and modular outputs.
(d) Reduced defects and errors in completing the supply chain tasks (Christopher and Holweg, 2011; Cho et al., 2012; Martin, 2014): This is the process of continuous detection and elimination of defects and errors through continuous data flow from the tasks’ logs and their analysis.
(e) Eliminating process steps or entire processes causing wastage of time and money (Lin et al., 2005; Christopher and Holweg, 2011; Martin, 2014): This variable is not merely about defects and errors but about the value addition of an entire process or of many steps within a process. The idea is to challenge the need for an entire process and the related allocation of resources and costs. If a process does not add value to the final quality assurance and performance outcomes, it is a waste. Examples of waste processes are: over-production from production not aligned effectively with demand, unnecessary transportation and movement of resources/products/inventories/people, over-processing of products, unnecessary waiting times, unnecessary holding of inventory, and all activities and processes triggered by product rejections and reworking instructions issued by customers. Waste processes can be eliminated through continuous data collection from processes and continuous performance measurements.
(f) Quality assurance by suppliers and performance of suppliers (Lin et al., 2005; Christopher and Holweg, 2011; Talib, Rahman and Qureshi, 2011; Vanichchinchai and Igel, 2011): This variable requires effective relationship management, collaboration, and control of procurement and suppliers’ performance. A system of data collection from procurement and suppliers’ deliveries needs to be in place for continuous monitoring and measurement of suppliers’ performance. Separate quality assurance policies should be applied to the suppliers. The actors in scope are suppliers, third-party logistics providers, outsourced production companies, subcontractors, and job workers.
(g) Improved competitiveness of the supply chain (Lin et al., 2005; Trkman et al., 2010; Christopher and Holweg, 2011; Parmigiani, Klassen and Russo, 2011; Talib, Rahman and Qureshi, 2011): Competitiveness of supply chains is enhanced through market-oriented configuration changes, capacity and capability enhancements, lean operations, flexibility, agility, ensuring social and environmental performance, and enhancements of controls. For a sustained competitive edge, the supply chain should undergo continuous enhancements in relational, technical, and stakeholder performance. In this context, external data flows are needed for measuring the attributes related to supply chain competitiveness.
(h) Forecasting accuracy (Trkman et al., 2010; Christopher and Holweg, 2011): The supply chain manager needs to conduct forecasting of supplies, demands, and market trends for planning the Supply Chain Management strategy. The forecasting accuracy can be enhanced through continuous flow of data upstream and downstream feeding to the mathematical, statistical, or advanced planning and artificial intelligence tools.
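The forecasting-accuracy objective in (h) can be illustrated with the simplest possible method. The sketch below uses a moving-average forecast on a hypothetical demand series and measures accuracy with the mean absolute percentage error (MAPE); it is only an illustrative stand-in for the statistical and artificial intelligence tools mentioned above:

```python
def moving_average_forecast(history, window=3):
    """Forecast the next period as the mean of the last `window` observations."""
    return sum(history[-window:]) / window

def mape(actuals, forecasts):
    """Mean absolute percentage error, a common forecast-accuracy measure."""
    return 100 * sum(abs(a - f) / a for a, f in zip(actuals, forecasts)) / len(actuals)

# Hypothetical weekly demand; forecast each week from the preceding three
demand = [100, 104, 98, 102, 101, 99, 103]
actuals, forecasts = [], []
for t in range(3, len(demand)):
    forecasts.append(moving_average_forecast(demand[:t]))
    actuals.append(demand[t])
error = mape(actuals, forecasts)
```

A continuous flow of demand data upstream shortens the interval between observations and forecasts, which is precisely how the accuracy of such models improves.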
The above analysis affirms the value of continuous data flows, analysis of data streams, and continuous performance measurements. However, are traditional data analytics systems capable of ensuring these? Traditional data marts and data warehouses are not capable of continuous data collection because the processes of data extraction from transactional systems, data transformation, and data loading are manual and executed periodically. This limitation is solved by big data analytics and the associated advancements in analysis techniques with the induction of artificial intelligence. The next section presents a review of the added value of big data analytics over traditional data analytics methods.
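The difference between periodic batch loading and continuous measurement can be illustrated with an online algorithm. The sketch below uses Welford's method to keep a quality metric's mean and variance current after every single observation, with no batch reload; the sensor readings are hypothetical:

```python
class RunningStats:
    """Welford's online algorithm: mean and variance are updated per
    observation, so quality metrics stay current without batch ETL reloads."""
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self._m2 = 0.0

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self._m2 += delta * (x - self.mean)

    @property
    def variance(self):
        """Sample variance of everything observed so far."""
        return self._m2 / (self.n - 1) if self.n > 1 else 0.0

stats = RunningStats()
for reading in [10.1, 9.9, 10.0, 10.2, 9.8]:  # readings arriving as a stream
    stats.update(reading)
```

Stream-processing engines apply this kind of incremental computation at scale, which is what allows big data platforms to replace periodic extraction-transformation-loading cycles with continuous measurement.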
2.4 BIG DATA ANALYTICS
Big data is not only about data flows from transactional systems but also encompasses data flows from a variety of other systems needed for advanced analytics, such as market databases, communications databases, streams of e-mails and chats, scientific databases, running industrial and supply chain equipment, social media interactions, images, audio, video, wireless sensors, radio frequency identification (RFID), and the Internet of Things (IoT) (Maier, 2013; Sagiroglu and Sinanc, 2013). The data collected may be unstructured, semi-structured, or fully structured, and the velocity of data may be real time or near real time (Sagiroglu and Sinanc, 2013; Elgendy and Elragal, 2014). Data volumes have reached as high as exabytes (one million terabytes) and even zettabytes (one billion terabytes) (Sagiroglu and Sinanc, 2013; Elgendy and Elragal, 2014).
Big data analytics is characterised by five Vs: the massive Volume of data collected and analysed; the Variety of sources and of structured and unstructured formats; the Velocity at which data is collected and analysed; the Veracity of the data, that is, its quality, availability, and accuracy in representing the facts; and its Value, that is, its relevance to the discipline that depends on it (Zicari, 2014).
Big data analytics is a system of collecting and analysing massive-scale data from a variety of sources for collective representation of knowledge (Jeble et al., 2018). For effective TQM in SCM, there is a need to transition from the existing batch data collection and analysis systems: data collection needs to take the form of continuous streams, and the analytics need to be automated, intelligent, and advanced. Big data analytics serves this requirement to a good extent.
2.5 APPLICATION OF DATA ANALYTICS IN QUALITY ASSURANCE OF SUPPLY CHAINS
Data analytics, evaluation, and interpretation form the final stage of the process of knowledge discovery (Maimon and Rokach, 2010). Data analytics and presentation of final results are conducted after building the data marts or data warehouses by following the pre-processing, cleansing, transformation, and loading of transactional data (Maimon and Rokach, 2010). The analytics techniques may involve advanced statistical analysis, neural network analysis, support vector machine (SVM) analysis, Bayesian network analysis, decision tree formation, survival analysis, and instance- or scenario-based analysis (Maimon and Rokach, 2010; Han, Pei and Kamber, 2011; Tuffery, 2011).
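To make one of these techniques concrete, the minimal sketch below builds a one-level decision tree (a "decision stump"): it picks the lead-time threshold that best separates accepted from rejected supplier deliveries, scored by Gini impurity. The data and the rejection scenario are hypothetical; a production analysis would grow a full tree with a dedicated library.

```python
# Illustrative sketch (hypothetical data): decision tree formation in
# miniature, choosing the best single split by weighted Gini impurity.

def gini(labels):
    """Gini impurity of a list of 0/1 class labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)

def best_split(xs, ys):
    """Return (threshold, weighted_gini) of the best split on feature xs."""
    best = (None, float("inf"))
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if score < best[1]:
            best = (t, score)
    return best

# Hypothetical supplier lead times (days) and outcomes (1 = rejected)
lead_times = [2, 3, 3, 4, 7, 8, 9, 10]
rejected   = [0, 0, 0, 0, 1, 1, 1, 1]
threshold, impurity = best_split(lead_times, rejected)
print(f"split at <= {threshold} days, weighted Gini {impurity:.2f}")
```

Here the split at four days separates the classes perfectly (weighted Gini of zero); a full decision tree repeats this search recursively on each branch.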
The primary objective of data analytics is to support data and knowledge-driven decision-making (Maimon and Rokach, 2010). This entire decision-support framework is used for integrating multimedia data sources of voice, data, and video (Zhang and Zhang, 2010), monitoring of patients and prediction of diseases (Lavrač and Zupan, 2010), predictive analysis from information patterns (Singh, 2010), forecasting and predictions in the financial world (Kovalerchuk and Vityaev, 2010), customer relationship management (Thearling, 2010), target marketing (Levin and Zahavi, 2010), prediction of intrusions (Singhal and Jajodia, 2010), and several other applications.
The applications reviewed in this section are focused on business and supply chain quality and performance as their interrelation is established by existing studies (Li et al., 2006; Cao and Zhang, 2011; Mutuerandu and Iravo, 2014). Business quality and performance requires financial, market, customer relationship, and supply chain predictions and forecasting (Ngai, Xiu and Chau, 2009; Kovalerchuk and Vityaev, 2010; Levin and Zahavi, 2010; Thearling, 2010). Financial predictions and forecasting require intelligent analysis of multidimensional time series data and patterns of the financial variables reflecting the financial health and performance of the business (Kovalerchuk and Vityaev, 2010).
Market predictions and forecasting provide insight into the dynamics of the target marketplaces, like changes in customer preferences and buying behaviors, changes in competitive scenarios, changes in prices, changes in market segmentation and clustering, and changes in products and services offered (Levin and Zahavi, 2010). Predictions and forecasting in Customer Relationship Management (CRM) provide insight into the dynamics of customer identification, attraction, development, retention, and losses (Ngai, Xiu and Chau, 2009; Thearling, 2010). Predictions and forecasting in target markets and customer relationship management (CRM) help in improving the effectiveness of campaigns and promotions (Thearling, 2010).
Knowledge is uninteresting for businesses if it is not actionable (Luo et al., 2008). For knowledge to become interesting for businesses, it needs to be matched closely with the objectives and the subjective perspectives of the business processes (Luo et al., 2008). Hence, the domain and background of the business need to be matched closely with the day-to-day data analytics and presentation (Luo et al., 2008).
The big data mining and warehousing processes of data extraction, transformation, and loading are fully automated (Sagiroglu and Sinanc, 2013), although the challenges of technical accuracy and reliability are still part of ongoing research on big data analytics (Jagadish et al., 2014). The MapReduce data analytics tool breaks the analysis job into multiple small pieces that are processed simultaneously, and their outcomes are integrated later (O’Leary, 2013; Elgendy and Elragal, 2014). Oracle’s big data architecture uses Hadoop, which comprises capabilities for extraction, storage, management, integration, processing, and advanced statistical and mathematical analytics with in-built artificial intelligence (Helen and Peter, 2012; O’Leary, 2013).
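The split-map-integrate pattern described above can be sketched in miniature: each chunk of records is mapped independently (in parallel on a real cluster), the emitted key-value pairs are shuffled by key, and a reduce step aggregates them. The defect-code data below is hypothetical, and the sequential simulation stands in for a distributed runtime.

```python
# A toy, single-process simulation of the MapReduce pattern:
# map each chunk, shuffle pairs by key, then reduce per key.
from collections import defaultdict
from itertools import chain

def map_phase(chunk):
    # Emit (key, 1) pairs, e.g. counting defect codes in one shipment chunk
    return [(code, 1) for code in chunk]

def shuffle(pairs):
    # Group all emitted values under their key
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Aggregate the grouped values; here, a simple sum per key
    return {key: sum(values) for key, values in groups.items()}

chunks = [["scratch", "dent"], ["dent", "dent"], ["scratch"]]
mapped = chain.from_iterable(map_phase(c) for c in chunks)
result = reduce_phase(shuffle(mapped))
print(result)  # counts aggregated across all chunks
```

In Hadoop, the chunks would live on distributed storage and the map tasks would run on separate nodes, but the programming model is the same.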
While some research studies have glorified big data analytics, others are cautious in setting expectations (Jagadish et al., 2014; Kwon, Lee and Shin, 2014). Jagadish et al. (2014) discussed many technical challenges that remain open in ongoing research on big data analytics. These range from the heterogeneous nature of the data, which conflicts with the homogeneous nature of data analysis programs that require data to be structured before processing, to data sizes increasing at an exponential rate while improvements in processor technology lag behind, posing challenges to timeliness and storage.
Hogan and Shepherd (2015) raised the popular challenge of data privacy, concerning the thin line between personal data privacy and security. They also deliberated the challenge of data ownership with regard to digital copyright. Kwon, Lee and Shin (2014) cautioned about data overloading caused by the accumulation of useless bytes and suggested a model for employing big data analytics in business, which advocates careful planning of the perceived benefits of data usage in both internal and external capacities, the consistency and completeness of data, and the facilitation and costs of the resources needed. Hazen et al. (2014) added the further dimensions of accuracy and timeliness of data sources for using big data analytics in business.
Keeping expectations within feasible boundaries and carefully selecting the sources of data, big data analytics offers clear added value over traditional data analytics in ensuring SCM quality assurance and performance. As studied in the SCM TQM model, the role of data analytics is to conduct predictions for advance planning and forecasting in multiple operational areas of the supply chain. In SCM, the key applications of big data analytics are real-time and accurate data mining and predictive analytics, multidimensional data visualization and analytics from running processes, time series forecasting and econometrics, market and competitive intelligence, supply chain modelling, operations research, and total quality assurance (Waller and Fawcett, 2013; Ittmann, 2015; Schoenherr and Speier‐Pero, 2015).
Big data analytics can also ensure real time monitoring and analysis of costs of processes and their activities, and management of risks to the committed objectives through real time insight into supply chain dynamics (Schoenherr and Speier‐Pero, 2015). Real time insight into the dynamics of all the echelons of a supply chain can be achieved through continuous data feeds from the running IT systems executing the tasks and the equipment enabled with sensors and correlating with the linked data in existing batch updated database systems (Demirkan and Delen, 2013; Robak, Franczyk and Robak, 2013). The decision support system (DSS) formed is service-oriented with real time data, information, and analytics available on demand (Demirkan and Delen, 2013).
With the background knowledge of TQM in SCM and role of big data analytics in SCM, a critical discussion on the possible role of big data analytics in performance measurements and quality assurance of supply chains is presented in the next section.
2.6 Big Data Analytics in Performance Measurements and Quality Assurance of Supply Chains
The traditional solutions of data marts and data warehouses, together with the data analysis systems associated with them, are not as efficient as big data analytics. When dedicated teams perform data collection or extraction, cleaning, transformation, loading, and analysis, quality assurance operates in batch mode with delays and lags, and hence cannot be fully efficient and reliable. The system of data collection and analysis needs to be real time and automated.
From the review in Section 2.3, it is evident that TQM has significant scope in SCM. The outcomes affect all the echelons, from the suppliers upstream to the end customers downstream. Numerous processes may run at the echelons, between the echelons, and cutting across multiple or all of the echelons. A supply chain manager will have little visibility into the ongoing performance and quality of these processes without a continuous flow of data and automated analytics.
Big data analytics can ensure automated, real-time data capture, automated data structuring and presentation, and automated data analytics for service-oriented, on-demand decision support. This system can be very useful for the lean six-sigma quality system reviewed in Section 2.3: defining the value stream, automating data collection, analysing continuously, and pursuing perfection through continuous improvement of the SCM performance variables. Market and supply chain dynamics will then not affect the accuracy, timeliness, and reliability of decisions made. Further, decision-making may be delegated to officials down the hierarchy, except in specific cases requiring manual interpretation and analysis.
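A small worked example of the six-sigma-style monitoring described above: computing three-sigma control limits for a process metric, against which real-time readings could be checked continuously instead of in periodic batch reports. The order-fulfilment times below are hypothetical sample data.

```python
# Illustrative sketch (hypothetical data): three-sigma control limits
# for a supply chain process metric, as used in six-sigma monitoring.
from statistics import mean, stdev

times = [24.1, 23.8, 24.5, 24.0, 23.9, 24.3, 24.2, 23.7]  # fulfilment hours
centre = mean(times)
sigma = stdev(times)
ucl = centre + 3 * sigma   # upper control limit
lcl = centre - 3 * sigma   # lower control limit

# Any reading outside the limits signals an out-of-control process
out_of_control = [t for t in times if not (lcl <= t <= ucl)]
print(f"UCL={ucl:.2f}, LCL={lcl:.2f}, violations={len(out_of_control)}")
```

With continuous data feeds, the limits can be recomputed on a rolling basis and each incoming reading tested against them as it arrives, which is the real-time behaviour the batch-mode systems lack.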
Lean six-sigma is quite difficult to manage with batch-mode data availability subject to unavoidable delays. Such a quality system will be effective only when there is little change in the market and the supply chain echelons. In highly dynamic industries, it may fail to meet its objectives of continuous improvement and waste reduction. Customer satisfaction may also fluctuate as customer expectations vary. Hence, real-time data collection and analysis may be the only route to effective quality assurance through lean six-sigma philosophies and to meeting all TQM goals. Based on the reviews and exploration of variables in Sections 2.3 and 2.4, the theoretical framework for the primary research is finalized and presented in the next section.