Jon Fielding, Managing Director, EMEA, Apricorn, discusses some of the risks associated with third-party contractors in the manufacturing industry and the steps organisations should take to methodically improve the quality of their data.
The ongoing tragedy of the novel Coronavirus outbreak has underlined the fragility of many highly complex global manufacturing supply chains – a huge, multi-faceted issue that will require expertise and creative innovation from a broad range of sectors to resolve fully.
Across the UK, Europe and beyond, the first quarter of 2020 has been characterised by a range of unexpected shortages of parts and products, stemming from the rapidly changing conditions and constraints appearing across multi-party supply chains.
UK supply-chain issues in consumer retail (supermarket supplies of toilet tissue) and even healthcare (sanitising and protective equipment such as masks and goggles) immediately spring to mind.
Similar issues were highlighted during previous crises, such as the 2011 undersea earthquake and tsunami in Japan and the associated Fukushima reactor disaster. That catastrophe saw unprepared companies around the world – not just in Japan – suddenly forced to confront problems in quality control, customer service and other key components of a successful 21st-century business.
A relatively localised event set off a global ripple effect, damaging enterprises in many countries.
Building foundations of trust
A supply chain by definition has multiple moving parts. A firm can fail to meet market demand – even lose business – because just one link in the chain has failed. This can be down to a range of third-party issues, such as weak customer service, slow or incomplete disaster recovery, backup mishaps, inaccurate or missing information, or a data breach. Companies that plan in advance for potential supply-chain disruption, managing and mitigating these risks, are best placed to survive unexpected local events and even their global consequences.
Each link in the supply chain presents its own web of risks. Organisations must have assurance on every link in the chain: they must be able to trust the data and information they receive and work with across the supply chain. Without accurate data to hand, few companies can develop suitable business intelligence to truly optimise performance and service provision.
There can be huge financial penalties for failures in this area. In the UK, for example, a data breach can mean a fine of up to €20 million or 4% of the company’s total annual worldwide turnover in the previous financial year, whichever is higher.
Fines can apply to any failure to comply with the data protection principles, any infringement of the rights individuals legally hold, or any unlawful transfers of data to third countries, according to the UK’s Information Commissioner’s Office (ICO).
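For illustration only, the ‘whichever is higher’ rule works out as a simple comparison; the turnover figure in the short sketch below is hypothetical.

```python
# Illustrative calculation of the maximum GDPR-style fine:
# the greater of EUR 20 million or 4% of annual worldwide turnover.
# The turnover figure used below is hypothetical.

FIXED_CAP_EUR = 20_000_000   # EUR 20 million
TURNOVER_RATE = 0.04         # 4% of total annual worldwide turnover

def max_fine(annual_turnover_eur: float) -> float:
    """Return the upper bound of the fine, whichever figure is higher."""
    return max(FIXED_CAP_EUR, TURNOVER_RATE * annual_turnover_eur)

# A company turning over EUR 1.2 billion faces a cap of EUR 48 million,
# because 4% of turnover exceeds the EUR 20 million floor.
print(f"Maximum fine: EUR {max_fine(1_200_000_000):,.0f}")
```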
Forging solid informational links
All companies should therefore ensure they not only identify potential future risks, but also develop up-to-date, regularly revised systems, processes and contingency plans. This all sounds obvious, but it is far from easy to achieve in practice.
There is no one-click, plug-and-play response that will fully protect an organisation, not least because organisational situations vary. However, there are certain foundational building blocks of data quality and informational assurance that every firm should employ.
As mentioned, the first step in the cycle of achieving and maintaining data quality is evaluating and prioritising all potential risks, including those related to third parties. The data collected around this exercise should, of course, also be kept ‘clean’ (uncorrupted), regularly updated, backed up and secured.
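One common way to make that evaluation systematic – not the only one, and the categories, scales and suppliers below are purely illustrative – is a simple likelihood-times-impact score for each third-party link:

```python
# A minimal sketch of risk prioritisation: score each third-party link
# by likelihood x impact, then rank. The entries and 1-5 scales below
# are hypothetical illustrations.

risks = [
    {"link": "Component supplier A", "likelihood": 3, "impact": 5},
    {"link": "Logistics partner B",  "likelihood": 2, "impact": 4},
    {"link": "Data processor C",     "likelihood": 4, "impact": 4},
]

for r in risks:
    r["score"] = r["likelihood"] * r["impact"]

# Highest-scoring links are reviewed (and re-scored) first.
for r in sorted(risks, key=lambda r: r["score"], reverse=True):
    print(f"{r['link']}: {r['score']}")
```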
Formulating a workable, company-wide policy on data integrity and assurance, in light of the evaluated risks and priorities, becomes the next task. Such a policy should also take into account any differences that need to be applied to certain types or sources of information or data.
Whatever the policy, however, all data and information, in every format, should be encrypted and backed up, with sufficient redundancy built into the system that clean, up-to-date copies can be accessed quickly in the event of disaster, loss, or simply questions about business process or performance.
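As a minimal sketch of the encryption step alone – here using AES-256 in GCM mode via Python’s widely used cryptography package, with key management and the backup pipeline itself deliberately out of scope – this is the shape of the operation:

```python
# A minimal sketch of encrypting a backup blob with AES-256 in GCM mode,
# using the third-party 'cryptography' package (pip install cryptography).
# Key management (an HSM, KMS or vault) is deliberately out of scope.

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_backup(plaintext: bytes, key: bytes) -> bytes:
    """Encrypt a backup blob; prepend the random nonce so it can be decrypted."""
    nonce = os.urandom(12)                    # 96-bit nonce, unique per message
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    return nonce + ciphertext

def decrypt_backup(blob: bytes, key: bytes) -> bytes:
    """Recover the plaintext; raises InvalidTag if the blob was tampered with."""
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key = AESGCM.generate_key(bit_length=256)     # 256-bit AES key
blob = encrypt_backup(b"supplier master data, 2020-03-31", key)
assert decrypt_backup(blob, key) == b"supplier master data, 2020-03-31"
```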
This means incorporating all peripheral devices, too. It is only too easy to neglect strategy around portable devices, but in a world of remote working and mobile technology it becomes even more important to protect endpoints, including USB keys and drives, with 256-bit AES encryption as standard. These devices should be configured to meet central corporate policy, such as a minimum password length, and should offer functionality that allows secure device recovery by an authorised user in the event of a forgotten password. Ideally, configuration can be automated, applied in bulk and securely stored for later search and retrieval, easing the burden on the administration team.
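Hardware-encrypted drives of this kind are normally provisioned through vendor tooling, but the principle of checking a fleet against central policy can be sketched in a few lines; every policy field and device record below is a hypothetical illustration:

```python
# A sketch of checking endpoint device configurations against central
# policy before deployment. The policy fields and device records are
# hypothetical; real hardware-encrypted drives are configured through
# vendor provisioning tools.

POLICY = {
    "min_password_length": 12,
    "encryption": "AES-256",
    "admin_recovery_enabled": True,   # secure recovery by an authorised user
}

def violations(device: dict) -> list[str]:
    """Return a list of policy violations for one device (empty = compliant)."""
    problems = []
    if device["password_length"] < POLICY["min_password_length"]:
        problems.append("password too short")
    if device["encryption"] != POLICY["encryption"]:
        problems.append("wrong encryption standard")
    if not device["admin_recovery_enabled"]:
        problems.append("no authorised-user recovery path")
    return problems

fleet = [
    {"id": "usb-0017", "password_length": 14, "encryption": "AES-256", "admin_recovery_enabled": True},
    {"id": "usb-0042", "password_length": 8,  "encryption": "AES-256", "admin_recovery_enabled": False},
]

for device in fleet:
    issues = violations(device)
    print(device["id"], "OK" if not issues else "; ".join(issues))
```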
Data on embedded systems that play a role in automation and monitoring processes – ultimately helping to guarantee efficiency and performance – should also be considered in policy and practice, even if they are relatively ‘dumb’ pieces of hardware with little memory and no keyboard or screen.
In addition, flawed, corrupt or out-of-date data must be treated appropriately – disposed of, or stored separately so as not to contaminate analytic processes or decision-making efforts.
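A minimal sketch of that separation – with field names, validation rules and the 90-day freshness threshold all assumed for illustration – might look like this:

```python
# A sketch of separating flawed or stale records from clean ones so they
# cannot contaminate downstream analytics. Field names, the validation
# rules and the 90-day freshness threshold are illustrative assumptions.

from datetime import date, timedelta

MAX_AGE = timedelta(days=90)
TODAY = date(2020, 3, 31)

records = [
    {"part_no": "A-100", "qty": 250, "updated": date(2020, 3, 2)},
    {"part_no": "",      "qty": 40,  "updated": date(2020, 3, 15)},  # missing key field
    {"part_no": "C-777", "qty": -5,  "updated": date(2019, 6, 1)},   # invalid and stale
]

def is_clean(rec: dict) -> bool:
    return bool(rec["part_no"]) and rec["qty"] >= 0 and TODAY - rec["updated"] <= MAX_AGE

clean      = [r for r in records if is_clean(r)]
quarantine = [r for r in records if not is_clean(r)]  # held apart, not fed to analytics

print(f"{len(clean)} clean, {len(quarantine)} quarantined")
```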
Communicate and monitor to create a virtuous circle
Staff education, and clear lines of communication across teams and with relevant external partners at all times, are also important. This is not only about establishing and sustaining ownership of, and buy-in around, data policy, protection and practice; it also helps the organisation, at all levels, keep up with regulatory shifts, changing standards, compliance procedures and the like.
In addition, ongoing internal auditing and monitoring – both processes dependent on quality data from each link in the supply chain – are essential. Unfortunately, there is no point at which the diligent organisation can claim the task of collecting, analysing and protecting data is ‘finished’; the data quality cycle is a dynamic one, with characteristics that will themselves shift and flex.
Manufacturers must be able to quickly locate, reassess and prioritise data from different links in the chain ‘on the go’ as circumstances demand. Only this will inform truly dynamic, agile decision-making that benefits the bottom line.
However, taking the steps outlined above should enable any organisation, small or large, to methodically improve the quality of its data – offering a real chance not only to reduce third-party risk but also to boost business performance, through better business intelligence, over time.