Finance firms must ensure they fix data fundamentals
By Wayne Parslow, Executive Vice President for EMEA, Validity
Faced with more legislation than most other industries, finance companies can find the various customer data management requirements overwhelming. From customer data storage rules like GDPR to payment-specific laws like PSD2 and PCI-DSS, regulatory pressures span the entire financial customer relationship management chain end-to-end. In December alone, the ICO issued two fines in the finance sector, to OSL Financial Consultancy Limited and Pownall Marketing Limited, for the inappropriate use of personal data. Keeping data accurate and up to date is undoubtedly a key regulatory goal for finance businesses and finance teams. Failing to achieve it could not only expose them to hefty fines, but also erode customer trust and ultimately harm their brand.
The chaos of this year’s pandemic will have thrown the data management of many financial organisations into disarray, putting function ahead of best practice in order to keep businesses running as normally as possible. However, if financial firms and finance teams stick to two key data fundamentals – data quality and data governance – they will be in a better position to navigate the upcoming uncertainty of the post-pandemic world, especially given the remaining question marks over how Brexit will affect specific data regulations for the UK.
Quality and governance are the core data foundations on which any finance team must be built, each working in tandem to support the other. Not only does this ensure regulations are adhered to; the value of well-managed data for driving successful business outcomes is well established. For example, businesses that embrace a data-driven strategy are growing 30% year-over-year. It’s also an established principle that higher data quality delivers stronger, more engaged customer relationships. Furthermore, only a year after GDPR came into effect, email marketers reported a whole host of benefits, including an uplift against all major KPIs according to the DMA’s Marketer Email Tracker report: increased deliverability (67%), open rates (74%), click-through rates (75%) and conversion rates (67%).
Data quality is especially crucial for financial firms given the regulatory landscape and the different types of customer data their clients provide. It must also be continuously maintained: this is not a box-ticking exercise, but a complex network of processes and ongoing actions.
The first priority is to understand the current state of your data. With so much change taking place over the past year, organisations must reassess their data for accuracy, completeness, duplicates and inconsistencies. Profiling the data in this way allows financial organisations to ensure it is housed in the right location and in a healthy state that matches the business’s current needs – for example, that it can be easily analysed and reported on – alongside the simple yet often forgotten check of whether the data is in fact up to date.
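A basic profiling pass of this kind can be sketched in a few lines of Python. The record fields (`email`, `postcode`, `last_verified`) and the staleness cut-off below are purely illustrative, not a prescribed schema:

```python
from collections import Counter
from datetime import date

# Hypothetical sample of customer records; field names are illustrative only.
records = [
    {"email": "a@example.com", "postcode": "SW1A 1AA", "last_verified": date(2020, 11, 2)},
    {"email": "a@example.com", "postcode": "SW1A 1AA", "last_verified": date(2020, 11, 2)},
    {"email": "b@example.com", "postcode": "",          "last_verified": date(2018, 3, 15)},
]

def profile(rows, stale_after=date(2019, 1, 1)):
    """Return simple counts for completeness, duplicates and staleness."""
    emails = Counter(r["email"] for r in rows)
    return {
        "total": len(rows),
        "duplicates": sum(c - 1 for c in emails.values()),   # extra copies per email
        "incomplete": sum(1 for r in rows if not all(r.values())),
        "stale": sum(1 for r in rows if r["last_verified"] < stale_after),
    }

print(profile(records))
# {'total': 3, 'duplicates': 1, 'incomplete': 1, 'stale': 1}
```

Even a rough summary like this gives a baseline against which accuracy, completeness and freshness can be tracked over time.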
With remote working having become the norm for most organisations over the past year, many businesses have provided security training covering the additional risks. These include using a VPN where possible, securing your home router with a strong password, not downloading personal data to a home laptop, and more. Many breaches reported to the ICO come down to simple incidents like these – someone downloads a confidential spreadsheet to a laptop that is then lost or stolen, for example. Standardisation of data is therefore more necessary than ever to allow data to move in the right way through the organisation, regardless of location. For example, if the finance team needs to see which deals are on track to close before the end of the fiscal year, or to produce reports on the outgoings of several teams across different markets, best-practice standards as simple as how titles and regions are entered mean those initiatives can be completed more easily and efficiently across the board.
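The title-and-region standardisation mentioned above might look something like this sketch. The lookup tables are hypothetical; in practice they would encode your own CRM conventions:

```python
# Illustrative lookup tables; real mappings would reflect your CRM's conventions.
TITLE_MAP = {"mr.": "Mr", "mister": "Mr", "ms": "Ms", "dr": "Dr", "dr.": "Dr"}
REGION_MAP = {"uk": "United Kingdom", "u.k.": "United Kingdom", "gb": "United Kingdom"}

def standardise(record):
    """Normalise free-text title and region fields to canonical forms."""
    out = dict(record)
    title = record.get("title", "").strip().lower()
    region = record.get("region", "").strip().lower()
    out["title"] = TITLE_MAP.get(title, title.capitalize())
    out["region"] = REGION_MAP.get(region, record.get("region", "").strip())
    return out

print(standardise({"title": "MR.", "region": "U.K."}))
# {'title': 'Mr', 'region': 'United Kingdom'}
```

Applying a pass like this at every entry point means reports grouped by title or region no longer fragment across spelling variants.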
Another barrier to great data for all organisations is duplicate data. With many regulations requiring companies to remove data under certain circumstances (e.g. after a set time or on contract termination), leaving duplicates behind poses a significant compliance threat as well as consent issues. To maintain a consistently complete view of your customer data, organisations must be proactive about continuous deduplication. With the right tools, it’s a simple yet effective process that can make a huge impact.
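A minimal deduplication pass of this kind, keeping the most recently updated copy of each record, could look like the following sketch (field names assumed purely for illustration):

```python
def deduplicate(rows, key="email"):
    """Keep one record per key, preferring the most recently updated copy."""
    best = {}
    for r in rows:
        k = r[key].strip().lower()  # normalise the match key before comparing
        if k not in best or r["updated"] > best[k]["updated"]:
            best[k] = r
    return list(best.values())

# Illustrative records; ISO-format date strings compare correctly as strings.
customers = [
    {"email": "Jo@example.com", "updated": "2020-01-05"},
    {"email": "jo@example.com", "updated": "2020-11-30"},
    {"email": "sam@example.com", "updated": "2020-06-01"},
]
print(len(deduplicate(customers)))  # 2
```

Real matching is usually fuzzier than an exact lowercased key, but even this simple rule removes the obvious duplicates that cause compliance and consent headaches.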
Enhancing your data
Once you have profiled your data, put standardisation processes in place and begun to deduplicate it, the data is ready to be verified. It’s not just prospect data that is worth verifying against external sources – provided, of course, that consent has been given for those sources to be used in this way – it’s equally valuable to do the same for information about your current clients. Adding context to existing information with other key data points drives better understanding of the companies you already do business with and enables you to serve them better. Enriching data in this way also ensures organisations get a better ROI from marketing and sales campaigns.
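As a rough sketch of what enrichment against an external source might look like – the firmographic feed and field names here are entirely hypothetical, and any real source would carry the consent caveats noted above:

```python
# Hypothetical external reference data (e.g. a licensed firmographic feed).
# In practice, use only sources your consent and licensing terms permit.
FIRMOGRAPHICS = {
    "acme ltd": {"sector": "Manufacturing", "employees": 250},
}

def enrich(record, reference=FIRMOGRAPHICS):
    """Attach extra firmographic fields when the company appears in the feed."""
    extra = reference.get(record.get("company", "").strip().lower(), {})
    return {**record, **extra}  # unmatched records pass through unchanged

print(enrich({"company": "Acme Ltd", "email": "buyer@acme.example"}))
```

The point is simply that enrichment is a merge, not an overwrite: existing fields survive, and unmatched records flow through untouched.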
Finally, continuous monitoring of the data is essential. Data is a constantly changing beast. A simple way to keep up with changes is to set up dashboards and alerts that track data quality automatically, which is key to ensuring your data remains healthy.
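Such automated tracking can be as simple as comparing current quality metrics against agreed thresholds and flagging breaches for an alert. The metric names and numbers below are illustrative only:

```python
def quality_alerts(metrics, thresholds):
    """Return the names of data-quality metrics that fall below their threshold."""
    return [name for name, value in metrics.items()
            if value < thresholds.get(name, 0)]

# Illustrative figures; real metrics would come from your profiling dashboards.
current = {"completeness": 0.91, "deliverability": 0.83, "freshness": 0.64}
limits = {"completeness": 0.95, "deliverability": 0.80, "freshness": 0.70}
print(quality_alerts(current, limits))  # ['completeness', 'freshness']
```

Wired into a scheduled job, a check like this turns data quality from a periodic audit into a continuously monitored signal.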
The good news is that if all the above processes are implemented correctly, tracking should be the easiest part. While it may seem like a serious undertaking, it doesn’t have to be with the right tools, and the rewards on offer are huge.
Adopting the right mentality
There’s no getting around it – a comprehensive cross-functional approach is needed to implement a successful data governance programme. For finance firms in particular, team members need to include subject matter experts who understand the various industry standards and regulations. Many finance organisations have an executive-level representative responsible for company-wide data management and related initiatives, such as a Chief Data Officer (CDO).
A key part of their data management remit should be to simplify processes where possible, with the help of the right tools and technologies for the organisation. There’s likely no single tool that will do everything a financial organisation needs, and every governance strategy should look different – drawn up specifically for your organisation. As with “privacy by design”, companies should also aim for “data quality by design”. In other words, the checks and processes that ensure top-quality data is entered and maintained become intrinsic, happening automatically without users needing to think about them. What works for every organisation, however, is giving end users fewer steps to remember and putting in place as much process automation as appropriate – both of which make teams far more likely to follow best data practices.
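One way to picture “data quality by design” is validation at the point of entry, so bad records never reach the CRM in the first place. A minimal sketch, with illustrative (not exhaustive) rules:

```python
import re

# Entry-time validation rules; these examples are illustrative, not exhaustive.
RULES = {
    "email": lambda v: re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v) is not None,
    "postcode": lambda v: bool(v.strip()),  # simply "not blank" for this sketch
}

def validate(record):
    """Return the list of fields that fail their entry-time rule."""
    return [field for field, rule in RULES.items()
            if not rule(record.get(field, ""))]

print(validate({"email": "not-an-email", "postcode": "SW1A 1AA"}))  # ['email']
```

Because the checks run automatically on every record, users never have to remember them – which is exactly the intrinsic, low-friction behaviour described above.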