Almost a quarter (23%) of all the data organisations hold is believed to be inaccurate, according to our Experian Global Data Management Research 2016.
39% of organisations have 50 or more databases of contact data, up from 10% in 2014.
When it comes to data quality, 2017 will be a challenging year, with organisations across the board striving to address the challenges of open data and the pressures of new regulations.
Accurate and effective execution – particularly in marketing – depends on high-quality data. Today’s consumers are more demanding than ever, yet 76% of businesses believe inaccurate information is undermining their ability to provide excellent customer experience.
More and more businesses will look to use open data as a differentiator, adding new attributes to derive insight and intelligence on top of their existing base.
For those that have already reached that stage, the next step is releasing insights openly to foster innovation, new partnerships and social benefits. However, in order to be useful, let alone effective, the quality of any data utilised, harvested or shared must be assured.
Open data publishers will be put under pressure over the next 12 months to work with their users to ensure information is of a high quality, maximising the ROI of preparing, assuring, and releasing it.
Brands need to consider the processes they have in place, especially with the requirements of the upcoming General Data Protection Regulation (GDPR). Best practice is essential, but there’s a huge industry opportunity there for the taking.
For example, I recently spoke to one client who continues to hold, and market to, lapsed customers and prospects. Why? Because it's easy and cheap when you have an in-house email capability, never mind the bounce-backs. Reviewing and changing their contact and data-relevance policies is deemed much harder, but now they are ready for change.
Marketers need to review the wider context of data, both internal and external to the business.
They must consider what insights they can safely share to improve efficiency, or help customers and partners.
Marketers could start looking at new sources of location or enhancement data to help with regulatory or business needs, such as risk, value creation or revenue-assurance.
Yet with consumer empowerment on the rise, and an increased focus on quality insights, data will most likely become smaller. My earlier example is a good case in point: that particular marketing database could be reduced by 75% once it has been standardised, pinned, de-duped and suppressed.
This may seem like a marketer's worst nightmare, but it should be seen as a positive: less storage, better return on marketing investment (ROMI), less noise and happier customers.
The requirements for opt-ins, the removal of duplicates, the creation of golden records, and the proactive management of data held, will result in more accurate and consistent customer relationship management capabilities.
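To make the consolidation steps above concrete, here is a minimal sketch in Python of how duplicate contact records might be standardised, merged into a single golden record, and suppressed where no opt-in exists. All field names and merge rules here are illustrative assumptions, not a prescribed implementation.

```python
# Illustrative sketch: standardise, de-duplicate, and suppress contact
# records, producing one "golden record" per email address.
# Field names ("name", "email", "opted_in") are hypothetical.

def standardise(record):
    """Normalise casing and whitespace so duplicates can be matched."""
    return {
        "name": record["name"].strip().title(),
        "email": record["email"].strip().lower(),
        "opted_in": record.get("opted_in", False),
    }

def consolidate(records):
    """Merge records sharing an email into one golden record,
    then suppress any contact without a recorded opt-in."""
    golden = {}
    for rec in map(standardise, records):
        key = rec["email"]
        if key in golden:
            # Merge rule (assumed): keep the longer name, OR the opt-ins.
            if len(rec["name"]) > len(golden[key]["name"]):
                golden[key]["name"] = rec["name"]
            golden[key]["opted_in"] |= rec["opted_in"]
        else:
            golden[key] = rec
    # Suppression: drop contacts who have not opted in.
    return [r for r in golden.values() if r["opted_in"]]

raw = [
    {"name": "jane doe", "email": "Jane@Example.com", "opted_in": True},
    {"name": "Jane Doe", "email": "jane@example.com"},
    {"name": "old prospect", "email": "lapsed@example.com"},  # no opt-in
]
clean = consolidate(raw)
print(clean)  # one golden record for Jane; the lapsed prospect is suppressed
```

In practice this logic would sit inside a data-quality tool or ETL pipeline rather than hand-written code, but the principle is the same: the database shrinks, and what remains is both accurate and lawfully held.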
Marketing needs to embrace this data evolution, not resist it, because good data is better than big data.
As organisations, we should be thinking about what we can do to serve consumers in areas where the web may not be at premium speeds, where mobile devices are their only access points, where English is not their first language, and where they may not have fixed addresses.
Thinking with this broader outlook could create unique opportunities in both existing and new markets previously thought unreachable. Even those 'tier two' markets, perhaps previously felt to be just too hard to reach, could be re-evaluated through a new data lens.
If an organisation can celebrate two things at the end of this year, they should be that its data and business are prepared for GDPR, and that it is using its data to contribute to a more open yet secure society.
Society is ready for this. All it takes is for organisations to be ready and capable of taking up the data challenge and leading the way.
The question for the willing is, how do you intend to proceed?
source – http://bit.ly/2q5Shok