Before the rise of the blockchain as a ubiquitous presence in the world of marketing (or any other world, for that matter), there was big data.
Big data collection in a virtual data room, big data management, big data analysis, big data-powered everything. Marketers have slowly been losing their minds, constantly worrying that they are not collecting enough data or not doing enough with the data pouring in at a crazy rate.
This is not to say that big data has no place in marketing; quite the contrary. However, marketers often get so swept up in it that they forget the marketing basics – market research, market segmentation, brand positioning and finding the right channels to market.
Now that the big data craze has eased off somewhat, most marketers are giving a collective sigh of relief. As they catch their breath, they once again see beyond big data.
One of the possibilities that is starting to come into focus is something not so different from big data – old data, or historical data, if you prefer. Marketers are once again beginning to realise that the abundance of data they have been collecting over the years holds far more potential than the fancier real-time data, which often lacks the quality needed to actually make real-time decisions.
Old data repositories
When we are discussing the major selling points of old data, we cannot ignore its sheer volume. Depending on the nature of your position as a marketer (in-house or agency) and the data collection methods you have used in the past, you will have a varying amount of data sources and repositories which will provide you with a varying volume of information.
At the very least, you will have historical data from Google Analytics and Google Search Console, and this data can be massively helpful.
For example, our agency was recently hired for a ‘blog revamp’ campaign by a client whose four-year-old blog had never really taken off and had actually been declining recently (not dramatically, but it had become a trend). We first took a look at their historic GA reports (as you do) and noticed some interesting trends in the traffic that had been coming to their blog.
We then spent an entire week closely analysing these trends (comparing them to historic Google Trends and other ‘old’ data) to discover the content that had (and hadn’t) worked for them.
We applied the insights we gained to guide our work on their blog and increased their blog traffic by 300% compared to the same period of the previous year. Just from old GA data.
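A year-over-year comparison like the one described above is simple to sketch in code. The snippet below works from a hypothetical monthly GA export; the column names and figures are illustrative assumptions, not the client's real data:

```python
# Sketch: year-over-year traffic comparison from an exported
# Google Analytics report (column names are hypothetical).
import pandas as pd

def yoy_change(df: pd.DataFrame) -> pd.DataFrame:
    """Compare each month's sessions to the same month a year earlier."""
    df = df.copy()
    df["month"] = pd.to_datetime(df["month"])
    df = df.sort_values("month").set_index("month")
    # With one row per month, 12 rows back is the year-ago figure.
    df["sessions_last_year"] = df["sessions"].shift(12)
    df["yoy_pct"] = (df["sessions"] / df["sessions_last_year"] - 1) * 100
    return df.dropna(subset=["yoy_pct"])

# Toy data: 13 consecutive months of blog sessions.
data = pd.DataFrame({
    "month": pd.date_range("2017-01-01", periods=13, freq="MS"),
    "sessions": [1000] * 12 + [4000],  # last month is 4x the year-ago figure
})
result = yoy_change(data)
print(result["yoy_pct"].iloc[-1])  # 300.0
```

The same pattern scales to per-page or per-topic traffic, which is how you spot which content worked and which did not.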
Reducing the chances of error
One of the main reasons why old data can be extremely useful is the very fact that there is a lot of it and it spans a long time period.
If you notice something that could be a trend in the data you have collected over the last month, or in something you have been tracking in real time for the last week, the margin of error is large.
For example, let’s say that you made a specific change to your email marketing campaign five days ago. You are tracking the numbers and they are looking great. You are thinking that this change you made must be the reason and that all your subsequent email marketing campaigns should be modelled the same way.
Now, let’s say that you have the data from a dozen email marketing campaigns, various A/B tests and other experiments you have been doing over the years. You decide to go back and analyse all of those against each other and you discover the common thread that is present in all (or at least a vast majority) of them.
It doesn’t take a data expert to tell you that the second scenario is more likely to yield insights that will actually provide value for your future efforts. There are simply too many variables that can affect an email campaign that has only been running for five days.
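The second scenario – pooling years of campaign results and looking for the common thread – can be sketched in a few lines. Everything here (campaign names, the ‘send day’ attribute, the numbers) is an invented example, not real data:

```python
# Sketch: pooling historical email-campaign results to find a common
# thread (field names and the 'send_day' attribute are assumptions).
from statistics import mean

campaigns = [
    {"name": "c1", "send_day": "Tue", "sent": 1000, "opened": 300},
    {"name": "c2", "send_day": "Fri", "sent": 1000, "opened": 150},
    {"name": "c3", "send_day": "Tue", "sent": 2000, "opened": 640},
    {"name": "c4", "send_day": "Fri", "sent": 500,  "opened": 70},
    {"name": "c5", "send_day": "Tue", "sent": 1500, "opened": 480},
]

def open_rate_by(campaigns, attribute):
    """Average open rate for each value of the given attribute."""
    groups = {}
    for c in campaigns:
        groups.setdefault(c[attribute], []).append(c["opened"] / c["sent"])
    return {value: round(mean(rates), 3) for value, rates in groups.items()}

print(open_rate_by(campaigns, "send_day"))
# A pattern that holds across many campaigns (here, Tuesday sends
# outperforming Friday sends) is far harder to explain away than
# five days of numbers from a single campaign.
```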
Real-time data has its value, but it often needs maturity and time-tested consistency to become sufficiently concrete and actionable. As it matures and grows in volume, the chance of error in actions taken based on that data decreases.
The value of cross-referencing
Focusing on a single data source and data type can never provide as much insight as data collected from a number of sources, properly cross-referenced.
For example, the marketing department of a B2B company kicks off a new campaign based on a new segmentation model, and it seems to be working brilliantly. In the two weeks the campaign has been running, leads have poured in at an unprecedented rate. The new data seems to show that this new segmentation model is a winner.
Then, after three months, the sales department comes out with its own data, showing that their numbers are not that great. In fact, they are selling less. The customer service department comes in with its own data, with their own list of grievances pertaining to the customers attracted by the latest campaign.
All of a sudden, this latest marketing campaign doesn’t look so great. The data had been coming from just one source (marketing) and, as a result, it showed an incomplete picture.
With old data, this is far less likely to happen. As a marketer, you can cross-reference data coming from a number of different sources, compare trends and patterns more precisely, and more easily identify the false positives that are difficult to avoid with real-time or overly fresh data. This is especially important when you are not working in isolation on a siloed marketing project.
Of course, cross-referencing data comes with its own set of challenges such as standardising data that is available in differing formats, addressing data duplication and prioritising sources when the data does not match.
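All three of those challenges – standardising formats, removing duplicates and joining sources – show up even in a minimal sketch like the one below. The departments, column names and records are all hypothetical:

```python
# Sketch: cross-referencing marketing leads with sales outcomes
# (all column names and values are illustrative assumptions).
import pandas as pd

marketing = pd.DataFrame({
    "email": ["a@x.com", "b@x.com", "b@x.com", "c@x.com"],  # note the duplicate
    "campaign": ["new-seg", "new-seg", "new-seg", "old-seg"],
})
sales = pd.DataFrame({
    "Email": ["a@x.com", "c@x.com"],  # differing header casing
    "closed_won": [0, 1],
})

# Standardise headers, drop duplicate leads, then join the two sources.
sales = sales.rename(columns={"Email": "email"})
leads = marketing.drop_duplicates(subset="email")
merged = leads.merge(sales, on="email", how="left")
merged["closed_won"] = merged["closed_won"].fillna(0)

# Win rate per campaign: the 'winning' new segmentation model may
# look very different once sales outcomes are joined in.
print(merged.groupby("campaign")["closed_won"].mean())
```

Deciding which source to trust when the joined records disagree is the harder, organisational half of the problem; no amount of code settles that for you.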
The point of this article is not to malign big data or to say that you should stop finding ways to obtain new data; after all, without collecting it, it can never mature into high-quality historical data.
The point of this article is to remind marketers to look back at the data they already have at their disposal; look at it with fresh eyes and see what it tells them.
Have an opinion on this article? Please join in the discussion: the GMA is a community of data-driven marketers and YOUR opinion counts.