Potential: Revealed

Strategic Thinking, Innovative Ideas, Growth Marketing, and Revealing of Potential

Archive for data-driven

Big Data, Big Results

Recently a client of mine enthusiastically unveiled some fresh results from cross-sell marketing programs they were powering for a network of several dozen financial institutions (FIs).

Their enthusiasm was warranted: the results showed significantly better impression-to-purchase ratios than traditional methods. In fact, the improvement was more than threefold!

Big Data Marketing Results Are Better

This is an excellent example of the implementation of Big Data in a direct marketing application.

Traditional methods that an FI uses for targeting (when they use targeting at all; but that’s another story for another time) usually involve tabulating customer account data (e.g., who has a mortgage but doesn’t have a home equity line of credit?) and purchasing third-party data to append (e.g., demographic data based on the customer’s zip code).

I like to call this “Who You Are” data. It is valuable, but it is not rich and is often out of date. It is also often misleading, because all your account data can tell you is whether a customer has a home equity account with you … it tells you nothing about whether they have one with another FI.

What’s needed is Big Data: data that comes from many sources; that is rich and voluminous (e.g., heavily sourced from transaction systems such as debit and credit card processing, bill payments, money transfers, ACH debits and credits, loan processing, etc.); and that is handled by an analytic platform able to make sense of it and deliver it to a point of customer interaction or business decisioning, in real time if needed.

With Big Data you can add two key dimensions, which I call “How We Behave” (a predictor of future behavior) and “What We Can Do” (what the data says we can do based on our financial position and the trends in it). With these three dimensions, a marketer trying to efficiently target customers with the right cross-sell offer has the insights needed to deliver superior results.
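As a rough illustration of how the three dimensions might combine, here is a minimal sketch of a cross-sell score. The field names, weights, and thresholds are entirely hypothetical, not taken from any real FI system:

```python
# Hypothetical sketch: combining the three data dimensions into one
# cross-sell score. All fields and weights below are illustrative.

def cross_sell_score(customer):
    score = 0.0
    # "Who You Are": static account and demographic data
    if customer["has_mortgage"] and not customer["has_heloc"]:
        score += 1.0
    # "How We Behave": transaction-derived behavior, a predictor of future behavior
    if customer["home_improvement_purchases_90d"] > 3:
        score += 2.0
    # "What We Can Do": financial position and the trend in it
    if customer["avg_deposit_balance_trend"] > 0:
        score += 1.5
    return score

customer = {
    "has_mortgage": True,
    "has_heloc": False,
    "home_improvement_purchases_90d": 5,
    "avg_deposit_balance_trend": 0.04,
}
print(cross_sell_score(customer))  # 4.5
```

A real platform would learn these weights from data rather than hand-code them, but the shape of the calculation is the same: each dimension contributes evidence toward the offer decision.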

It is pretty cool! I’ll look forward to sharing more specific and detailed results when they are released soon.


From Insight To Action In Under A Minute

One of the toughest challenges most businesses, including and perhaps especially the many small financial institutions in the U.S., face is having all three key ingredients for being a successful marketer:

1. strategy and plans to use data to target their customers with relevant offers (to get from talking about it to doing it)

2. technology to create the targeting analytics and deliver the offers (to be able to make sense of the data and take action)

3. investment in skills and resources to sustain marketing efforts (direct marketing is largely a numbers game – you have to get beyond the one-time and piecemeal marketing that is highly ineffective)

I’ve certainly seen this with my own clients, and heard it from experts in direct and digital marketing and data analytics.

At FinovateFall 2012 this week, there continues to be an emerging set of companies providing ways for financial institutions to break out, particularly with the first two ingredients. But I believe there is just one company, Segmint, that is also tackling #3: taking the required skills and resources and putting them into a box, or more accurately behind OneButton. That makes it ultra easy to act on what the data is telling you and reach the right customer at the right time with the right offer, in real time, wherever a customer might be: online, mobile, or social.

So easy that a targeted FI marketing campaign can be selected and launched in less than a minute.

All the work is done for the marketer: as long as they have #1, then #2 and especially #3 are taken care of automatically by Segmint’s platform and solution.

See the post on Finovate’s blog from the Fall 2012 conference this week. I’ll also follow up in a week or so with the video of the presentation by Rob and Nate, which Finovate will make available.

Let me know what you think too!


Big Data Drives ‘Loyalty Trifecta’ for Banks

A good panel at this week’s Payments Connect 2012, including a client of mine, Rob Heiser, CEO of Segmint.

Check out the transcript of the panel discussion here; it is very informative.

Let me know what you think too!

Randy

Uplift Marketing

Recently we’ve been working on a simple framework for data-driven marketing (i.e., integrated, cross-channel, analytics-based, closed-loop):

Accumulate: • Accumulate data (multiple sources) • Integrate • Cleanse • Aggregate • Store

Synthesize: • Normalize • Match • Common Data Model • Single Customer View

Crunch: • Segment • Score • Peer Compare • Recommendations • Alerts

Publish & Execute: • Publish analytic outputs • Integrate to execution apps

Feedback, Improve & Repeat
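The stages above can be sketched as a simple pipeline. This is only an illustration of the flow; all of the stage logic is placeholder, standing in for what a real analytic platform would do:

```python
# Illustrative sketch of the framework stages as a data pipeline.
# Stage internals are placeholders for real platform capabilities.

def accumulate(sources):
    # Accumulate data from multiple sources; cleanse by dropping bad rows
    records = [r for src in sources for r in src]
    return [r for r in records if r.get("customer_id")]

def synthesize(records):
    # Normalize and match records into a single customer view
    customers = {}
    for r in records:
        customers.setdefault(r["customer_id"], []).append(r)
    return customers

def crunch(customers):
    # Segment/score each customer (placeholder score: transaction count)
    return {cid: len(recs) for cid, recs in customers.items()}

def publish(scores, threshold=2):
    # Publish analytic output to execution apps (here: a target list)
    return [cid for cid, s in scores.items() if s >= threshold]

sources = [
    [{"customer_id": "A"}, {"customer_id": "B"}, {"customer_id": None}],
    [{"customer_id": "A"}, {"customer_id": "A"}],
]
print(publish(crunch(synthesize(accumulate(sources)))))  # ['A']
```

The feedback loop is the part code can’t show: campaign results flow back in as new source data, and the cycle repeats.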

Early results have been impressive: based on what we’ve seen, the investment pays back in a mere few months.

There is more work to do, and much evangelizing, to propagate the approach and get everybody doing it. But as we suspected when we started, there is much potential that has been hidden, and it shows great promise of being revealed and realized.

Predictive Analytics: How it Works (#2)

In the first post about predictive analytics we learned about its essential building block: the predictor. This is a value calculated for each entity (say, a customer) whose actions or behaviors are to be predicted: for instance, the recency, in months, of a customer’s last purchase.
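As a small illustration (the function name and dates are made up), the recency predictor can be computed directly from a customer’s last purchase date:

```python
# Sketch: computing the recency predictor (months since last purchase).
from datetime import date

def recency_months(last_purchase: date, as_of: date) -> int:
    # Whole months elapsed between the last purchase and the as-of date
    return (as_of.year - last_purchase.year) * 12 + (as_of.month - last_purchase.month)

print(recency_months(date(2012, 3, 15), date(2012, 11, 1)))  # 8
```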

Prediction power is enhanced if you use more than one predictor at a time. In doing so you are creating a model, and models are the heart of predictive analytics. In this post I’ll discuss how you can find the “best” predictive model. I put “best” in quotes because, from a practical standpoint, unless you assume unlimited time and resources you may be best off simply finding a model that improves your results (e.g., reduces customer churn) over previous experience. Very powerful modeling software and well-trained, talented statisticians are available today, but the number of variables to consider in any predictive model (across demographics, transactions, and behaviors) can be extremely large, making determination of the truly “best” model cost prohibitive.

Fortunately, taking an incremental, continuous-improvement approach can yield solid results for almost any business, with the promise that results will improve over time. A common tool is the yield curve. For example, plot the results of a churn model with the amount of churn on the Y axis and the percentage of customers contacted in a retention campaign on the X axis. The curve decreases to a point — up to a certain percentage of the universe of customers contacted, attrition rates fall — but it bottoms out and then moves upward. Not all customers will respond to a retention campaign, and you are best off contacting only those predicted to respond well. Past that point, you are better off leaving the balance of the customers alone: either they are not likely to churn anyway, or the models say campaigns to retain them will be unsuccessful (and other methods may be needed, along with models that predict where those approaches will succeed, so effort can be tuned the same way).
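To make the idea concrete, here is a hedged sketch of finding the contact depth at which a retention campaign stops paying off. The probabilities, save value, and contact cost are invented for illustration:

```python
# Sketch: rank customers by modeled retention-response probability and
# contact them in order, stopping where expected value turns negative.
# All numbers below are illustrative, not real campaign data.

def best_contact_depth(response_probs, value_per_save=100.0, cost_per_contact=20.0):
    """Return how many customers (from the top of the ranking) to contact."""
    ranked = sorted(response_probs, reverse=True)
    depth = 0
    for p in ranked:
        # Expected value of this contact: chance of a save times its value,
        # minus the cost of making the contact
        if p * value_per_save - cost_per_contact <= 0:
            break
        depth += 1
    return depth

# Modeled probability that a retention offer saves each customer
probs = [0.9, 0.7, 0.5, 0.3, 0.15, 0.1, 0.05]
print(best_contact_depth(probs))  # 4
```

This is the yield curve in miniature: the top-ranked customers justify the contact cost, and past a certain depth the expected returns fall below it.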

Now, although the model does not work perfectly, the scoring and ranking of customers according to their likelihood to be retained provides clear guidance on how to invest in retention programs to yield the best results. It will prevent retention campaigns that are too aggressive (trying to retain those who are not likely to respond positively, or wasting effort on those who are likely to stay anyway).

There is a great deal more to predictive analytics than I’ve covered in these two posts. But I hope one message is clear: you can gain practical improvements in marketing results and at other customer touch points through analytics that need be neither complex (at least to start) nor perfect. Commitment, a willingness to experiment, and continuous improvement are what’s really required.

Thanks for reading and I’ll look forward to comments.

Data is not the plural of anecdote

Following the recent election season I heard a pundit comment that, as usual, winning candidates from both sides had mastered the art of successfully positioning themselves – and their opponents – through powerful use of anecdotes. They found anecdotes that resonated with the electorate and used them to either effectively portray themselves positively or their opponents negatively. The power came from repeating these anecdotes in speeches, campaign literature, political advertisements, and those much-hated automated campaign telephone calls such that people began to believe them simply because the repetition gave them an air of being factual.

Now, I looked up the definition of the word anecdote. An-ec-dote \ˈa-nik-ˌdōt\, noun, “short account of a particular incident or event of an interesting or amusing nature, often biographical.”

I wasn’t sure how this could be so powerful; it sounds sort of innocuous. Then I looked at synonyms of the word (I often find synonyms to offer an interesting perspective on a word’s meaning). Here’s what I found:

Story
Tale
Yarn
Fish story
Fairy tale

Ah ha! Now I get it – tell a story that is rooted in some specific truth but with an edge of humor and human interest, repeat it often enough and it becomes accepted fact!

Now, I do NOT want to make my blog into a political one. I use the above to set up a simple point that I think is important in personal and business situations and has nothing necessarily to do with politics. Often in my business career and in my consulting work in the area of data-driven decision making (for strategic planning or in marketing), I have used interviews with an organization’s associates and executives to get a baseline on the current environment from various stakeholders. Without fail, a common thing I hear is “we have lots of data, we are drowning in data, but we make most decisions based on opinion or conventional wisdom”. Probing a little further, I find that one fact, or a small set of facts, becomes favored (sometimes for pure reasons, but often for political ones) and is then repeated and re-used until it becomes the rationale for many decisions.

A great quote I recently came across is: “data is not the plural of anecdote”. I think I’ll use it going forward to help make the point about breaking away from opinion-based decision making and moving to data-driven decision making. As in politics, we often fall prey to simply repeating (and believing) what we’ve heard before, rather than demanding data-supported facts: fresh ones, from multiple sources, that clearly support recommendations and decisions.