Marketing Led Data Science

Abstract

Companies new to data analytics often give too much leeway to their data scientists. Ultimately, it is management’s responsibility to translate data science activity into meaningful business results. This marketing case presents how a marketing manager effectively led data science using four tactics: team building, a test & learn methodology, weekly data science meetings, and regular whiteboarding sessions. Using these tactics, the manager transformed an unprofitable $3 million per month program into a profitable $50 million per month program. The case suggests that most companies lack not data science so much as the management skills needed to lead it; that learning from data science resources is an effective way to bridge ideas to business results; and that pacing data science with the business can foster increased productivity toward desired goals.

Article

Companies new to data analytics often give too much leeway to their data scientists. Data scientists are bright, ambitious professionals who, in some cases, also have an entrepreneurial knack. So it is natural to let them get ahead of the business when starting out. But data scientists often lack the pacing and organizational skills necessary to drive change across an entire program, process, or product. Thus, it is up to management to translate data science activity into meaningful business results.

As an example, consider the leadership of one marketing manager at a small consumer lending operation who leveraged data analytics to turn an unprofitable program into a wild success.

The manager, John, was responsible for a $2.5 million direct marketing budget. His marketing database included 15 million credit bureau prospects and 1 million cross-sell candidates. When he assumed the role, John’s group focused primarily on cross-sell marketing, per the adage that “your best customer is your current customer.” With this approach, the group was originating $3 million in new loans each month at an acquisition cost well over $1,200 per loan. To reach profit goals, the program needed an acquisition cost below $300 per loan.

To John’s surprise, an internal group of data scientists had proactively built a suite of predictive response models, one for each direct mail campaign. Essentially, this group could predict response for each campaign and thereby identify the ideal candidates. In practice, this would keep John from spending budget on non-responders, significantly improving his bottom line. Even more surprising to John: these models were not being used.

John immediately took the following steps:

1. Team Building

The first thing John did was bring his program managers together with the data scientists. He introduced the data scientists not as resources who simply “pull lists” but as peers helping to understand who the customer really was. John also asked the data scientists to present their predictive modeling results alongside the traditional marketing calendar to give insight into which campaigns to prioritize.

2. Test & Learn

The predictive models disagreed with the marketing calendar on which campaigns to prioritize, and this was a source of tension for the group. The program managers did not want to risk performance by deviating from their plan, and without evidence the scientists could not win them over. John neutralized the conflict by testing the proposals put forth by the data scientists. For example, instead of giving the data scientists full leeway over a 500,000-person campaign, he gave them 10% of the sample on which to test their model.
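A 10% test cell like the one John carved out can be sketched as a reproducible random split of the campaign file. The function name and fixed seed are illustrative; the campaign size matches the 500,000-person example from the case.

```python
import random

def split_campaign(prospect_ids, test_fraction=0.10, seed=42):
    """Randomly carve out a test cell for the model-selected list;
    the remainder stays on the traditional calendar selection."""
    rng = random.Random(seed)       # fixed seed so the split is reproducible
    ids = list(prospect_ids)
    rng.shuffle(ids)
    cut = int(len(ids) * test_fraction)
    return ids[:cut], ids[cut:]     # (test cell, business-as-usual control)

test_cell, control = split_campaign(range(500_000))
```

Randomizing the assignment is what makes the comparison fair: any lift in the test cell can be credited to the model rather than to a cherry-picked audience.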

3. Weekly Data Science Meetings

As results started coming in, the predictive models were more than doubling traditional response rates, which was exciting for the entire team. John hosted a weekly meeting with program managers and data scientists to track results and hear proposals for improving the models. Slowly, over the course of six months, John allocated 90% of the budget to modeled list selection. As was often the case, each scientific proposal met a certain amount of resistance from the program managers, who would have to change part of their calendar or process to accommodate it. John resolved these conflicts effectively through continual testing and accountability.

4. One-on-One Whiteboarding

Understanding the creative need data scientists have for exploration, John held a bi-weekly whiteboarding session with top performers. During these sessions, they would brainstorm approaches for pricing, retention, customer service applications…the sky was the limit. With humility, John learned from the data scientists while actively bridging their ideas to meaningful business opportunities. In one meeting, it was proposed that the direct mail program optimize its entire budget across the suite of predictive models. The idea was appealing until the optimization suggested allocating 75% of the budget to prospecting – a truly outrageous recommendation at the time. True to form, John tested the idea slowly at first, then opened the throttle as results came in. Sure enough, the lower-cost, higher-volume prospecting universe substantially improved performance. Eventually, John’s program was originating over $50 million a month at an acquisition cost below $250 per loan.
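The case does not describe the optimization algorithm itself, but its intuition can be sketched as a greedy allocation that funds the cheapest expected loans first. The segment names, costs per loan, and capacities below are invented for illustration, not taken from the case.

```python
# Hypothetical segments: (assumed cost per loan, assumed loans available).
SEGMENTS = {
    "cross_sell":  (1_200, 2_000),
    "prospecting": (250, 200_000),
}

def allocate(budget, segments):
    """Greedily fund the cheapest loans first until the budget is exhausted."""
    plan = {}
    # Visit segments in order of cost per loan, cheapest first.
    for name, (cost, capacity) in sorted(segments.items(), key=lambda kv: kv[1][0]):
        loans = min(capacity, budget // cost)   # as many loans as budget allows
        plan[name] = loans
        budget -= loans * cost
    return plan

plan = allocate(2_500_000, SEGMENTS)
```

Under these assumed numbers, the cheaper, higher-capacity prospecting segment absorbs the budget before cross-sell gets funded, which mirrors why the optimization’s tilt toward prospecting looked so outrageous at first glance.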

John’s example is noteworthy for three reasons.

First, all the components for driving business value from data analytics were present before his arrival. Indeed, most organizations have sufficient data, some technical resources, and a use case for meaningful data analytics. What they lack is a leadership approach for leveraging those assets.

Second, John was not technical. He was an analytical thinker who thought strategically and delegated effectively. He was not threatened by the data science he did not understand; instead, he used such opportunities to learn from others and, through his own education, help develop them into business partners.

Lastly, John did not let data science get ahead of his group. Instead, he maintained a vision and set expectations in each weekly meeting; he then gave his intellectual and political support to the data science team to reach objectives. At one point, the data science team was receiving credit bureau data 2–4 weeks later than ideal. John put together the business case for investing in an outside solution that onboarded the data faster, helping the data scientists significantly improve model performance once again.

Although it is tempting to treat data science as a purchased solution, doing so is neither necessary nor advisable. As John’s case demonstrates, the proper management approach can achieve meaningful business results through data analytics.