Data Analytics 201: Adding Value with Modeling Techniques

Author: NICSA

On Oct. 18, #WebinarWednesday enthusiasts tuned in for part two of a discussion on data analytics. The conversation picked up where NICSA’s July webinar, “Jump Start Analytics with Data You Already Have,” left off.

Jackie Noblett, Senior Reporter, Ignites, regrouped with experts from Albridge Analytics, DST Systems, Inc., SalesPage Technologies LLC and Franklin Templeton to discuss how asset managers can examine data, add value through modeling techniques and assess whether those actions impact the business. (NICSA members can access a replay of the webinar here).

To get these projects off on the right foot, Lyndsay Noble, Lead Analytics Consultant, DST Systems, Inc., said that the application of analytics within an organization “can’t be done in a vacuum.”

“Strategy is very important from an analytics perspective,” she said. “It can’t continue to exist in small silos across the organization. All of the people who are using the data should be working together under a cohesive corporate strategy.”

David Lieberman, Vice President, Product Manager, Albridge Analytics, echoed Noble’s suggestion. “You need to ensure that analytics initiatives are closely aligned with the overall business strategy and how the organization creates a competitive differentiation,” he said. “As more data gets created, collected and unified across the enterprise, leadership has a real opportunity to ask better questions and leverage data as a competitive advantage.”

Lieberman said the “human element” serves as an undercurrent across these methodologies. “While analytics and other machine learning techniques are playing a growing role in our daily lives, human judgment is still going to remain a vital element when making these strategic and operational decisions,” he said.

SEGMENTATION APPROACHES
RFM Segmentation
Noble focused on recency, frequency and magnitude (RFM) segmentation, a data-driven extension of the segmentations that many firms use right now.

“This methodology can be applied to any type of data for which these measures—recency, frequency and magnitude—make sense,” Noble said. Typically, this approach is used for segmenting on purchase behaviors, but it can also apply to engagement, portfolio diversity and other concepts.

“The data required for this methodology is really very simple,” Noble said. Once you’ve gathered and prepared your data and calculated recency, frequency and magnitude, look at the distribution of those variables and create logical cut points. “After you’ve defined your variables, the next step is to bring your cut points together into distinct combinations,” she said.

Noble outlined best practices for performing RFM segmentation:
• Be conservative with the number of cut points on each variable.
• Include your business experts in decisions.
• Don’t go too far back in time.
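
To make the mechanics concrete, here is a minimal sketch of RFM segmentation in Python with pandas; the webinar itself did not include code. The column names (account_id, trade_date, amount), the as-of date and the three bins per variable are illustrative assumptions, and in practice the cut points would come from reviewing the distributions with your business experts, as Noble describes.

    # Illustrative RFM sketch with pandas; column names (account_id, trade_date,
    # amount) are hypothetical placeholders for a firm's own transaction data.
    import pandas as pd

    def rfm_segments(transactions: pd.DataFrame, as_of: pd.Timestamp) -> pd.DataFrame:
        rfm = transactions.groupby("account_id").agg(
            recency=("trade_date", lambda d: (as_of - d.max()).days),  # days since last purchase
            frequency=("trade_date", "count"),                         # number of purchases
            magnitude=("amount", "sum"),                               # total purchase amount
        )
        # Conservative cut points: three bins per variable, guided by the distribution.
        for col in ["recency", "frequency", "magnitude"]:
            rfm[col + "_bin"] = pd.qcut(rfm[col], q=3, labels=False, duplicates="drop")
        # Bring the cut points together into distinct combinations, e.g. "0-2-1".
        rfm["segment"] = (
            rfm["recency_bin"].astype(str) + "-"
            + rfm["frequency_bin"].astype(str) + "-"
            + rfm["magnitude_bin"].astype(str)
        )
        return rfm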

K-Means Algorithm Segmentation
Moving from a traditional channelized model to a value-based segmentation approach, Lieberman introduced attendees to the K-Means algorithm. “K-Means is a type of unsupervised learning where the goal is that the algorithm does the work on your behalf,” he said. “It finds clusters for you and defines the right answers, as opposed to a more traditional approach, such as regression, which is classified as a supervised learning technique.”

Lieberman said the approach doesn’t require advanced expertise in statistics or mathematics. “As a data practitioner, you’ll simply specify the input to the model and define the number of clusters,” he said. “The goal of this algorithm is to define these organic groups by dividing records into similar classes in an iterative fashion until you have a very compact fit of similar responses as well as differences among your groups.”

But how does one determine the right data to use? “A good segmentation variable is one that explains the variance in the use of your firm’s products and services,” Lieberman said. “The key point is the variables need to have some sort of differential response to your marketing or sales efforts.”
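As a rough illustration of that workflow, the sketch below uses scikit-learn’s KMeans; it is not code from the webinar. The feature columns are hypothetical stand-ins for segmentation variables that show a differential response to sales and marketing efforts, and the number of clusters is an assumed input the practitioner would choose and test.

    # Illustrative K-Means sketch with scikit-learn; the feature columns are
    # hypothetical stand-ins for variables that explain differences in product use.
    import pandas as pd
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    def kmeans_segments(accounts: pd.DataFrame, n_clusters: int = 4) -> pd.DataFrame:
        features = ["assets_under_management", "redemption_rate", "product_breadth"]  # assumed columns
        X = StandardScaler().fit_transform(accounts[features])  # put variables on a common scale
        model = KMeans(n_clusters=n_clusters, n_init=10, random_state=0)
        labeled = accounts.copy()
        labeled["cluster"] = model.fit_predict(X)  # iteratively reassigns records until clusters are compact
        return labeled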

MEASURING SUCCESS
Greg Piaseckyj, Head of Sales, SalesPage Technologies LLC, said when it comes to measuring effectiveness, “we as an industry are still in our infancy.” From a broad perspective, Piaseckyj sees firms falling into three different categories, or states of maturity:

1. “The vast majority of firms fall into the category of still trying to get their data right,” Piaseckyj said. He warned against moving forward and assuming data is good enough: “Wait until you have it right,” he said. “The risk is just too great.”

2. Piaseckyj said a smaller set of firms have completed their due diligence and made final decisions on what outside data they think will be valuable and cost-justified. “Although they are well-positioned to utilize and reap the benefits of investments in this data, measuring success is still a ways off,” he said. “Wait on measurement strategies until you are more experienced, educated and confident in your strategy. Otherwise, you will find yourselves wasting efforts and measuring things that ultimately become irrelevant.”

3. “The smallest set of firms have a clean data set and a mature strategy of how their sales and marketing teams are leveraging data,” Piaseckyj said. “They are now in a strong position to start asking and answering the questions regarding what to measure.”

DEMONSTRATING VALUE
Deep Srivastav, Head of Client Strategies and Analytics, Franklin Templeton, said that no matter what methodology you use, it’s key to demonstrate value “right from the beginning and through the lifecycle.”

Pilots are crucial in demonstrating value, Srivastav said. After you have identified an initial idea, he said, streamline the concept through a “test-and-learn” process that engages the distribution teams.

“That keeps the dialogue between the distribution world and the data world going, and it puts the distribution team in charge,” Srivastav said. “They are able to see there is an opportunity, and start guiding it right from the beginning.”

Eventually, you will need a more rigorous analysis of the impact. “For every targeted strategy, you need to have control groups,” Srivastav said. Be sure to account for similar sales, engagement and profiles, and test for statistical significance to determine the full story.
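As a simple illustration of that kind of rigor, the sketch below compares a targeted group against a held-out control group with a two-sample t-test. It is not drawn from the webinar; the sales figures and the 0.05 threshold are made-up placeholders.

    # Illustrative control-group comparison; the figures below are placeholders.
    # Welch's two-sample t-test is one common check for statistical significance.
    from scipy import stats

    targeted = [120.0, 95.0, 140.0, 110.0, 130.0]   # sales for accounts in the pilot
    control = [100.0, 90.0, 105.0, 98.0, 102.0]     # sales for similar accounts held out

    t_stat, p_value = stats.ttest_ind(targeted, control, equal_var=False)
    if p_value < 0.05:
        print(f"Lift appears statistically significant (p = {p_value:.3f})")
    else:
        print(f"No statistically significant lift detected (p = {p_value:.3f})")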

“Share the results with multiple leaders, and over time, you will build a lot of credibility,” Srivastav said.

NICSA thanks Northern Trust for sponsoring this webinar. To view the archived webinar for additional insight, visit here and be sure to share your thoughts with us.

 


