
EMC Journal





Transformation Play 1: The Performance Zone

In today’s uncertain political and economic climate, the ability to maintain business-as-usual should not be taken for granted. After all, an organization’s ‘status-quo’ business model is typically the source of more than 90 percent of its revenues and 100 percent of its profits.

It’s this “performance engine” that allows the ship to explore uncharted, potentially fruitful waters and steer its way safely back on course if things get choppy. Geoffrey A. Moore defines this concept as the ‘Performance Zone’ in his book ‘Zone to Win’.


But striking the balance between keeping the engine room running to make the annual number and scaling a new, riskier, high-potential market is notoriously tough. Even more so when, like so many of the organizations I speak with, they are desperately playing catch-up, defending themselves against more agile, digitally-centric disruptors in their markets.

How to focus your defensive course 

To these organizations, I say both can be achieved, but it is essential to focus your defensive course. Specifically, manage any self-disruption carefully – you can’t cannibalize your core revenue too quickly, but if you don’t cannibalize it eventually, one of your competitors will. You also need to make the most of the ecosystem of partners that adds value to your established offering; by entrenching your partner network with you, they can help amplify your efforts. Critically, focus your R&D efforts on neutralization, not differentiation – you already have the power of a huge customer base behind you. You just need to develop an initial offering that’s strong enough to retain them, so that each follow-on offering gets better and better at providing a compelling customer experience.

The key is getting to market quickly: take any innovative assets you are working on in the “incubation” zone, put them into service straight away, and focus on integrating them into your current offering.

Using data and analytics to steer the ship more accurately

Ultimately, the engine room runs on data and analytics. It’s all about the thoughtful prediction and management of the numbers to drive gradual, predictable and stable growth. You need to know what your numbers are doing in order to respond quickly to any disruptive changes in your marketplaces.

To use a maritime metaphor, if an iceberg damages the ship, you need to be able to determine why and how it happened, when to replace any damaged parts, and whether to change course (and if so, in which direction).

The average company can only steady itself from this kind of disruption once a year. This makes it all the more important to couple internal and external data with advanced analytics, in order to forecast the unseen as accurately as possible and prescribe corrective actions. It’s crucial to identify, validate, understand and correct underperforming assets – nipping problems in the bud as quickly as possible.

Critically, it’s about anticipating these icebergs BEFORE they appear – converting the enterprise from a reactive one into a predictive one. The new wave of insight isn’t going to come from looking in the rear-view mirror; it will come from predicting disruption and change by looking ahead.
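As a minimal sketch of what “looking ahead” rather than in the rear-view mirror can mean in practice, the snippet below fits a simple least-squares trend to recent KPI readings and flags a forecast shortfall before it materializes. The data, plan figure and horizon are all invented for the example:

```python
def forecast_kpi(history, periods_ahead=1):
    """Project a KPI forward with a least-squares linear trend."""
    n = len(history)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(history) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return intercept + slope * (n - 1 + periods_ahead)

def flag_shortfall(history, plan, periods_ahead=1):
    """Raise an early warning if the projected KPI misses plan."""
    return forecast_kpi(history, periods_ahead) < plan

# Monthly revenue (hypothetical): a gentle decline the rear-view mirror hides
revenue = [102, 101, 100, 98, 97, 95]
print(flag_shortfall(revenue, plan=94, periods_ahead=2))  # → True
```

A backward-looking report would show revenue still above plan today; the trend projection surfaces the miss two periods early, while there is still time for corrective action.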

To put this into context, let’s consider a bank.

The bank may well assume, on the basis of historical insight and traditional hierarchy, that its most valuable customers are the largest depositors. These customers may get VIP access to exclusive services, preferential rates and discounts. However, deep analysis of customer behaviors might identify three much smaller groups of depositors who, due to their banking and credit card behaviors, are more profitable for the bank’s bottom line. It’s only through deep analysis of vast amounts of customer and operational data that you can identify and quantify these situations.
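By way of illustration, here is a toy sketch of that kind of analysis: grouping depositors by a behavioral segment label and ranking segments by profit per customer rather than by deposit size. The segment names and figures are invented for the example:

```python
from collections import defaultdict

def rank_segments(customers):
    """Rank behavioral segments by average profit per customer,
    not by average deposit balance."""
    totals = defaultdict(lambda: {"profit": 0.0, "deposits": 0.0, "n": 0})
    for c in customers:
        seg = totals[c["segment"]]
        seg["profit"] += c["profit"]
        seg["deposits"] += c["deposits"]
        seg["n"] += 1
    return sorted(
        ((name, s["profit"] / s["n"], s["deposits"] / s["n"])
         for name, s in totals.items()),
        key=lambda row: row[1],
        reverse=True,
    )

# Hypothetical data: large depositors vs. heavy card users
customers = [
    {"segment": "large_depositor", "deposits": 500_000, "profit": 900},
    {"segment": "large_depositor", "deposits": 450_000, "profit": 800},
    {"segment": "card_power_user", "deposits": 8_000, "profit": 2_400},
    {"segment": "card_power_user", "deposits": 12_000, "profit": 2_100},
]
for name, avg_profit, avg_deposits in rank_segments(customers):
    print(name, avg_profit, avg_deposits)
```

Ranked by deposits, the large depositors look most valuable; ranked by profit per customer, the much smaller card-heavy segment comes out on top – exactly the kind of result the paragraph above describes.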

And so big data in this context can drive the organization’s “performance zone”: not only helping with customer segmentation and allowing for more effective targeting of products, servicing and pricing, but also helping to mitigate risk and protect margin by providing agile, real-time insight into potential hazards.

For example, this deep analysis might flag customers who are at risk of default, but it could also predict likely fraud activity across the bank’s payments network. Historically, fraud prevention measures were delivered in broad sweeps: that credit card isn’t usually in Taiwan, so let’s block a transaction from the Starbucks on Zhong Xiao Fu Xing Road. Now, however, they can be far more adaptive, learning from each transaction and developing personalized analytic and behavioral profiles that are not only better at picking up genuinely fraudulent transactions but also better at avoiding false positives.
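A minimal sketch of such a personalized profile, with invented thresholds and data: instead of a blanket geo-block, a transaction is flagged only when the country is new for this card AND the amount is an outlier for this customer’s own spending history, which keeps the Taipei coffee purchase from being blocked:

```python
from statistics import mean, stdev

def is_suspicious(profile, txn, z_threshold=3.0):
    """Score a transaction against this card's own history: flag only
    if the country is unseen AND the amount is an outlier for this
    customer (rather than applying a blanket geo-block)."""
    unseen_country = txn["country"] not in profile["countries"]
    amounts = profile["amounts"]
    if len(amounts) < 2:
        return unseen_country
    z = abs(txn["amount"] - mean(amounts)) / (stdev(amounts) or 1.0)
    return unseen_country and z > z_threshold

def update_profile(profile, txn):
    """Learn from each transaction the customer makes."""
    profile["countries"].add(txn["country"])
    profile["amounts"].append(txn["amount"])

# Hypothetical card history: small coffee-sized purchases, one larger one
profile = {"countries": {"US"}, "amounts": [4.5, 6.0, 5.2, 38.0, 5.8]}
coffee_abroad = {"country": "TW", "amount": 5.0}   # Starbucks in Taipei
odd_purchase = {"country": "TW", "amount": 950.0}  # out of character
print(is_suspicious(profile, coffee_abroad), is_suspicious(profile, odd_purchase))
# → False True
```

The coffee purchase abroad passes (in character for this customer), while the large out-of-character purchase is flagged; calling `update_profile` after each legitimate transaction is the “learning from each transaction” the text describes.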

Unlocking true customer segmentation and targeting

Everyone wants to better understand their customers and those customers’ behavioral tendencies. Historically, organizations have had a one-dimensional view based on structured (RDBMS) customer databases. Now they have the potential to also capture, integrate and analyze semi-structured and unstructured data. For example, if a customer calls in with a complaint about something they experienced, you can cross-reference that with their Twitter gripes about it and their location data, and provide a much more personalized response. Likewise, if a false-positive fraud prevention block does happen on a legitimate credit card purchase, you could provide a 5% cashback payment on the transaction, by way of apology, in real time.

This helps minimize customer churn by providing a real-time incentive for loyalty, addressing challenging situations as they emerge. That drives huge savings compared to the cost of recruiting new customers, and further boosts margins in the performance zone.
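The apology-cashback idea above can be sketched as a small event handler; the 5% rate comes from the example in the text, while the handler name and response shape are hypothetical:

```python
def handle_block_review(txn, verified_legitimate, cashback_rate=0.05):
    """If a fraud block turns out to be a false positive, release it and
    issue an immediate apology cashback on the transaction (the 5% rate
    is the hypothetical policy from the example above)."""
    if not verified_legitimate:
        return {"action": "keep_block", "cashback": 0.0}
    return {
        "action": "release_and_apologize",
        "cashback": round(txn["amount"] * cashback_rate, 2),
    }

print(handle_block_review({"amount": 120.0}, verified_legitimate=True))
# → {'action': 'release_and_apologize', 'cashback': 6.0}
```

The point of wiring this into the real-time pipeline, rather than a batch process, is that the goodwill gesture arrives while the customer is still annoyed at the blocked purchase, not days later.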

Enabling the performance zone

The key to delivering this capability for organizations involves circumventing the everyday pain points of infrastructure management. Organizations need to undergo a data modernization process, migrating information off legacy platforms and removing siloed access to that data for analytics systems. In addition, the platform needs to be elastic in nature – not only allowing for scaling, but providing a 360-degree view of the customer and that individual customer’s behavioral tendencies and preferences at any given point in time. The keyword here is ‘predictive’: it’s not enough simply to monitor the past better; we need to predict the future with real-time analytics to give meaningful guidance to the business.

This is where the data lake enters the equation. By providing a holistic repository for structured and unstructured data that supports both deep and real-time analytics, and which scales performance with capacity, you’re able to build the kinds of intelligent applications that provide and exploit this insight.
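As a simplified illustration of combining structured and semi-structured data from such a repository into a single customer view, the sketch below merges an RDBMS-style CRM row with JSON event lines as they might land in a data lake. The record layouts and field names are invented for the example:

```python
import json

def build_customer_360(crm_row, event_json_lines):
    """Merge a structured CRM record with semi-structured JSON events
    into one unified view of a single customer."""
    view = dict(crm_row)
    view["events"] = []
    for line in event_json_lines:
        event = json.loads(line)
        if event.get("customer_id") == crm_row["customer_id"]:
            view["events"].append(event)
    # Derive a simple behavioral signal from the unstructured side
    view["complaints"] = sum(
        1 for e in view["events"] if e.get("type") == "complaint")
    return view

crm_row = {"customer_id": 42, "name": "A. Customer", "balance": 1200.0}
events = [
    '{"customer_id": 42, "type": "complaint", "channel": "twitter"}',
    '{"customer_id": 7,  "type": "purchase"}',
    '{"customer_id": 42, "type": "purchase", "amount": 31.0}',
]
view = build_customer_360(crm_row, events)
print(view["complaints"], len(view["events"]))  # → 1 2
```

Because the lake holds both kinds of data in one place, the structured balance and the Twitter complaint end up on the same record – the 360-degree view the previous section called for.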

Of course, the process of migrating your data into a data lake isn’t necessarily trivial; applications will need to be migrated, and you’ll need to assess where and what data you want to ingest to deliver value to your analytics applications. This is where a consultancy process to map your data transformation may be key, providing you a route from where you are to the nirvana of the performance zone.

When you combine the three – scale-out data lake capacity, high levels of compute capacity and a plan for mining the data for business value – the performance zone will be within your grasp.

 

The post Transformation Play 1: The Performance Zone appeared first on InFocus Blog | Dell EMC Services.


More Stories By William Schmarzo

Bill Schmarzo, author of “Big Data: Understanding How Data Powers Big Business”, is responsible for setting the strategy and defining the Big Data service line offerings and capabilities for the EMC Global Services organization. As part of Bill’s CTO charter, he is responsible for working with organizations to help them identify where and how to start their big data journeys. He has written several white papers, is an avid blogger, and is a frequent speaker on the use of Big Data and advanced analytics to power organizations’ key business initiatives. He also teaches the “Big Data MBA” at the University of San Francisco School of Management.

Bill has nearly three decades of experience in data warehousing, BI and analytics. Bill authored EMC’s Vision Workshop methodology that links an organization’s strategic business initiatives with their supporting data and analytic requirements, and co-authored with Ralph Kimball a series of articles on analytic applications. Bill has served on The Data Warehouse Institute’s faculty as the head of the analytic applications curriculum.

Previously, Bill was the Vice President of Advertiser Analytics at Yahoo and the Vice President of Analytic Applications at Business Objects.