
Lessons from Building a 2,700-Person Analytics Team
Summary
Dave Frankenfield, VP Enterprise Data and Analytics at Optum, oversees a team of 2,700 data professionals. How do you structure a team of that size? What functions does it cover? And how does it collaborate with and deliver value to the rest of the company?
In this episode, Dave discusses the strategies he’s used to build his team, the lessons he’s learned, and the advice he has for data science leaders scaling teams of any size in the enterprise.
The conversation covers:
- Building an analytics team from the ground up
- Approaches to managing shadow IT
- Tradeoffs between distributed vs. delegated data science teams
Transcript
DAVE COLE
Hello and welcome to another episode of the Data Science Leaders podcast. I'm your host, Dave Cole, and today our guest is Dave Frankenfield. Dave, you started your career as a Marine. You're now the VP of Enterprise Data and Analytics at UnitedHealth Group and Optum, and you've been there for a little over 13 years. The fun, interesting fact about your team today is that it's over 2,700 people. Welcome to the Data Science Leaders podcast. I really appreciate you coming on.
DAVE FRANKENFIELD
Yeah. Thanks for having me, Dave.
DAVE COLE
Let's just start with the elephant in the room, the 2,700 people. My understanding is you started your analytics team when it was relatively small. And now you have a very large analytics team. I can't imagine that all 2,700 are data scientists. They're probably a mixture of all the various roles that you need as a data science team. Do I have that right?
DAVE FRANKENFIELD
Yeah. That's a good way to look at it. It's kind of funny. What is an analytics team nowadays anyway? Amongst data science leaders, it's always a fun conversational topic: what is analytics, what is ML, what is AI? But to your point about the 2,700 people, they're certainly not all data scientists. The way that we track it, the people who are really aligned to the more sophisticated part of that trade, real ML and real AI, it's about 650 of those individuals.
DAVE COLE
It's still a very large team. For most of the guests who come on the podcast, that's a massive, massive, massive team. Before we get into the details, I'd love to understand a little bit more about the team and how it came to be and your story certainly coming from the services, Marines, to where you are today.
The other topic that I think will be interesting in today's episode is, obviously we're talking about building the analytics team from the ground up. Another topic is shadow IT. Optum has, and you specifically have, a unique approach to shadow IT. If you don't know what shadow IT is, we'll define it just to make sure you know it because it definitely exists in your company if you're a data science leader.
The last topic is just the age old topic that we talk about quite frequently, which is distributed versus delegated data science team. Most people know what distributed is, but what does delegated mean? So those are the topics. Let's start from the top. So you started out as a Marine and then how did you get into the world of data science? How did you make that leap?
DAVE FRANKENFIELD
It's a fun story to tell. So there was a part of my life before the Marine Corps that I attempted to become a medical professional, medical doctor. Also, I was a college athlete and those two things just didn't work out so well for me at the age of 18. So I decided that I needed to go figure out what I wanted to do and if you're going to go, you might as well go all the way. So I went into the Marine Corps infantry and decided that that was going to be my way of finding myself at least for my adult life.
This all for me comes to a really interesting moment in 2001. Everybody in the military, particularly enlisted people, you get long weekends, you get a holiday, you get a bunch of people that are heading in the general same direction. You all hop in a car and you either get dropped off along the way, or you get driven home. And Thanksgiving 2001, I'm on my way back to Western New York. And while we're on the road, we hear on the radio that New York state's going to get about three feet of snow in 24 hours. They end up shutting down the entire New York state thruway system. There was a guy in the car that was with me from Virginia. We were coming from Camp Lejeune in North Carolina. And he said, "Well, why don't you just come home with me? You can spend Thanksgiving weekend with me and my dad and we'll just have a good time."
At that point, I'm up for anything. I'm on the road, I can't go home. So we're pulling into the Great Falls rest stop in the Virginia area. If anybody knows anything about that area, it's a very affluent, nice area. I started looking at this guy, I'm like, "Man, what does your dad do?" He starts to explain to me that his dad's the postmaster general of the United States. You would never know it by meeting him, having beers with this guy, that his dad is, in a way, that famous, that he has such a prominent position in the US government.
I spent the whole weekend with him and his girlfriend and my buddy. And of course, I had to ask, "How did you become the postmaster general of the United States?" He went to UNC Chapel Hill, majored in industrial engineering, and only got a bachelor's degree. Started working at, I think he said, like a three person post office somewhere in North Carolina and 30 years later, he's the postmaster general of the United States. So shout out to Ryan Henderson and his family and Mr. Henderson for having me that weekend.
But that really set me on my journey. I couldn't think of a better experience to say, "You know what? I'm going to go to college. I'm going to major in industrial engineering. And if it's good enough for the postmaster general, it's going to be good enough for me." This is obviously in the early 2000s. The vast majority of people would agree with me that data science as an industry didn't necessarily exist.
You had, particularly in other global markets, people taking engineers and then training them up in what was the applied mathematics space. I focused on operations research. There's always that bent to it, and OR is the grandfather of data science. And then I always like to joke, it was about that time period that math majors started to figure out that they actually have some sort of role in the commercial industry. Then all of a sudden, all the other majors just kind of took it away from them.
So during that time period I ended up going back to Western New York after getting out of my enlistment in 2002. I ended up going to SUNY Buffalo, a great engineering school. I'm still involved in the school up there. Extremely appreciative of my undergraduate education there. I started working on military problems. I started looking at swarm theory, autonomous UAV, routing optimization, that type of thing.
I always point out that people who really understand the space find this to be a bit absurd: I actually turned down an NSF grant to continue that military research. I had a bit of an epiphany. It wasn't like I was working on things that directly killed people, but I certainly didn't want to work in an industry that was so obviously part of the greater military complex. That's where I tell people that healthcare found me.
I started focusing on trying to figure out what other industries or avenues I wanted to pursue. It was easy for me to get into the Six Sigma side of process improvement, industrial engineering. Those things are so entangled, they're pretty much the same thing. At least they were in the early 2000s. I started working in medical devices. Started working for a third-party medical device manufacturer, looking at TQM, Toyota Production System types of approaches to what they were doing.
DAVE COLE
What is TQM?
DAVE FRANKENFIELD
Total Quality Management.
DAVE COLE
Got it.
DAVE FRANKENFIELD
It's kind of funny that you ask because I think only the older generations are familiar with that. It's such an older concept compared to where we are now.
DAVE COLE
Thank you for the compliment by the way. Even though we're probably the same age, but anyway.
DAVE FRANKENFIELD
I have that stamp as well. As I went through that journey and just to round out all the details, I ended up doing my grad work at Purdue. I started working with large provider hospitals. I started working with the VA hospital on some clinic optimization work. I worked with a county hospital, Wishard Hospital in Indianapolis. So I got to see a totally different dynamic there from a county funded perspective and what care delivery looks like. Eventually that's where UnitedHealth Group found me. 45 minutes up the road, still in West Lafayette and started working in their individual line.
So just to get to your core question, how did I get into data science? As I went through that journey, and if you look back at where data science was, the history or the progression into what is now the commercial industry of data science, it really didn't exist. It was really the moment right before it became a real industry. Being an engineer, as I went through my career with UnitedHealth Group, bringing in the management science, bringing in the applied mathematics, it was pretty easy for us. We were positioned in the right place. I was positioned in the right place to then start leading those endeavors and those investments for UnitedHealthcare at the time.
DAVE COLE
Well, you're making it out like, right place, right time. But I'm a big believer that you make your own luck. Clearly, when you got in that car, you're the one who had the conversations, you were intrigued. And the big moment to me in that story is when you decided that you're going to school. You're going to leave the military and go back to school. There's an expense there, there's a big leap that you're making. The fact that you talked to the postmaster general and saw how he started, it sounded like at the ground floor in a small post office, and then made his way all the way up to the top.
When I look at your career, it's not too dissimilar. You started at Optum, looking at your title here, as a senior business process consultant. Sounds senior, but maybe it wasn't the very entry-level role. It was an individual contributor role, I'm guessing.
DAVE FRANKENFIELD
Absolutely.
DAVE COLE
So you weren't managing people right out of the gate. But then you grew, you eventually became a leader.
DAVE FRANKENFIELD
Thanks for pointing that out. To be honest, I never really thought about the parallelism between myself and Mr. Henderson, the postmaster general there, but I would certainly agree with you. At the time, for people that are interested in this, for people that are still in, or maybe have exited the military recently: when I exited, there really weren't a whole lot of programs that were out there. Of course, you have your exit process and there are some professional programs on how to prepare resumes and that type of thing. The GI Bill wasn't quite as much as it is now, but we still had the GI Bill.
Of course, there are a lot of other organizations out there that are really anchoring on getting military personnel into higher levels of the corporate organization. So I certainly started off as an individual contributor. When I think about my time in Indianapolis and starting in the individual line, which in itself is different, it's almost like a separate organization even today from a structure and culture standpoint, it was an interesting time period. It was me and a ragtag group of five people just trying to help that business find value wherever we could. A very technical group: a couple of engineers, a couple of business people that really understood how to bring a technical perspective or mindset to these problems. For a solid year and a half, we were just, in a way, battling it out in that individual line and trying to get them to think more disruptively and, like I said, trying to find as much value as one person could at a time, one project at a time.
DAVE COLE
One step at a time, one project at a time. Clearly, there's a lot of hard work and effort that has gotten you from A to B. One of the biggest steps is moving from an individual contributor and then becoming a manager. Was there any moment in your career where that happened? Did somebody just promote you and you became a manager? Or is there a story there that might be interesting? Did you actually actively pitch a project or something like that and then form a team? How did you make that leap?
DAVE FRANKENFIELD
Let me answer that with a couple of different components to it. Of course, I try to be a little bit humble about the notion of being lucky or timing, the people that supported me through my career, but of course, part of that is me choosing or recognizing that it was going to be a good environment for me to be successful in. From day one, when I joined UnitedHealth Group, the person that hired me and his manager, we worked together for almost a decade. The ecosystem was just perfect for me to be successful.
When I went through my journey, at first we had very specific things that we wanted to get done. It started as that process improvement, Six Sigma Black Belt, OPEX type of role. And then as we were becoming a little bit more successful, creating a little bit more gravity around the type of things that we were doing, we ended up looking at UnitedHealthcare. I made a transition personally and professionally and started looking at, well, what are some of the biggest operational problems there?
At the time, particularly for people that have been in the healthcare insurance industry, even today, but more so over the history of 10, 20, 30 years, growth in our industry was mostly by acquisition. And so we were still 32 different companies in the back office. If I had to pick a moment or a project, that certainly was one of the biggest pivots for us, because our mission in a very general way was, how do we start standardizing the back office? But for me, what I started to recognize was that this was a moment to start bringing in a lot of that management science, a lot of the engineering mindset, just the math into, how do we approach and solve these problems?
I did that for a couple of years and that led us to what I consider to be our seed investment into the advanced research and analytics team in UHC. So that moment over that couple of years, that mission of addressing the disparity in the diverse ways that operations were set up in the back office, then led us into what I would say is our first real commissioned effort from a data science perspective at UHG as a whole.
DAVE COLE
So you have 30 different disparate companies that have been stitched together. You saw a problem there. I imagine the problem was data standardization, that you maybe had lots of different models out there that were essentially doing the same thing, that there was little to no collaboration, that you had pockets of teams. Am I right in that? The main project was just trying to eliminate some of the waste and increase collaboration.
DAVE FRANKENFIELD
Yeah. There were certainly, and don't quote me on the number, six, seven, eight different dimensions to the program. Some of them were fairly foundational: what's the standard metric system going to be, and creating consistency. Let alone thinking differently about what productivity was. I'm sure there are still people out there in companies that look at productivity as piece rates per person or that type of thing.
We really took more of an IE based approach to looking at standard operational metrics. In a lot of cases really redefining those. So that certainly was a part of that. Starting from the ground up. If you're thinking about what I think is one natural progression of, how do you get into data science, particularly in a business like ours? You have to start foundationally.
Where it really starts is, how do you start modeling the actual organization itself? I mentioned before that I'm more of an OR person; I studied more on the OR side of this. So there's the notion of how you do the operational modeling, from setting work standards to getting a lot more sophisticated about how you do that work breakdown structure and model it. What systems do you need in place to essentially capture the different flows and the way that widgets or work objects are flowing through the system?
A lot of the time, those things aren't in place. Particularly back through the '70s, '80s, and '90s, you didn't need to know how many times a work object went to different business partners or whatever. All you needed to do was sit at the back door and just watch the stuff come out, and then you would count that and look at whatever your investment was by time or money. That's how you would manage your business. But we started really cracking that open, I would say in an attempt to model the business at a deeper level.
And then that started leading us into the natural progression, which is then all of the planning platforms that we had around those operations. One of our challenges, which is still a curiosity of mine is, I talk to people from different companies in different industries. When you look at a company the size of ours, we can't have hundreds of logistics professionals in our business. Particularly, in the healthcare industry, what you see a lot in planning are domain experts that have been asked to start taking on workforce management roles and so on and so forth. And then eventually they might infuse one or two logistics professionals, some software, whatever that was. But for us and what UnitedHealthcare was trying to do at the time, it wasn't going to be robust enough.
So we went from the standard operating model work and started getting directly into the planning platform, which, for an OR person, was a natural fit. OR invented logistics, is what I always tell people. So it was a very easy progression, and obviously we had a lot of success in doing that and really rounding out the operating model piece.
Just to throw it out there, because you asked the question about what some of the components in that operating model standardization work were: we also did a lot of work on soft skills. And to your question about leadership and preparedness, even going back to the military, we had a couple of different components where we were looking at change management and situational leadership, and we actually had the opportunity to go to all these different centers and train managers, not only on the science part of all this, but on leadership.
I look back at not only my professional career in the military, but my time playing sports. Obviously as I transitioned to the commercial side of my career, it's an interesting school or series of schools of hard knocks in that sense. I certainly think that, and I say this to my kids, one of the things about playing sports is that you just learn how to work with people. That dynamic certainly was true for me.
When I look at the military time that I had, I had a very successful career in the military, picked up E5, or sergeant, in three years, and as an infantryman, that isn't necessarily a given. You have to be pretty good to be able to do that, and I learned a lot from that. I led large groups of individuals. And then of course, you start learning what corporate management is. So now the question is, how do you triangulate those three things? That was an interesting experience and I still leverage it today. Sometimes I go back to some of the more aggressive or challenging dynamics that are associated with the military. And obviously, being a bit of a corporate leadership scholar, I'm a big fan of situational leadership.
So all those things were infused in the program as well, and they were running what we were calling the United operating model. That ran for about eight years, up until a couple of years ago, when it reached a level of maturity. Sometimes these corporate programs reach their shelf life. It was a big period for me, and I'd like to think we had a huge impact on UnitedHealthcare at the time.
But then just to transition back, how did we progress into data science? If I look at where I left off, in the notion of moving into logistics, what people started to realize was that some of the math that we were bringing to the table, no surprise to us, was applicable in different ways or in different areas. We started getting into, how do you model quality control in different ways? How do you take different sampling approaches to minimize the administrative overhead of doing quality across claims, calls, and other types of functions in healthcare back offices?
We started looking at ways to model performance guarantees on contracts a lot differently, using a lot of applied math and OR to predict which accounts were carrying risk, looking at when we thought they were going to become risky enough to create or activate some sort of contingency plan. Things very similar to what you see in financial services, in the way they look at portfolio risk, just didn't exist for us at the time.
And so after a lot of that success, I still recall the moment where myself and my manager and his manager, we all got in his manager's office there. And he looks at us and he basically says, "I'm going to give you $2 million to start an advanced analytics group. And the only request I have of you is to break even within the first year." I mean, how about that for support?
DAVE COLE
How did that play out?
DAVE FRANKENFIELD
Well, I'm still here so obviously that worked out pretty good.
DAVE COLE
Yeah. Obviously, it went well. So what'd you do? Hold on, I'd like to recap a little bit. Your story is a fascinating one. Clearly your background from the military gave you strong leadership qualities. But also you're focused on OR, with your expertise in process improvement, being able to measure quality controls and look at work product and productivity. This OR background was applied to the internal workings of UHG.
Clearly you were successful, and then one day this leader turns to you and says, "Hey, I want to build out an analytics team and I'm going to give you $2 million to go ahead and try this as a pilot. And I just want you to break even." Can you share what were some of the low-hanging fruit that you tackled first to be able to break even? I imagine that breaking even meant looking at the ROI of the analytics that you were putting in place. So you were actually able to tie, whether it be a model or what have you, to some positive outcome and measure that.
DAVE FRANKENFIELD
Yeah. And some of the characteristics of what I'm about to describe actually tie into some of the agenda for our conversation today, particularly the shadow IT comment. So keep in mind, all of this is now happening on the business side of our organization.
DAVE COLE
Got it.
DAVE FRANKENFIELD
So I'm working essentially in a technical role on the business side of UnitedHealthcare at the time, the gentleman that's funding this is funding this out of OPEX dollars. This is not a capital project. And at this point, there's no integration with the CIO. We're just off on our own little island and we're going to go commission this team. With that said, maybe the most direct answer to your question is that then, because this is on the operation side, the business side, we actually have a direct line of sight into a lot of operational problems that have real dollar values attached to them.
I'm sure a lot of data science leaders can see this or experience it. So that means that we're not just creating a team or eventually a bunch of solutions that are hammers hunting for nails. We already see the nails and now we're looking at, which ones do we pick that are essentially going to be our proof of concept? And so that certainly was part of the secret of success there.
The other thing too that when I look at that moment that was really pivotal for us was, trying to make a decision on how we were actually going to find talent. What was our kind of sourcing strategy, if you will, for talent? My manager at the time was also responsible for UnitedHealthcare's global resource and vendor management strategy. He did a lot of that work at a previous company, extremely well connected and networked. And so almost from day one, we were thinking of a global or multi-shore model to build out resources.
In the mid-2000s to early 2010s, India itself had emerged and established itself as a very strong third-party partner in the analytics space. You started to see companies emerge that were solely focused on analytics, and not IT vendors that just happened to have analytics teams. So very good timing on being able to do that. And we also leveraged academic relationships at the time as well, which is clearly a big subject today amongst a lot of mature analytics programs in corporations.
But for us, particularly with obviously a limited amount of funding and wanting to do as much work as possible, we thought really creatively about how we are going to source talent, where were we going to get it from? In what type of arrangement were we going to obviously buy talent, or were we going to hire it? How did we work with international people that were at universities and graduating and didn't have homes? We found a lot of talent with people that were graduating with masters and PhD degrees, and they just couldn't find a job. So we were able to find a lot of top tier talent really quickly. And then, obviously get that focused on that one year ROI expectation.
DAVE COLE
That's great. It sounded like you went after a mix of folks coming straight out of college with an advanced degree. If they're having trouble finding a job, my guess is, the price point on being able to hire them was reasonable. Then you mix that with maybe more experienced services organizations based out of India who had done numerous analytic projects before. So bringing those two together and making that all work, obviously tapping into your management expertise is a really interesting approach and model.
You brought up shadow IT, and I did want to talk about that. So for those of you who don't know, shadow IT is basically this: within companies, you have a formal IT department, and their general approach traditionally has always been that they're going to be the ones that define what software and what platforms the users end up using. And they're the ones who support it, upgrade it, and maintain it.
Shadow IT is when you might, maybe you're on the business side, or maybe you're just a team that wants to take a different approach, wants to use a different tool, wants to use a different platform. Maybe wants to use the cloud when the corporate IT hasn't moved into the cloud yet. And you go ahead and on your own, without the blessing of IT, you purchase software and start using it on your own to build out your projects.
Correct me if I'm wrong, that's my understanding of shadow IT. But Dave, I personally believe that shadow IT just naturally happens, especially today when it really just takes a credit card to stand up an account on AWS or Azure, or what have you. It's very easy for shadow IT to proliferate. But what is your approach to managing shadow IT?
DAVE FRANKENFIELD
Let me first start with the way that you define that. I agree with the way you defined it, and there are a couple of other ways or scenarios where the shadow IT label comes out. An easy space to see this in is BI. I would imagine that as people look at how BI is done in their organization, when you go to look at who's managing the data mart or the data warehouse behind that Tableau or Power BI platform, if that person's in the business, you can consider that to be shadow IT.
Keep in mind, to put things into context in my journey: we started off more as an engineering team that I would say evolved quickly into more of an analytics team, in the way analytics is discussed or talked about today as distinct from data science. And so as we were doing that, we were starting to pick up and create capabilities around automation and putting them into my team. This was before the RPA market was established, so we were building our own bots out of either Java, C, or .NET, that type of thing, in a non-IT way. So that was the first place that we got labeled as being shadow IT. But at the same time, because we weren't taking capital approaches to sequestering data and building out the data assets required to support a lot of this analytics, we were also investing in those skill sets before we officially started that advanced research and analytics team.
So what for me is interesting, and back to something I mentioned earlier, the notion of what is an analytics team: well, I actually don't like to describe my team, or the teams that I've managed in this space, as analytics teams. I actually think, as people hear us talking about this, and maybe they reflect internally about how their organization is set up, one of the things that differentiates us, and one of the limiting factors of other teams that are labeled as analytics teams, is that they're so singularly dimensional that it causes issues that we've adapted to in our business model. Simply said, on any team, why would you want to depend on five, six different teams to get to value?
So in that case, we started putting as many of those resources or skill sets into our team as we could, so that we minimize dependencies on other teams. So even today, with 2,700 people, I have a large team that manages a lot of our legacy data warehouses, things that are in DB2, Teradata, Oracle, et cetera, all the way through a lot of the ecosystems, in a shared-service way, where that data lands for a lot of businesses for, let's just call it, intelligence pipelines, whether it's BI or analysis or deeper analytics, and then all the way through all of those other services. We still have that integration layer, whether it's getting things into the back end of some of those assets or transactional platforms, or if we need to spin up some level of automation around those.
In that case, our approach to our business model wasn't just data science. It was the end-to-end pipeline, or the full stack of those capabilities, to minimize dependencies on other teams. Just to be a little bit more clear on my point, I see this internally. I'm not the only analytics game in town, I'm not the only analytics team in the business. But there are other analytics teams that are solely focused on the data science modeling, and they have a lot of dependencies on other teams to go and procure the data. There's a disconnect between their ability to understand the domain that they're working in. And even if, let's say, they have some sort of breakthrough from a data science perspective, they have to rely on one, if not more than one, group to figure out, how do I get this integrated back into the business process?
DAVE COLE
So you have that entire stack basically. You have people on your team who are responsible for what you call the integration, which I imagine is a combination of, on the back end, the data integration and data engineering required to bring the data from the source systems into your various data warehouses, data lakes, et cetera, and also, on the front end, being able to integrate into downstream systems so that you're putting models or analytics at the forefront of decision-making. And then you also have the analytics professionals or the data scientists. I know you brought up the term analytics a few times. I've heard it all over the map. Some folks will say analytics includes data science. Some people view analytics as being more of the descriptive analytics, or more of the BI type of approach. I've heard both on the Data Science Leaders podcast.
Anyway, so you have that all under one roof, which gives you a lot of autonomy in being able to deliver value. You're not dependent, as you mentioned, upon other teams. If you were just a pure data science COE, you'd be dependent upon somebody making sure that you can get access to the data, or you may not have the ability to put your models into production, so you might be dependent upon some IT partner team to put those models into production and integrate them with downstream systems.
DAVE FRANKENFIELD
That's right.
DAVE COLE
That makes a lot of sense. That helps me better understand the large team that you have. But then when it comes specifically back to shadow IT, in a way that $2 million investment, on the business side, you were shadow IT in a sense.
DAVE FRANKENFIELD
Absolutely.
DAVE COLE
I imagine you were somewhat okay with it. Am I putting words in your mouth?
DAVE FRANKENFIELD
Well, we didn't call ourselves shadow IT.
DAVE COLE
Sure. Of course, you didn't.
DAVE FRANKENFIELD
Of course, I was okay with it.
DAVE COLE
Nobody ever thinks that they’re shadow IT. They think, no, we're doing something really great and I have a budget, I can do it.
DAVE FRANKENFIELD
There are a couple of interesting aspects of that. When you start looking at those skill sets that are traditionally associated with IT groups, like the data engineers and application engineers, and even in some organizations the RPA and automation folks, absolutely, that's where us first being associated with shadow IT came from. I want to get to what I think is the most profound point on this, and it's something that I've been preaching lately: what the tech side of organizations has to realize is that shadow IT, where that money's coming from, where the sponsorship is coming from, is a clear demand signal that there's a problem to be solved.
What we really need to be thinking about is how an organization's technology function looks at those investments as a parallel path to what IT might want to do more formally. And so in that case, it shouldn't be viewed as competition; it should be used as a short-term supplemental approach to solve those problems. What I find interesting there, and this is where I find my team today, is that I look at it as a paradigm. If you talk to business leaders, what they want to solve their problems are services. They don't really care how it's done; they certainly care about the money. But if you can service their problem, they're happy. And what's awesome about services is that services are lean, they're fast, they're right to the point, and there's obviously less overhead than in anything associated with more formal, capitalized types of approaches to solving problems.
As you shift along that paradigm, as you go to the other end of it, what you have is a platform or product play. And in that case, you're talking about what I'm going to call classic IT: associated with expensive, slow, extremely formal, high resiliency in what's being created there, but businesses can't wait for that process. And then somewhere in the middle, as those services transfer over, you define a middle space of capabilities.
And so simply said, IT always approaches problems as a product or a platform play and businesses always approach them as a service play. And the downfall of that is that businesses are never incentivized to then ask themselves the question, after I establish the service that solves my problem, how far up the maturity scale should I be pushing this thing? There's no incentive there. Either IT comes down and hammers them to move it or the gravity keeps it down on the service side of that paradigm.
So back to the notion that if we could stop thinking about shadow IT as shadow IT, and think about the notion that the entire organization really needs to be looking at problems along this paradigm, then it starts to become a lot cleaner how an organization's product or platform roadmap works in complement to these classic shadow IT-type endeavors. And what they should be saying is, "Okay, if I can see a path to value right now, why would I not go get it? If it's on the left-hand side of that spectrum, somewhere down the road I have to force myself to answer the question, when does it get into the formal product roadmap, if it's required at all?" That type of symbiotic relationship between IT and some of these shadow IT groups doesn't necessarily exist in a lot of organizations today. It's still viewed as two groups competing.
DAVE COLE
I completely agree. The supportive approach would be to see that if the business is standing up some system in the act of servicing your members, they're doing it for a reason. They're doing it because of speed, they're doing it for some reason. So there's a signal there, and on the IT side, if you're seeing a similar system being stood up on the business side over and over again, that signal should make its way into the roadmap, whatever it may be. Whether it's standing up a machine learning platform or a data science platform, whatever it is, there should be a signal that goes back into IT and gets added to the roadmap, and they should be learning from it instead of seeing it as a threat or as something that is being created separately.
So I can see that you're not a big fan of the term shadow IT, because you see it as a potential positive, a potential way for the organization to grow up and for IT to be more supportive of solving the business's problems.
DAVE FRANKENFIELD
Yeah. I mean, I don't really get upset about the term shadow IT. If anything, it lets you know at least they're paying attention. But what's interesting is that if you start breaking down the life cycle of this. If you look at annualized product and platform roadmaps, how does that process work? Somewhere mid year, the year prior, you lock down funding, you lock down what you think is going to be the roadmap. For the most part they're fairly inflexible. In some cases, I've seen during my time here, sometimes that product backlog, the product roadmap is actually driving the business. It's the tail wagging the dog.
But to what we're talking about here, if you can view other investments off of that product or platform roadmap as a supplemental endeavor, particularly if you start getting into things like... that's where you start experimenting with your horizon two and horizon three concepts and ideas. That has to be off the product roadmap, and then the things that actually become valuable should be feeding into next year's product and platform planning process. That's the way that it should work. The good organizations have figured that out, and some organizations still struggle with it.
DAVE COLE
So allow for that experimentation. That's where a lot of the innovation comes in and building that into your process, is what I hear you saying.
The last agenda topic is a distributed versus a delegated data science team. So what do you mean? Distributed, to me, means you can either centralize all your data scientists so they report into some DSL, a data science leader, and they can be part of a center of excellence, the COE, or they can be distributed and embedded into the various business units. You can also have dotted-line relationships where maybe they're dotted-line into the business side, but solid-line into some sort of a data science leader. But I haven't heard the term delegated. So tell me a little bit more about your philosophy on the model that works for Optum.
DAVE FRANKENFIELD
Yeah. I'm going to try to, in parallel, talk about this in terms of what the model is and then how you actually get there. Because the notion of where you start is actually more important than where you're trying to go and trying to do the change management or the pivots to get there. So for us, first off, we operate in a very highly matrixed environment. I view my team as a corporate shared service across all the different types of skill sets and capabilities that we have. So highly matrixed in that sense.
To me, distributed means classically what we call decentralized, 100% decentralized, whereas delegated is more of that federated hybrid in between that and the fully centralized model. When I look back at my history, when we started that advanced research and analytics team at UnitedHealthcare, it made sense to be centralized. It was one of our first formalized investments in creating a data science group. It was a small team, small mission, and it worked.
But as time went on, other teams became analytics groups in that sense, and what ended up happening was, we ended up highly decentralized. I'm sure there are businesses out there that are experiencing the same thing. For people that know the structure of UnitedHealth Group, between UHC and Optum and all the Optum businesses, we're essentially six, seven different businesses under the UnitedHealthcare or UnitedHealth Group brand. And so all of those businesses were making their own independent investments.
Meaning that there was no alignment, no incentive for those groups to work together at the time, because there was no COE-based approach or strategy. There was no conversation around standards in the analytics space. This was before the notion of data marketplaces and singular analytics platforms became part of the general conversation. So you just had pockets of really smart people doing pretty good work with no way to really know how capable we were in the data science field.
So in an organization like ours that's highly matrixed, that essentially operates seven different businesses, what I mean by delegated is that there's got to be some way to connect those teams together. And in this case, we have what I'll call a bit of a hybrid approach to having an analytics COE. Officially, we do not have a chief data or chief analytics officer, so like a lot of organizations, we make it more difficult for ourselves. But what that really means is that you have to figure out a way for those teams to start creating a community.
One of the roles that my team plays is that we're starting to pull that community together, or what I phrase as filling the chief data and analytics officer gap: doing the things that we know we need to do to be a mature analytics organization without having one person dictate all of that for us. I don't think it would make much sense for us to do it that way, particularly at the scale that we're talking about. In general, if you think about data science at UnitedHealth Group, in the broader sense of that definition, we're anywhere between 4,000 to 5,000 analytics professionals across the entire enterprise.
So there's no way that that's going to work well, particularly if you take what is a decentralized operating model, in the way that you see those smaller, more agile teams operate; that doesn't scale. So that's why I talk a lot about the notion of distributed versus delegated. The notion of distributing or delegating those teams somehow back to a greater mission in the organization becomes really important, particularly at a scale like ours.
And what's funny about this, something that I certainly wanted to bring up and a question that I get all the time, particularly with 2,700 employees: to me, it's not a question of decentralizing or centralizing. What starts becoming interesting, and why it works even at my team's size, is that one of our secrets to success is that it's not about the notion of analytics anymore, it's about the operating model. How do you operate differently at scale? How do you add enterprise value at scale?
And when you think about, as I laid out, that context of being seven different businesses, one of our opportunities internally, I draw a parallel to the way that the McKinsey Knowledge Center was created. I support almost every business in UnitedHealth Group. I can see across all those portfolios and business strategies. I can see the common capability, the common problems, the common solutions that those businesses are asking for. Even today, some of it is because of the way we need to be structured. UnitedHealthcare is asking for some of the same solutions that Optum is asking for. And you can't expect those businesses to organically work together to solve those problems in a singular way.
So we can play a really big role in finding those enterprise-level common capabilities and being able to work with those businesses, bringing those stakeholders together. It seems like a no-brainer, but from an execution standpoint, we all know it doesn't always naturally come about that way.
The other thing that I'll mention too is that, for me, at scale, what starts becoming really interesting is how we start thinking differently about our internal structuring. When you're decentralized, when you're a small team, you're a small pocket of highly dedicated resources, small project teams, small scrum teams, that type of approach. But when you get to scale, particularly as I mentioned before, as we've layered in different types of skill sets and capabilities, we've actually created our own internal shared-service model. We've created other roles, just like you would expect to see from a third party or a consultancy, like a client services role, that essentially bring all the delivery leaders and all the capability leaders together. That's how my internal team functions, let alone how we function with the rest of the enterprise.
As you do gain mass, there is enterprise value to be had. Particularly if you're in an organization where you have a lot of divisions that are still fairly autonomous and mutually exclusive, you can play a very big role in not only infusing the best levels of technical skill set and innovation into any given business vertical, but thinking more broadly across the enterprise. But then the trick is figuring out a way to operate, because a lot of people say, "Well, as the team gets big, it starts taking on some of the classic problems of size, speed, expense, those types of things."
So that is my point of view on the notion of centralized, decentralized, or whatever you want to call the space in between. And we're certainly in that hybrid model. But like I said, the second part of how I'm going to answer your question is that it becomes really important how you get there. Unless you're literally the data science leader from the first dollar of investment in your organization, you're starting from one of those three positions. And now what becomes really important is not only which business model or organizational structure you want to get to, but how you're going to work with or incentivize all those teams to align to that model.
I can give you a bit of a general example on this. I've seen a couple of analytics COE attempts at UnitedHealth Group over the past eight years, and we're starting from a position of almost 100% decentralized. Typically, how these approaches go is that somebody gets asked to take on the mission of creating a COE, they hire a whole bunch of good people in, they have a great construct. They went out there, they did their research, they pulled some Gartner papers down. They know what an analytics center of excellence really looks like.
But then what ends up happening, particularly with COEs, is you get into the funding side of this: where's the money coming from? And a lot of the time, they have to go and allocate those monies from other people's OPEX budgets and that type of thing. And then they just expect people to behave in the right way. It's actually one of the reasons that I have a bit of a joke: I don't necessarily like the term center of excellence. To combat that, I always say, "Well, why don't we just start with a center of good first, and then we can figure out if we want to make it a center of excellence."
But in that case, I've had situations where I've had to pay for those centers of excellence. And what they didn't do was create a dynamic where the more I'm engaged with the center of excellence, the more I can influence the amount of money I'm paying to that cost center. Those types of dynamics get missed a lot. A center of excellence for an organization, particularly in a decentralized org, makes a lot of sense. If you do your research and you can start figuring out what a center of excellence should be responsible for, almost as a wrapper around existing analytics teams, I have absolutely no problem with that.
Where I see this fail all the time is that you don't think about the nuanced mechanics and incentives of how you're going to drive behavior. Which leads me to the last thing I'll say about COEs: another thing that I don't like when I get into COE conversations is that it automatically starts going into the notion of creating an organization. And depending on whether you have funding or not, the thing that I always point out is that the number one value proposition of a COE is that you're changing organizational behaviors across the community. Do you need to create a COE to change that? If you can identify what behaviors you want these teams to start demonstrating, do you need a COE for that? So it creates a really interesting dynamic about how you do a COE right, what the value of a COE is, and obviously success and failure and the dynamics that drive either one of those outcomes. So that's my little insight and point of view on centers of excellence.
DAVE COLE
I like that. If I were to summarize, that's a good question to ask: if you believe that the best way to foster collaboration and upleveling of your data science (and you can have a center of excellence for a lot of different roles within the organization) is to bring them all together formally, all reporting into the same DSL. One devil's advocate question is, are there other ways? Can you still have these teams be in a hybrid model, or embedded into the business units, but foster collaboration in a different way without bringing them all together in one place?
Invariably, when you do that, when you bring them together in one place, you might have great collaboration. However, they can lose touch with the, in your case, six or seven different business units and what their particular challenges are. On the other hand, what they could be good at is being able to look across those six or seven different business units, seeing commonalities, like common problems or common issues, and being able to solve them in a much better way. So there is a balance there. But on the whole, a devil's advocate question of, is this really the only way that you can foster that great collaboration and thought leadership, is a good one. Before you embark down that path of building out a COE, I think every organization should be asking that question for sure.
DAVE FRANKENFIELD
That's right. I do. Because in my own experiences, I've obviously seen more COEs fail than succeed.
DAVE COLE
If you've seen a couple, they can't have all been successful. Yeah.
DAVE FRANKENFIELD
Yeah. As I'm thinking about our conversation and the way that you just summarized it, the notion of centralizing or not centralizing is all about perspective. It's the same debate about whether analytics organizations should be on the IT side of the organization or should not be within IT shops. And just to point out, I was on the business side all the way up until eight months ago. I know when we first met, I told you that about eight months ago I made the transition. I brought my analytics team with me and we moved it into Optum technology, almost in a way to start fostering that platform-based approach across all of these businesses and looking at how you think more on the formal side of delivering analytic product, analytic platform, that type of thing.
But to the point, if you are going to centralize, it's not analytics as usual, you really have to start thinking about the way your team operates and engages with the enterprise differently. I like to always point out and we all realize that particularly over the last eight years, the ability to run an algorithm is becoming more and more commoditized. What really defines success in these organizations are the things that we always overlook in analytics. What's your operating model? What's your engagement model? How are you going to essentially execute and deliver value? What does that whole process look like? Because almost in a way, and hopefully a lot of people would agree with me on this as well, doing good analytics is almost a given. Doing great analytics takes effort, but being a great analytics organization is more than just obviously, developing the next best algorithm.
DAVE COLE
100%. You can build a fantastic model, but if it's not actually changing the way your business operates, if it's not actually tied into some downstream system and changing a decision from "my gut says we should go right" to "actually we should go left," or whatever it is, then it's not doing anything. So these insights have to change minds, they have to inform the future direction of a business unit, or they have to improve a process in some way, shape, or form for them to really drive value. And every DSL out there should be thinking along those lines before they just embark upon the next research experiment playing around with some deep learning model or some cutting-edge new open source technology or what have you. So I couldn't agree more.
DAVE FRANKENFIELD
Yeah. And if you don't mind, I'd like to explore that idea for a second. We touched upon this notion of my team having a broad spectrum of skill sets, back to the notion of being full stack, more than analytics. One of the things that I always like to point out there is that I've been in a couple of positions where my manager wanted to really isolate just the data science piece and, in a way, move all the other functions out to other parts of the organization. And what I find really curious about that, in this notion of really being focused on delivering value and not just delivering data science, is that to a lot of businesses, the ability to provide business intelligence and do reporting leads into answering questions, which is what we would call analytics, or the way analytics is talked about outside of data science. You have to do those two things to, in a way, earn the right to do the sexy data science stuff.
And not only that, but that experience actually gives you the right level of domain expertise so that you can think more abstractly, or from an engineering mindset, about what you really need to build in that data science or ML/AI space. And this is a really important concept for healthcare. If taking a bunch of claims data and throwing an algorithm at it was going to give us a breakthrough in healthcare, we would have been there five years ago.
One of the biggest problems that we have is understanding the abstract nature of how health progresses, the fact that it's more of a lifestyle problem, and the fact that what we have on our end is actually transactional data. So just think about that gap, that divide you need to bridge in the data science domain, that isn't organically there in the way data scientists are trained and come out of schools today. You're literally looking at, let's say, a data model that is partially separated from the reality that you're trying to model. And if you can't bridge that, you're not going to create a viable, valuable solution.
You can imagine that one of the hottest topics in our space has been, and will be for some time, the notion of predicting disease or modeling disease progression. And if you don't understand the comorbidity entanglement of these diseases in somebody's persona, you might be trying to model the wrong output. If you're looking at something like COPD, or something like diabetes, particularly appendage issues, and you're modeling and trying to detect foot ulcers, that's probably too late. You probably need to be looking at other outcomes in that person's progression to actually create a space where your business can take action and have some influence on the way that that person's lifestyle and that person's health progresses.
It's a really abstract construct in healthcare, and the most successful data scientists not only embrace the notion of doing the pre-work, the BI, the analysis, but then they start to understand that divide that they really have to bridge. Whether that divide is bridged by tapping into the right level of clinical expertise, or, if we're talking about middle or back office operations or customer experience stuff, by finding the right experts, it's about putting in your time to learn the dynamics of that, and then asking yourself, how do I need to think about this data and how do I need to think about my modeling approach differently? Because now I have a better understanding of what this needs to get plugged into.
DAVE COLE
There's a lot to unpack there; I'll do my best. But you make a great point, specifically around starting with that descriptive analytics or BI, or even, to a certain extent in data science parlance, that exploratory data analysis. Really understanding that data helps give you the credibility to be able to build the model, showing your business counterparts that you understand the data, and then also having a conversation around it to make sure that you understand how the data is being collected, especially in healthcare. Because, like you mentioned with the comorbidity example, you might be just focused on, I don't know, cancer treatment or something very specific, and not realize that, in this case, I don't know, 30% of your patients also have heart disease or something like that. Not understanding the full breadth, you'll miss something in the sexy data science that you're doing.
Also, one of the big challenges in healthcare is that you don't get everything. The data that you're working with tends to be somewhat messy. You don't know what's going on in the members' real world. You don't know if they're eating well or if they're not eating well. All sorts of things could be factors. They could be environmental factors or a whole host of things and you might not have data on it, and it might be critical to better understanding the level of care that that individual may need. And we would wish, we would hope that everything that's going on is being collected in some way to better be able to treat and give them a great experience. But it's just the nature of the beast. It's hard. There's just so much that can go on for somebody who's going through a medical situation.
Anyway, a long-winded explanation but I think it makes a lot of sense. And the closer you can get as a data scientist and working and truly understanding how the data relates to reality, makes a lot of sense to me. So hey Dave, we've covered a whole host of topics. There are a lot of great insights here. I really appreciate you taking the time. This has been a fascinating conversation.
DAVE FRANKENFIELD
And I appreciate it. I feel like we could probably go on for another hour, even in the general three, four topic areas that you laid out.
DAVE COLE
Absolutely. We had a couple of episodes in here at least. So Dave, I really appreciate you taking the time. Thank you very much for joining the Data Science Leaders podcast.
DAVE FRANKENFIELD
And just in closing, thanks for having me. Of course, I'm very easy to find on LinkedIn and other professional platforms. And if there's anything that piques people's interest here, I'm more than open to having direct conversations about my experience, or if people just want to chew on some of the concepts that we talked about today, I'd love to be able to do that.
DAVE COLE
Fantastic. I appreciate you offering that up Dave. Signing off here. Thanks so much for joining the Data Science Leaders podcast, Dave.
About the show
Data Science Leaders is a podcast for data science teams that are pushing the limits of what machine learning models can do at the world’s most impactful companies.
In each episode, host Dave Cole interviews a leader in data science. We'll discuss how to build and enable data science teams, create scalable processes, collaborate cross-functionally, communicate with business stakeholders, and more.
Our conversations will be full of real stories, breakthrough strategies, and critical insights—all data points to build your own model for enterprise data science success.