We’re in the middle of the fourth industrial revolution. Industry 4.0 encompasses the use of advanced automation and analytics in manufacturing. So how is data science driving value in Industry 4.0?
Welcome to another episode of the Data Science Leaders podcast! I’m your host, Dave Cole. Today we have Paul Turner. Paul Turner is the VP of Industry 4.0 Applications for Stanley Black & Decker. Paul, you began your career coming out of college, you got your PhD in manufacturing and analytics from the University of Newcastle, and you spent almost the entirety of your career in manufacturing and analytics. Do I have that right? Anything you want to fill in?
That’s correct. Always based around manufacturing, but I’ve emphasized different areas, including data science. I was a control engineer to start with, but yeah, that’s a good summary.
Great! In today’s episode, what we’re going to be focusing on is data science in manufacturing. We’re going to be talking about how data science is being applied in manufacturing. I’m going to learn a lot today. I hope you all do too. I think we’re also going to touch on the importance of change management, because that matters so much in the world of manufacturing. And we’ll talk a bit about the various industrial revolutions that have occurred in manufacturing and what Industry 4.0 is.
But I can imagine the challenges in how data science is being applied: getting folks on the factory floor to embrace it, to use it, to understand how to use the results of the models that you and your team are producing. There have to be some challenges there that you face, so we’ll talk a bit about that as well.
Great. First of all, what on earth is Industry 4.0? Enlighten us.
Well, it’s the fourth industrial revolution and if we go back 200 years, just a little bit older than Stanley Black & Decker itself, we started off in the first industrial revolution that most people are aware of, which was steam and water powered manufacturing. And then the second revolution was mass production around electricity and driving that.
The third industrial revolution that people may not realize was a revolution is the electronics age and computing, and how that revolutionized the way we started handling data and executing within manufacturing. And the fourth industrial revolution, which is this latest one, is really where advanced automation and analytics come in, and it’s about digital transformation. It’s not just about making things digital.
It’s actually about changing the way that people work by leveraging things like artificial intelligence, machine learning, and a whole host of other advanced technologies that are making how we manufacture different.
And I imagine, does IoT, Internet of Things, play into Industry 4.0 as well?
Yes. For a while, and probably still in places, we called it IIoT within manufacturing: Industrial IoT. That’s what we call the foundational layer from a technology perspective. There are two elements to the foundation for any Industry 4.0 program. One is just getting the data. Data at the moment sits in many different sources and you need to bring it together, and IoT is part of the solution for that, sending the data in from the plant so that you can measure things.
And then there’s also a foundational layer around what we call operational excellence: making sure that the basic way you operate your manufacturing plant is already lean and efficient. Because if you apply Industry 4.0 to a process that’s slightly chaotic and maybe not optimized from a lean perspective, then you’re just automating chaos. You get the foundations right, and then that’s where you really deliver value as you scale up the pyramid of Industry 4.0 applications.
Got it. If the third industrial revolution was adding electronics and computers to the manufacturing floor, and the assembly line was Industry 2.0, so to speak, when electricity came in, then it seems like 4.0 is all about optimization: using algorithms and models to make the manufacturing line self-healing, in a way, or increasing yields through the use of the data coming out of the various machines on the floor. I don’t know, is that a decent interpretation of Industry 4.0?
Well, it’s certainly a large portion of it, and when we’re looking at north stars, we often talk about lights-out manufacturing, where it’s all automated. However, I see that as being quite a way off. What Industry 4.0 really represents is this cyber-physical relationship, this relationship between technology and people. That’s what digital transformation is, and that’s why the change management you mentioned before is so important.
And from my perspective, people shouldn’t fear Industry 4.0. It’s not about taking away people’s jobs and replacing them with robots. It’s about training, it’s about making jobs become careers. It’s about improving and empowering people within the factories to have more interesting careers, because they’re moving from a manual way of working to a digital way of working.
Right. There should be this great relationship between the data, the analytics coming from that data, and the folks on the manufacturing floor, to make more intelligent decisions, instead of just a finger in the wind, “this is what I think I should do in this circumstance.” Speaking of which, can you maybe take us through some of the problems you’re solving today in this Industry 4.0 mindset? If I’m somebody on the factory floor, give me an example of how analytics and data science is helping me.
There are many, many examples of use cases of what you could call Industry 4.0, both within Stanley Black & Decker and in other industries. I could go on about individual use cases, which I will do, but what I’d like to say first is that what a lot of companies are facing is pilot purgatory, where they’ve got lots and lots of different pilots, but it’s not scaling. And so, we needed a way to manage how we approach the different Industry 4.0 use cases…
At Stanley Black & Decker, what we’ve done is we’ve developed a pyramid approach to how we scale the value. I’ve talked about the foundational layer, which is basically getting the data and getting it in the right format so that you don’t have to go to 20 different databases. The next layer is a thin layer of self-service analytics, where we empower people in the plant to actually look at the data and use simple tools to look at relationships.
Then what we’ve done is we’ve taken the different use cases we’ve developed. When I first joined Stanley Black & Decker, we probably had 50 or 60; we’ve got a lot more now. And in order for people to get their heads around them, we placed them into specific value-driving pillars. We have a labor performance pillar, we have a quality performance pillar, we have a safety performance pillar.
There are about six or seven pillars, and different use cases go into each one. As an example, we could talk about video analytics around quality, which heavily uses artificial intelligence to identify defects automatically, rather than inspection teams going in and manually looking. That obviously fits under the quality pillar, and what this structure does is focus people’s attention on the value drivers, not the tech. Because one of the challenges of Industry 4.0 is that there are really three flavors of it.
One is the tech-driven approach, where it’s all about the tech. I went to a major Industry 4.0 conference, I won’t name them, but they’re probably all similar, where it was all about the tech and value wasn’t mentioned at all.
By value do you mean being able to compute ROI?
Really focusing on, hey, if you have a model that’s able to detect defects in the product you’re producing, and you’ve trained your staff to look at the output and react to it, then over time you’ll be able to lower the number of defects, and the cost savings will be X. Things like that.
But it’s also about managing to strategically align the investments you’re making in things like AI and machine learning with the value. Because if you ask someone, “What’s your strategy for AI within manufacturing?” they may talk about the cognitive computing platform and the architecture, and you could talk for hours and hours about that. But then the question is, once you’ve got that, how does it relate to value?
What are you doing to improve labor efficiency within the factory? What are you doing to improve quality in scrap? What are you doing to improve material flow through the factory? You’ve got a really nice tool here, there’s some amazing innovations that have been done around AI, but that’s great and I want to build a model, but I need to relate it specifically to value.
I think I teased you in our previous discussions with some examples of why getting the balance between the tech, the value, and the domain expertise in manufacturing knowledge is so important to drive success with data science in manufacturing.
We will get there. You’ve mentioned there are three approaches companies are taking to Industry 4.0. The first was the tech-driven approach, without a lot of focus on the value. What are the other two? Then we’ll get to the examples.
The second is the IT-led approach, where we’ve seen a pure IT initiative, and it’s all about platforms and architecture and cloud and IoT, and that’s great. However, that’s just enabling, okay? The challenge with a purely IT-led approach is that it struggles to see the value. The challenge with a tech-driven approach, of course, is that if you’ve got a hammer, everything looks like a nail. You’ll use technology where there are probably simpler solutions that might be more appropriate.
And then the third flavor, which sounds, I guess, the most pragmatic, is the lean-led approach, where it’s all driven by operations and manufacturing shop floor needs and wants. That’s great because you’re certainly focused on the value, but I see two challenges with that approach as well. One is that it struggles to scale. You’ve got lots of local innovations, but it isn’t looking at things from an enterprise view, and so it’s difficult to scale.
And then secondly, of course, if you’re purely focused on the day-to-day needs of the plant, then you may not know what you don’t know. And you may miss that iPhone opportunity.
So, you’re more reactive, right?
More reactive. Yeah.
If you’re just reacting to the needs of the floor, you’re not taking a step back and seeing the forest for the trees, so to speak.
Ideally, strategically, what you need is a balance of all three of those. And that’s what we’re trying to do within Stanley Black & Decker: have this one-team approach that involves IT and OT, which is the tech side of things, plus the operations and lean manufacturing side.
OT being operations?
Oh yeah, jargon. IT obviously is information technology, and OT is operational technology. So things like MES systems (manufacturing execution systems), automation, PLCs, etc.
Right. Got you. Okay. The right approach is the blend of the three, so maybe talk to us a bit about some of those use cases and how you see the blend of those three showing up in some of the use cases that you’ve solved.
There’s another three-legged stool I need to talk about before going into the specific use cases. The first one was about the parts of the organization and the different flavors of Industry 4.0: provided you have a team where people from the different parts are working together as one, and you’ve managed to eliminate the barriers and the silos, then you’re a team that’s really empowered to go and deliver value.
The second three-legged stool is the balance between data science itself, domain expertise, and a focus on the value that the solution will deliver. And the reason why this balance is so important is because manufacturing data science is unique and it’s different. It has many nuances that you may not see in other fields of data science, and that makes it exciting.
How is it unique?
I can give a good example that is counter-intuitive. In manufacturing, I have a use case where a model that is 5% accurate is actually more valuable to operations than a model that is 95% accurate. Now, to a data scientist, that might seem like anathema. How on earth can a model that’s only 5% accurate be in any way better than a model that’s 95% accurate?
Well, to answer that question, you need the domain expertise. In this example, it’s a printing system with inks feeding it, and they have some quality issues, maybe around an 80% yield. This isn’t a Stanley Black & Decker problem; I’m just talking about previous experience. Of that 20% of quality problems, the bulk, about 95%, is associated with single-sensor issues.
Whether it’s a flow issue where one of the inkjets is getting clogged, or a low level in one of the inkjets, it causes a quality issue. With that kind of quality issue, you can usually still sell the product, maybe at a 10% discount. The key thing is that the operators are already aware of the issue, because there’s an alarm in the control system that will go off if there’s a flow problem or a level problem.
So you already get the alarm, and there’s nothing operators dislike more, maybe that’s an exaggeration, but there’s nothing they dislike more than getting multiple alarms for the same thing. There are too many as it is. Now, what we need is a model that deliberately fails to predict those 95% of quality issues that the operator already knows about, but catches the 5% they don’t know about. Those are the ones where the product actually has to be scrapped, and they’re the costly ones.
We want a model that can predict that 5%, which has the true value, without annoying the operator with the trivial issues they already know about. So you can see how you couldn’t just put data in front of an AI model and train it without that context. Getting onto the plant floor and living and breathing with the operators, the people who are actually going to use the predictions, is really important.
Right. To put it in my words: if you have a model that is 95% accurate, but it’s predicting defects that are not the end of the world, a slight problem, but the product is still sellable. Or it’s identifying things that the folks on the manufacturing floor can already see with their own two eyes, or already have alarms or alerts for from the machine itself, without any analytics, so they can easily prevent or deal with them. Then it’s not that huge of a value add.
However, if you have a model that is identifying, say it’s only 5% accurate, but 5% of the time it’s identifying defects that a human cannot see with their own two eyes, but also it’s identifying defects that are so bad that you can’t actually sell it anyway, you just need to scrap the whole product, then that can be incredibly valuable.
And then obviously you want to increase that 5% as best you can, because I’m worried about the 95% that it’s not identifying. But anyway, by and large, that provides tremendous value to the folks on the floor, because they’re like, “I had no idea this defect existed, but this model is helping me.” Right?
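The trade-off described here can be sketched in a few lines of Python. All of the numbers below (the defect mix, discount loss, and scrap cost) are hypothetical assumptions for illustration, not figures from the episode:

```python
# Hypothetical sketch: why a model catching only the rare 5% of defects can be
# worth more than one catching the common 95%. All dollar figures are invented.

known_share = 0.95      # share of defects already flagged by existing alarms
unknown_share = 0.05    # share of defects operators can't see (product scrapped)
discount_loss = 10.0    # assumed $ lost per known defect (sold at a discount)
scrap_loss = 500.0      # assumed $ lost per unknown defect (whole unit scrapped)

def value_per_defect(catches_known, catches_unknown):
    """Expected $ saved per defect by a model, ignoring alarm fatigue."""
    return (known_share * discount_loss * catches_known
            + unknown_share * scrap_loss * catches_unknown)

# Model A predicts the 95% of issues operators already know about.
# Model B predicts only the 5% of costly, invisible scrap events.
value_a = value_per_defect(catches_known=1.0, catches_unknown=0.0)
value_b = value_per_defect(catches_known=0.0, catches_unknown=1.0)

print(f"Model A (covers 95% of defects): ${value_a:.2f} saved per defect")
print(f"Model B (covers 5% of defects):  ${value_b:.2f} saved per defect")
```

Even this overstates Model A: since the control-system alarms already cover those defects, its incremental value is closer to zero, while Model B surfaces losses nobody was catching.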
And that’s where the value calculation comes in as well. What is the utility of this model and how valuable is it? Although that specific use case now maybe seems fairly obvious, there’s an important point in that and it’s around that word accuracy. Because a lot of data science research is about improving model accuracy and a lot of companies, startups and bigger companies that are much more mature, when they’re trying to present the AI and machine learning and analytics technology, they tend to focus on how accurate it is.
And actually, it’s further down my priority list when I’m looking at how useful a particular tool or analytic will be, because I’ve got another example where we’re again trying to predict quality, we have a model… Actually, before I say that: in manufacturing, quality issues in most cases tend to be quite rare events. It’s not something that happens every day on a lot of processes. And when something’s a rare event, a model that just predicts the quality will always be good is actually quite an accurate model. Okay?
If I’ve got a 98% yield and a 2% product quality issue, and I’ve got a model that says everything’s going to be great all the time, it’s 98% accurate, but its utility is zero. Now, we have other statistics for focusing on how well a model predicts the thing it’s supposed to predict, but even those aren’t infallible. Because, and this is the financial leg of the stool, there is a value in predicting something correctly and being able to act on it, but there’s also a cost of a false alarm, and that relationship isn’t always linear. I’ve had projects where the optimal model we used was actually a less accurate one, because of that balance between the value of a correct prediction and the cost of a false alarm: being sent on a wild goose chase, annoying operators, or assuming something is faulty, going in to figure out whether it really is, and finding out it wasn’t. There’s a cost there.
That balance and that calculation… On one of the projects, we actually did a very detailed calculation to figure out what the optimal model was. It wasn’t the highest on any of the metrics you’d normally use for model accuracy. Again, manufacturing nuances; you need those three legs of the stool: the data science, the value, and the domain expertise.
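The accuracy-versus-utility point can be made concrete with a small sketch. The yield, the cost of a false alarm, the value of a catch, and the three detectors’ statistics are all invented numbers, not the actual project calculation:

```python
# Hypothetical sketch: choosing a model by expected dollar value rather than
# accuracy, on an imbalanced quality problem. All numbers are assumptions.

defect_rate = 0.02          # 2% of items have a quality issue (98% yield)
catch_value = 500.0         # assumed $ saved when a real issue is caught in time
false_alarm_cost = 15.0     # assumed $ cost of a wild goose chase per false alarm

def evaluate(recall, false_alarm_rate, n=10_000):
    """Return (accuracy, expected $ value) for a detector over n items."""
    defects = n * defect_rate
    good = n - defects
    tp = defects * recall               # real issues caught
    fp = good * false_alarm_rate        # good items wrongly flagged
    tn = good - fp                      # good items correctly passed
    accuracy = (tp + tn) / n
    value = tp * catch_value - fp * false_alarm_cost
    return accuracy, value

# "Always fine" model: never alarms. High accuracy, zero utility.
acc_naive, val_naive = evaluate(recall=0.0, false_alarm_rate=0.0)
# Sensitive model: catches most issues but alarms often.
acc_sens, val_sens = evaluate(recall=0.9, false_alarm_rate=0.10)
# Conservative model: catches fewer issues, rarely cries wolf.
acc_cons, val_cons = evaluate(recall=0.6, false_alarm_rate=0.01)

for name, acc, val in [("naive", acc_naive, val_naive),
                       ("sensitive", acc_sens, val_sens),
                       ("conservative", acc_cons, val_cons)]:
    print(f"{name:12s} accuracy={acc:.4f}  expected value=${val:,.0f}")
```

With these assumed costs, the naive model is 98% accurate but worth nothing, and the least accurate model (the sensitive one, at 90%) delivers the most dollars, which is the kind of inversion the detailed value calculation can surface.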
I think what makes manufacturing analytics and data science… on one hand, I’m thinking to myself, this is somewhat unique because it’s this meld of humans and the output of the models and analytics, and you need a great relationship between the two, which seems to be the underpinning of Industry 4.0. But on the other hand, it’s not just for the industrial world.
There are plenty of use cases in the non-manufacturing world where the outputs of a model need to be interpreted, need to be used, need to change behaviors. If you’re a bank teller talking to someone and trying to pitch them on some new product, you’re trying to pick which one to offer. That’s a model output: here’s the offer you should be presenting to the customer. There’s some change management there too. It is somewhat transferable.
Is there a specific use case or challenge that you’ve solved in your career where you felt like that data science plus the humans interpreting it worked really well?
Yeah. That’s one of the big challenges and as you said, it’s not just manufacturing, it’s a challenge more generally. But specifically in manufacturing, factories are busy places. The whole concept of lean manufacturing is getting things done as efficiently as possible, which tends to mean that people on the shop floor don’t have a huge amount of spare time. And so, they don’t necessarily have time to sit down and analyze in order to act. They may need that done for them.
That’s why we talk about predictive analytics and insights, but actually, the thing that drives value is the action. If nobody takes any action, it doesn’t matter how insightful the model is; there is no value. And we have situations where we have sensors monitoring quality, and when they detect that the quality is actually off and the machine is producing scrap, there isn’t an immediate response from operations, because they’re busy managing and doing other things.
So, having predictive analytics to predict scrap is great, but if they’re not even acting when the machine is producing scrap, are they going to act when the machine is not yet producing scrap but a model is telling them it will? Chances are they won’t. However, as you say, we need to move to a state where operations is able to act, and act efficiently. And so, what we’re building within Stanley Black & Decker is a platform to do just that.
Actually, one of the things we’re trying to do is use data scientists to eliminate the need for a data scientist in the process of predictive analytics. We’re trying to automate as much as we can, right down to the actual prescribed action that operations needs to take. And once we’ve got that prescribed action, we manage the workflow to ensure the action is taken. We think of opportunities within manufacturing as leakages.
If there’s an inefficiency in a plant, we call it a leakage, because we can put a dollar value on it. When people can see the dollar value, it’s literally like dollars leaking away, and this can accumulate; as it accumulates, it becomes more important. By providing insights and leaderboards around leakage rates, providing automated workflow around escalations, and giving people metrics around opportunity handling, we empower operations teams to prioritize more intelligently.
And to work on the areas the analytics has figured out are the high-priority ones, because they’ve got the maximum leakage rates and the biggest opportunity. We’re trying to combine AI and analytics with workflow automation, and then obviously the people in the loop, to incentivize this process. We’re getting pretty far with that, and we have some great examples where we’ve automated right up to the point of prescribing action. But if we can’t prescribe an action, the action is to go and do what we call a kaizen event…
What is that?
A kaizen event is a continuous improvement process that teams go through to troubleshoot root-cause issues on the plant. We provide the teams with all the data they need for that kaizen process. We’ve been very successful with that, but what we’re trying to do now is go to the next level and automate the kaizen as well, to get to action as quickly as possible. Because the driver for us as an Industry 4.0 team is: how do you get from data to action to value as quickly and as efficiently as possible? And technology is something that can help us do that.
There’s a lot of good stuff in there. It sounds like what you’re working towards is: you have a data science team, and they’ve built models, to simplify it, to predict, say, a defect or an issue. Obviously you want to do your best to increase the accuracy for predicting events that can’t be seen by the trained eye but are helpful and additive to the folks on the floor. But then not only that, right?
You also want to be able to prescribe a specific action to take. And then I imagine too, what’s important is tracking whether or not the operator actually followed through with that action or whatnot, and maybe they didn’t. You said it’s a busy place. Maybe it’s too busy, they didn’t get to it. They missed the alert, they missed it. Or maybe they’re stubborn. Maybe they’re like, “This thing’s always wrong. It’s always telling me the wrong thing to do.”
Is that ever a challenge, actually getting folks, the operators to not only see the data, even see the prescribed action, but actually to take the prescribed action as well? Is there a trust that needs to be built up over time?
Yes. However, we’re trying to build a system that will make it a lot easier to drive the action. Think of it as you’ve got a pipe in your house and it’s leaking. You can say, “Yeah, I fixed it.” However, there was an algorithm that detected that leak. That algorithm knows that it’s leaking and it can check to see whether it has been fixed. Okay? So you get to the part of the workflow that says, “Yeah, we found the root cause, we’ve taken action. This is what we’ve done, and we’ve now fixed it.”
Then you go to a verification phase where the algorithms can check whether it truly has resolved the situation. If it has, of course, you can close that event, and you can then give people some sort of recognition for the fact that the opportunity has been resolved and how much it saved.
Right, and then attaching the value to it. You were able to fix this leaky pipe. That reminds me of a story about my parents. They went away during the winter and actually had a pipe burst. It was very cold, the pipe froze, and it burst and caused all sorts of water damage to their house. If only they’d had something like an algorithm predicting that this could happen, and acted upon it, they would have saved gobs of money, or the insurance company would have, anyway. It hits close to home there.
The leak is a metaphor for any opportunity. A leak could be the fact that, for some reason, an assembly line is not producing parts at the speed the standard says it should, or that another line producing a similar product has managed to set the bar higher, so there’s a gap. We call that a leak because, is there any reason why we can’t learn from that other line and drive efficiency that way? And if we can convert leakage into actual tangible financial dollars, it’s got to be meaningful. It can’t be hypothetical. That’s what we hope will be a very strong tool in the change management of how operations responds to analytics.
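Converting a leak into dollars can be as simple as the assembly-line gap described here. The rates, hours, and margin in this sketch are hypothetical placeholders, not Stanley Black & Decker figures:

```python
# Minimal hypothetical sketch of converting a production-rate "leak" into dollars.

standard_rate = 120       # parts/hour the standard says the line should produce
actual_rate = 105         # parts/hour the line actually produced
hours = 8 * 5             # one week of single-shift production (assumed)
margin_per_part = 3.50    # assumed $ contribution margin per part

lost_parts = (standard_rate - actual_rate) * hours
leakage_dollars = lost_parts * margin_per_part
print(f"Leakage this week: {lost_parts} parts, ${leakage_dollars:,.2f}")
```

Putting the gap in dollars rather than parts per hour is what makes the leaderboard meaningful to operations: the same calculation applies whether the baseline is the engineering standard or a better-performing sister line.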
I think this is fascinating, because what data scientists often tend to focus on, and I think you’re exactly right, is this great model they built, and its accuracy. If your model is not actually changing downstream behavior in a meaningful way, it’s worthless, quite frankly. It’s giving you lots of great predictions, but if nothing is being done about them, then it’s really not moving the needle, so to speak.
In some ways, I think manufacturing is ahead of the game, because the focus has to be on the value and on the change that goes on on the factory floor, and so much of this is happening in real time. So much of this requires adoption, requires folks to embrace it. Are there any tips, across industries, that you would recommend for that change management problem? If you can get operators to embrace data science, then hopefully we can get everyone to embrace data science.
Yeah. I’ve seen example technologies that are nudging behavior. I’d probably prefer to not comment on that because there’s some ethical aspects of that to consider.
Because marketing companies are always looking at a way that they can change people’s behavior in a direct way.
Yeah. Get me to click on some ad and buy some product. Yeah, sure.
Yeah. My focus is on people in manufacturing being able to upskill, have more interesting jobs, and see that they’re not just factory workers. They’re now empowered to act and to have an impact, because we’re providing them with the data to support decision-making in real time.
And I imagine too, you embrace their curiosity. You could say like, “Look, if you want to learn more about how this is happening behind the scenes, what data we’re using in order to increase the velocity of this part that we’re making…” That curiosity, I’m sure you feed into that. I’m sure you’re like, “Absolutely. We’d love for you to learn more.” Because the more you learn about it, probably the better feedback you get from your folks on the manufacturing floor.
And maybe there’s a feature that you should be using in your model that they just intuitively know should be part of your model and you’re not. That sort of thing, that feedback can be invaluable, that domain—and literally on the floor—expertise, right?
Yeah. It can clarify the direction. So, if you get a data scientist who may have been the best data scientist in another industry, there’s still a big learning curve when they come into manufacturing. You can’t just drop people in and expect them to know all these nuances. And similarly, people in the factory who understand the process, they are very, very strongly positioned to be ambassadors for data science and to be able to provide that critical leg of the stool in order to make sure that data science delivers value.
And when I’m actually looking for data scientists, I specifically look for that manufacturing experience. Because I started off as a control engineer. I very quickly got into AI and machine learning, but I often look back at the mechanical, electrical, and process engineers who were dealing with data all the time, and I still wonder why more engineers don’t get into data science. I know a few are now, but it’s still…
Not only are the career options much wider if you develop those skills, they’re essential skills in manufacturing, for connecting the dots between domain expertise and data science. It’s a very, very powerful combination. What we tend to do is have teams: a data scientist, a domain expert, and then somebody making sure all the financials tie up. However, if one person can cover two, or even possibly three, of those roles, that’s very powerful.
Absolutely. We talk a lot on the Data Science Leaders podcast about being able to work collaboratively with the business side and I think the domain experts and the business side to me are synonymous in a lot of ways. The more each side of the aisle, so to speak, understands the other side of the aisle, I think the better off everyone is. Speaking of which, switching gears a little bit, just focusing on you, Paul, I know in some of our conversations leading up to this podcast, one of the things that I found pretty interesting is just your role today and how it’s evolved.
You started off as a control engineer, and then you moved into the world of machine learning and data science and analytics, and now my understanding is you’ve moved a bit more into a strategy role as well. And I think one thing we don’t talk about on the Data Science Leaders podcast is, is there something beyond being a data science leader career-wise that you potentially can morph into? Maybe talk a little bit about what your role is today.
Absolutely, there are career options beyond data science. The title of Vice President for Industry 4.0 Apps and Analytics means that I lead a team within the Industry 4.0 program, and the Industry 4.0 program covers the foundational layer that I talked about, connected factory, the apps and analytics that I cover, and then automation, so our robots and cobots. Under that apps and analytics layer, I have several teams.
I have a team of data scientists and data engineers. I have an architecture team building the cloud-based manufacturing analytics platform. And I have a product/development team. My role is no longer that tactical role of just doing plant-focused data science.
I’m now looking at enterprise scale, and that’s a challenge in itself because Stanley Black & Decker, for example, has 130+ factories. And factories tend not to be cookie-cutter copies of each other. They tend to be their own islands, so one-size-fits-all doesn’t really work. However, for enterprise scale, we need standardization in order for it to scale. And so, there’s always this tension between providing a standard approach that can scale, and being able to respond to the individual requirements of each individual plant.
Similarly, there’s a tension between the tactical and the longer-term strategic. One of my roles is to tie all this together: to build a system that is standardized and enterprise-grade for analytics within manufacturing, but that doesn’t leave a plant feeling alienated because they’ve got a solution that doesn’t fit their nuances. That means building in flexibility and allowing local innovation, but then having a process and a pathway for taking successful local innovation into a system where it can be supported and deployed at enterprise scale.
That’s what my day-to-day role is focused on. You can see the layers that are built up to that.
Yeah, absolutely. I think there’s a whole other podcast with you, Paul, just talking specifically about the nuances of working with manufacturing plants that may, for whatever reason, not be great fits for the enterprise platform—maybe they have antiquated equipment, or maybe they just have their own process, or maybe they were a company that was bought by Stanley Black & Decker and just have different equipment. It could be a whole host of reasons why they may not be great fits for the platform that the enterprise has built, but they still have needs and they still need to be addressed.
And they can’t just switch over and use the platform overnight, and I think having somebody like yourself who understands that, gets that, and is able to accommodate their needs and think outside the box, so to speak, I think makes a whole heck of a lot of sense. And you also mentioned a development team. Is it fair to say that your role today doesn’t always involve analytic use cases? Are you also building products for the wider factories as well? Is that how you expanded beyond the traditional data science leader roles and responsibilities?
Yes, that’s why it’s apps and analytics.
It’s development. And actually, for part of my career, I worked with a company that built software products for manufacturing, and I’ve done quite a bit of software development and product management as well, which is one of the reasons I ended up in this spot: the data science expertise, plus the software side as well. But they’re not independent. A lot of the applications we build have analytics built into them. We have these performance applications that I mentioned earlier around labor, quality, etc.
If you just look at the asset performance application, that’s things like predictive maintenance, energy management, and anomaly detection—the analytics are built in. These teams work very closely together, and actually, the model we use now is more a multi-discipline team focused on building solutions. So instead of having a development team and an analytics team, we have strong teams that include people from the architecture group, the development group, and the analytics group, working as an agile team.
Yeah. I love that. Absolutely love that. The focus on the use case, the focus on that app mindset and getting them all under the same umbrella working together as a scrum team, I think just is great. Paul, I think I could probably talk to you for another hour. This has just been absolutely eye-opening for me just learning about what is Industry 4.0, how the intersection between data science and the change management and how people use data science to actually take action, the process of tracking that in the workflow, and then how your career has evolved over time.
I think it’s all just been very fascinating, and there are a lot of great lessons learned here. Well, you said before it’d be great to have manufacturing experience as a data scientist, but I imagine you’re also interested in data scientists who are drawn to the challenging problems around manufacturing, right? From what I’ve heard, putting data science into practice, seeing the actions, seeing measurable results, and driving the value—that to me has just got to be so rewarding as a data scientist. I’d certainly be interested in working on your team from my standpoint. That’s great.
Yeah, it’s an incredibly exciting environment to be working in because you see the fruits of your labor very, very quickly and you have great teams to work with. And getting out there into the actual operating environment, it makes it very real and very tangible.
Yeah. Getting the ground truth and being able to measure ROI is a big challenge for other industries, but in your world, it’s very real and you can get to it relatively quickly, and that is just very, very satisfying as a data scientist. We’ll leave it there, Paul. This has been fascinating. I’ve really enjoyed the time with you. I wish you all the best, and as always, folks out there, if you want to learn more, can they reach out to you over LinkedIn and ping you to learn more from you, Paul?
Yeah. I’m on LinkedIn. They just look for my name and Stanley Black & Decker. I actually was an early LinkedIn member, so I’ve actually got /turnerp as my LinkedIn.
That’s impressive. That’s very impressive. Well, Paul, it’s been great. Thank you very much for being on the Data Science Leaders podcast and have a great rest of your week!