MVP over POC: best practices for launching an AI initiative

Tad Slaff | Suitsupply | Product Owner Data & Analytics
We dive into best practices for launching successful AI initiatives, from choosing a use case and getting buy-in from the business to designing the technical architecture and scaling.

Nathalie Post  
On today's episode, I'm joined by Tad Slaff, who is Product Owner Management Information at Suitsupply. We'll dive into best practices for establishing AI initiatives, really focusing on how to identify use case opportunities, get buy-in from the business, and design the technical architecture. But before we do so, Tad, could you give a little bit of an introduction about yourself and your background?

Tad Slaff  
Sure. So my name is Tad Slaff. I'm currently the Product Owner for Management Information at Suitsupply. Effectively, the scope of my role is that I own all of the data and the reporting, and I either own the analytics or heavily support the analytics teams doing the work. And from a product perspective, my product is the data platform, which is all built out in the cloud; we're using Azure for that. I got my start in the data, analytics and AI space back when I was in university. I happened to read a book about how algorithms are changing the world; this was maybe 2008, 2009, 2010-ish. At that stage, my primary interest was in the algorithmic and automated trading space, and my studies took a little bit of a backseat while I focused on teaching myself how to program and on building automated strategies in commodities, foreign exchange, futures, and a little bit of options. That naturally led me to exploring data analytics and data science, a little bit before they were the buzzwords they are today. But that really kicked off an interest that I still have. I'm obviously in a different domain than when I started, but a lot of the same underlying principles and techniques apply and are transferable. So yeah, it's strange looking back that I've already been in the space for about ten years now. But it's really been great to see the amount of adoption and the amount of interest that's grown in the last couple of years.

Nathalie Post  
Yeah. So professionally, I know you made a few steps. What did you do exactly? Can you explain a bit more about that?

Tad Slaff  
Sure. So when I left university, I was able to leverage my interest in the quantitative finance space into a job at a brokerage firm. This was a really interesting role that I credit with a lot of my experience; summarised, it boils down to me being a quant for hire. A lot of the value proposition for this particular brokerage, a smaller, more niche firm, was that if you used them as your execution partner, they would provide myself and my colleagues to go and sit down next to whoever the client would be and help with whatever sort of quantitative analysis, automation, or risk analysis the client needed. The typical clients were professional traders, smaller hedge funds, commodity traders. So it was a really great experience, especially being fairly new in my career: travelling around, seeing everyone who was very experienced in their particular domain. You know, these guys were at funds ranging from hundreds of thousands of dollars on the low end up to tens of millions, and I was able to see that, yeah, they had a lot of the domain expertise, they really knew what they were talking about, but when it came to the more quantitative side, they generally didn't know too much. So a lot of my time was spent trying to translate their knowledge into some sort of automated system; as I mentioned, whether that was automating a strategy, doing some risk analysis, creating portfolios, that type of work. And born out of that, one of the consistent themes I kept seeing was that there was an appetite to leverage these types of techniques. I think particularly in finance, the writing was on the wall pretty early that this was the way the world was moving, and a lot of these guys, even though they didn't have a technical background, didn't want to be left behind. So I kept seeing a recurring need, really a pattern, of them wanting to leverage these tools themselves without having to rely on someone like me, who they could only use on a limited basis, or would have to pay for, or whatever it may be. So born out of that, I left and started a financial technology company with the aim of developing a platform that would allow these professional traders and investors to leverage machine learning and analytics themselves, without needing any technical experience.

Nathalie Post  
And what year was that exactly?

Tad Slaff  
That's a good question. I think that was around... it must have been 2015. 2015, 2014... actually, no. I think it was early 2014, now that I think about it, yeah.

Nathalie Post  
So kind of before it all boomed around machine learning.

Tad Slaff  
Exactly. So a lot of our time was spent on education, and really the theme and our value proposition was quantifying your intuition. How can we take that deep level of domain expertise that these traders and investors had, after usually spending decades working in the markets, and quantify it? A lot of the techniques we used were variations of association rule learning, with ensembles of algorithms on the back end, which would allow them to visualise the patterns the algorithms were able to find, and still use their own input and their own experience, to try to get the best of both worlds.
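
For readers who haven't met association rule learning, here is a minimal, self-contained sketch of the core idea: scoring candidate rules by support, confidence, and lift. The market-condition "transactions" and the thresholds are invented for illustration; the platform Tad describes would have gone well beyond this.

```python
from itertools import combinations

# Toy "transactions": each is the set of conditions observed on one trading day.
# These condition labels are invented for illustration.
days = [
    {"high_volume", "gap_up", "close_green"},
    {"high_volume", "gap_up", "close_red"},
    {"low_volume", "gap_up", "close_green"},
    {"high_volume", "gap_down", "close_red"},
    {"high_volume", "gap_up", "close_green"},
]

def support(itemset):
    """Fraction of days on which every condition in the itemset held."""
    return sum(itemset <= day for day in days) / len(days)

# Score every rule {antecedent} -> {consequent} over single conditions.
items = sorted(set().union(*days))
for a, c in combinations(items, 2):
    for ante, cons in ((frozenset([a]), frozenset([c])),
                       (frozenset([c]), frozenset([a]))):
        s_both, s_ante, s_cons = support(ante | cons), support(ante), support(cons)
        if s_ante == 0 or s_cons == 0:
            continue
        confidence = s_both / s_ante   # P(consequent | antecedent)
        lift = confidence / s_cons     # > 1 means antecedent makes consequent more likely
        if confidence >= 0.6 and lift > 1.0:
            print(f"{set(ante)} -> {set(cons)}: support={s_both:.2f}, "
                  f"confidence={confidence:.2f}, lift={lift:.2f}")
```

The appeal for "quantifying intuition" is that the output is a human-readable rule a trader can accept or reject, rather than an opaque score.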

Nathalie Post  
Yeah. And what caused the transition from doing that to the role afterwards, which was, if I remember correctly, at Accenture. Yeah.

Tad Slaff  
So, you know, running my own startup wasn't as financially rewarding as we would have hoped, but the experience was incredibly valuable. After a couple of years of doing that, we unfortunately had to wind it up, and then I joined Accenture's advanced analytics practice within their digital group. That was a really interesting role. It was kind of a big swing, from working with five or six of us at a small startup to being aligned to financial services, working with some of the largest banks and really seeing, and helping, them stand up their data analytics and data science teams. It was a really wide variety of roles: helping from a platform perspective with how you develop one of these big data platforms, when even a couple of years ago there were still security concerns and a lot of them were building in-house, so there were a lot of technical challenges in building those; really helping facilitate interactions between the business and the data science teams; and also building in-house applications that leveraged machine learning and AI to help internal teams operate more efficiently. I did that for a couple of years, and I really think consulting is a great experience; you learn a lot and work with a lot of smart people. But personally I was looking for a new challenge, and looking to move abroad, so I took a little bit of a risk, came to Amsterdam, and ended up in my current role at Suitsupply almost exactly one year ago.

Nathalie Post  
Oh yeah. I mean, your journey is quite impressive, especially when you look at how early on you were involved in everything. But also, from your experience within Accenture, you must have learned a lot from working with all these different companies: how they think about analytics initiatives and what route to take in establishing those. So do you remember your first real machine learning project within that role, and how it went?

Tad Slaff  
Yeah, so originally I joined to help set up a centre of excellence that was a partnership between the digital analytics practice and finance and risk: how can we leverage some of these more advanced analytics techniques for finance and risk use cases? And, you know, going into it, it sounds like, oh, we're going to be building all these amazing models, and I can't wait to get my hands on the data and work with all these smart people. But pretty early on we saw that by far the biggest need and the biggest challenge, even before we could start thinking about what offerings we could provide to the finance and risk clients, was that there were so many challenges and so much work to be done in the data management space. It wasn't an area I had a lot of experience in coming into the role, but once you really start talking to financial services companies, there are a lot of regulations around how you can use data: PII, data governance, data masking, how you access-control the data, who owns the data, not even to mention the typical data science issues around data quality, data freshness, and combining data from different sources. So a lot of the work I did initially was around what we called metadata management: how can we leverage automated tools, whether vendors in the market were developing them or we built them in-house, to help with data profiling, data governance, data security, and data cleansing, and have all those pieces fit together into a more holistic solution? I think that was a good crash course in the realities of data science and AI. It's all well and good if you're working in a sandbox, or on test data, or on some very limited-scale POC. But once you start looking into, alright, how do we actually get this into production, how are we going to really generate business value, there's a lot that goes into it that someone with a traditional data science background, by which I mean usually a PhD or master's in a highly technical field, wouldn't necessarily think about or have a lot of experience with. And the big gap we saw was that the people coming from the business side, or the people who did have a traditional background in data governance and data security, didn't necessarily understand the AI or machine learning end. So there was always this need to bridge the gap and make sure everyone spoke the same language, had the same goals, and knew where the project was going, in this, you know, technical and inherently complex space.

Nathalie Post  
I mean, it seems to be one of the biggest challenges that organisations face these days, bridging data science and business, and you have the emergence of all these new roles, such as analytics translators, or there's a bunch of other terminology for that. But how did you deal with that? Did you really fill that gap in your role, or?

Tad Slaff  
Yeah, so I think it requires having enough understanding of both sides to be able to at least translate and find the middle. Really, what makes people successful in those types of roles is getting alignment, getting buy-in, and understanding who the key stakeholders are. And that's nothing new in the world of business. Everyone likes to say AI and machine learning are so exciting and really changing the world, but a lot of it comes down to things like that. The added twist is that there needs to be an understanding of what the underlying processes actually are. When we say machine learning, what does that actually mean? When you say AI, what does that actually mean? Making sure everyone's aligned on those definitions, and I think they vary from organisation to organisation and context to context. And then really, what is the goal of the project? Those types of questions have, I think, been solved in other domains, or there's a lot of research around them, but from what I saw they tend to get thrown to the side a little bit when it comes to AI and machine learning. Everyone gets very excited, caught up in the hype a little bit, or sold on this vision of automated systems that can transform your business, without really thinking about the less fun side: data management and data governance, and now also model management and model governance. Yeah, it's the set of areas that people don't enjoy talking about quite as much as the latest deep learning framework or NLP library, or whatever it may be.

Nathalie Post  
Yes. So how do you establish that alignment? In your experience?

Tad Slaff  
Yeah, so in my experience it always has to be driven by the business. One of the challenges is that sometimes these initiatives are more tech-driven, in the sense that you'll have someone from a data science background, or who maybe has skilled up a data science team, and they're just so excited to run with what they've learned. They hear about all these amazing use cases on their favourite blog, or in some article or white paper, and they just want to run with it. So I think it always has to come back to: what's the business value? What's the business problem we're trying to solve? From that perspective, it becomes a lot clearer what the scope of the project is, how you're going to measure success, and who needs to be involved. And yeah, the underlying methods and the final output could be radically different, whether it's AI applications that are able to learn over time, or automating tasks that used to require a lot of manual effort, or providing much more valuable, deep insights. The business value there can really be huge, but at the end of the day it needs to be the business saying: alright, here are the problems we're facing, here are the opportunities we see. AI and machine learning are a means to an end; they aren't the end in and of themselves. I think that's fundamentally the way you have to think about it, where sometimes there was a lot of "oh, we're doing machine learning because it's the hot thing to do and we don't want to be left behind, and, you know, a chatbot would be great". But at the end of the day: what value is it generating? What problem is it solving? What opportunity is it seizing? Really, why are we doing this? That's really the core of it, and it seems easy, it seems trivial, but getting everyone on the same page with that is really the first step, and it sets the project on a good path. From there, it's making sure you have an understanding, conceptually at the very least, of what's happening. When we say machine learning, as I mentioned before, what does that actually mean? How is it going to be productionised? Who's going to own it? Organisationally, who works on it? Those types of questions are very important.

Nathalie Post  
Yeah. So when it comes to establishing use cases that are valuable for the business, how would you go about it? Or how would you advise organisations to go about this?

Tad Slaff  
Yeah, so I think it really comes down to looking at: is AI and machine learning the best way to solve this problem? Take an example like, oh, we want to build a recommender system for product recommendations, a very typical use case. But the goal isn't to build a recommender system, right? The goal is to drive sales; cross-selling is effectively what you're doing. So then it's thinking: alright, if our goal is cross-selling, yes, we have to be able to serve up the right products from the recommendation engine, but those also need to be presented to the customer. So then you need the front-end development team, and a way to visualise it on the page. Which product segments, which customer segments, which product lines do we want to do this in? Where do we think the greatest opportunity is? Come at it from the business problem, and the actual recommendation engine is just one step in the process. Yeah, it could be seen as the most novel and most interesting part, but at the end of the day, on its own it's not going to move the needle. It's the entire process: how are we identifying the customer? Are we able to make intelligent recommendations? Are we presenting them to the customer at the right time and in the right place? And then how are we measuring success and improving from there?
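
To make the "one step in the process" point concrete, here is a deliberately tiny co-occurrence recommender of the kind that could sit behind a cross-selling feature. The baskets, products, and ranking logic are invented for illustration and say nothing about how any real retailer implements this.

```python
from collections import Counter

# Toy purchase baskets: "customers who bought X also bought ...".
# Products and baskets are invented for illustration.
baskets = [
    {"suit", "shirt", "tie"},
    {"suit", "shirt"},
    {"shirt", "tie"},
    {"suit", "belt"},
    {"suit", "shirt", "belt"},
]

def recommend(product, baskets, k=2):
    """Rank other products by how often they co-occur with `product`."""
    co = Counter(
        other
        for basket in baskets if product in basket
        for other in basket if other != product
    )
    return [item for item, count in co.most_common(k)]

print(recommend("suit", baskets))  # e.g. ['shirt', 'belt']
```

Everything around this one function, serving it on the page, choosing which segments see it, and logging clicks, is the larger share of the project Tad is pointing at.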

Nathalie Post  
And so once you're at that stage, where you have identified a use case, how do you go about designing the technical architecture?

Tad Slaff  
Yeah. So I think even before you get to that, there are a couple of other things you really have to consider about whether AI or machine learning is the right tool for the job. There are really three things you need to consider. The first one is: is the problem domain totally constrained? What I mean by that: if we look at an example like self-driving cars, the problem is completely open-ended. You could have everything from construction to pedestrians to poor driving conditions, and that's why the space is really seeing difficulties, and why even the world's largest, most advanced companies struggle to do it. But if you constrain the problem to trucks driving on a freeway, or an even more classical example like chess or Go, where the rules don't change and the data doesn't change, that's how you set yourself up for success from the beginning. So when you're looking at whether this is the right problem for these types of techniques, the first questions to ask yourself are: do we know the rules of the game, so to speak? Are they going to change over time? Is it something we have data on? Can we automate the process of data collection? Those are all really important. And then: can we measure success? To add a term here, I call it "link and label": if you're not able to show the value generated from these techniques, it's really going to be a tough sell to get support from the business beyond a small-scale POC. If you look at something like, oh, we want to improve B2B sales, where the sales cycle might be six months, nine months, it's going to be really hard to justify this type of initiative if you can't show until six or nine months down the road that you're able to generate value from it. So when you're looking at these use cases, it's: alright, is the problem constrained? Do we have the data, and can we access it? And finally, can we measure success? If you have very clear answers to those three from the beginning, you really set yourself up for success, as opposed to something a lot more open-ended that maybe you'd still be better off solving with a dashboard, or, you know, hiring an extra analyst to look at it, or whatever it may be. And yeah, those may not be scalable, but in the short term, if the business problem is "we need to improve X", then that's going to be the best way to solve that particular problem.
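
As a back-of-the-envelope illustration of "can we measure success", here is a sketch of linking a model to a business number via a simple A/B split. All figures are invented, and a real evaluation would also need significance testing and guardrails.

```python
# Hypothetical "link and label" check: tie a model's output to a measurable
# business number. Assumes a simple A/B split; all figures are invented.

control = {"sessions": 10_000, "orders": 300, "revenue": 45_000.0}  # no recommendations
treated = {"sessions": 10_000, "orders": 360, "revenue": 56_400.0}  # with recommendations

def conversion(group):
    """Orders per session for one arm of the test."""
    return group["orders"] / group["sessions"]

uplift = conversion(treated) / conversion(control) - 1.0
incremental_revenue = treated["revenue"] - control["revenue"]

print(f"Conversion: control {conversion(control):.2%}, treated {conversion(treated):.2%}")
print(f"Relative uplift: {uplift:.1%}")
print(f"Incremental revenue attributed to the initiative: {incremental_revenue:,.0f}")
```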

Nathalie Post  
Yeah. So regarding the scalability: I think a lot of organisations are getting success with POCs or, you know, little pilot projects, but then they actually struggle to plug them in and scale them. And the whole model lifecycle management seems to be a challenge too. What are your thoughts on that?

Tad Slaff  
Yeah, so if I had to summarise it in one line: I'm not a fan of POCs. I think that might be a little controversial. What I mean by that is that I'm really a big proponent of the MVP, which comes from the startup world. It gets thrown around a lot, but it stands for minimum viable product, and really the key is how you define "viable". A lot of the time, viable just means working; in my view, that's a little bit how I see a POC: oh, let's just get something that works. But I think the important thing is to define viable as generating real value for the business. So it's not just getting something working, we're not proving the concept; we're choosing a very limited scope in which we can actually deliver real value. From that perspective, it becomes a lot easier to scale from that point, because you're already generating value. You've already demonstrated that the method works; it's not conceptual, it's not on test data, it's in production, and you're able to measure success for it. And when you talk about designing that MVP, I always like to think about it, from the beginning, through processes and frameworks. What I mean by that is: what does the complete, end-to-end workflow look like, from establishing the data pipeline, to cleansing the data, to modelling, to producing the final output? So we're not defining the MVP as building a recommendation engine. The MVP is actually going to be: how do we pull data from some back-end database, run the recommendation engine, display the results on a web page, and then track whether customers are actually clicking on it? You can call it a POC, an MVP, whatever you want, but if that is the scope of the project, and that's how you're measuring success, then it's going to be a lot easier to scale up, because you already have the end-to-end flow working. Then, if you want to release it to different markets or different product segments, it's about identifying the bottlenecks in what you've already built and enhancing those. The downside is that, yeah, it's going to take a little bit longer to get set up initially, but I think that trade-off is worth it. Otherwise you end up with these POCs that look good on paper; the business says, great, let's run with it; and then the team goes, oh, well, actually, we need to completely restructure and start back from stage one. It's much easier if you're building in from the beginning: how does this fit into the existing tech stack? How is this going to fit into production? Look at it from the end-to-end process side, and then, when you're talking about scaling, it's really just focusing on which components are the bottleneck in that complete flow.
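
Here is a minimal sketch of that end-to-end MVP shape, with each stage stubbed in one place: pull data, recommend, display, track. The schema, the function names, and the trivial recommender (stand in the co-occurrence scorer from earlier, or any model) are illustrative assumptions, not anyone's actual stack.

```python
import sqlite3

# Sketch of the end-to-end MVP scope described above: pull data from a
# back-end store, run the recommender, "display" the results, and track
# clicks so the value is measurable. Everything here is illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE order_lines (order_id INTEGER, product TEXT);
    INSERT INTO order_lines VALUES (1,'suit'),(1,'shirt'),(2,'suit'),(2,'tie');
    CREATE TABLE clicks (product TEXT, recommended TEXT);
""")

def pull_baskets():
    """Stage 1: data pipeline. Pull order lines and group them into baskets."""
    baskets = {}
    for order_id, product in conn.execute("SELECT order_id, product FROM order_lines"):
        baskets.setdefault(order_id, set()).add(product)
    return list(baskets.values())

def recommend(product, baskets):
    """Stage 2: the model. Trivial stub standing in for a real recommender."""
    return sorted({other for b in baskets if product in b for other in b} - {product})

def display(product, recs):
    """Stage 3: presentation. In a real MVP this is the front-end team's work."""
    print(f"Also consider with your {product}: {', '.join(recs)}")

def track_click(product, recommended):
    """Stage 4: measurement. Log clicks so success can be linked and labelled."""
    conn.execute("INSERT INTO clicks VALUES (?, ?)", (product, recommended))

baskets = pull_baskets()
recs = recommend("suit", baskets)
display("suit", recs)
track_click("suit", recs[0])
print("clicks logged:", conn.execute("SELECT COUNT(*) FROM clicks").fetchone()[0])
```

The point of the skeleton is that scaling later means swapping out or widening one stage at a time, rather than rebuilding from stage one.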

Nathalie Post  
So how long would you typically consider for building an MVP?

Tad Slaff  
So you really want to choose a very limited scope for it, and in my mind, if you can't get it done within a quarter, you're doing something wrong. That should cover the entire process. Another important thing that comes with looking at it from the end-to-end view is that you really have to make sure you're getting buy-in from the business, that you have the support to do this, and also that you're going to have the cross-team collaboration. Going back to our recommendation engine example: yeah, your team can build the actual machine learning algorithm at the core, but you need the support of data engineers to get the data you need at high quality, and you need support from the front-end teams to actually display the results. So when you're sitting down scoping this out, those are the types of considerations, the types of questions, you need answers to up front. You don't want to get halfway through and go, oh, actually, I need you guys to build an entire section of the website that can display the products, and then all of a sudden you're getting pushback because they have their own initiatives and don't have the bandwidth for X, Y and Z. So getting that buy-in, aligning on the scope, and having the target of generating business value from the beginning makes the whole process a lot easier. And I think you'll also be surprised to find that, in a lot of cases, the actual machine learning algorithm is only one really small part of solving the business problem, right? It's the data you have available and are feeding into it; it's how you're presenting the results to the final end user; it's how you're defining the success criteria when you're recommending the product. All of those questions are arguably much more important than fine-tuning the final model. I think a lot of people get too caught up in "oh, we need some very deep modelling expertise, we really need to understand the exact inner workings of this model". But when you're looking at how you're going to drive more revenue from cross-selling, the part attributable to the exact model is, I think, going to be relatively small compared to those other components I talked about.

Nathalie Post  
Yeah. And so when it comes to the actual team developing these things: there are a lot of different structures that a team like this can sit in within an organisation. Some organisations have centres of excellence; other organisations have a more hybrid model. I think there are a lot of different ways of doing this out there. What is your preference?

Tad Slaff  
I think it's going to depend on the organisation, but there are a couple of really important factors that I think are universal, whether you're doing a centre of excellence or having it sit in the business. One of the biggest areas that I saw, and continue to see, and that I don't think gets enough attention, is the importance of having domain expertise sitting as close as possible to the technical side. This comes into play everywhere from feature selection, to any anomalies or things that might look strange, to what the final output should look like, to even what objective criterion you're trying to optimise for. You really need the business providing input on that: someone from the business who has at least a conceptual understanding of what's happening on the technical side. Because I would see some of these PhD-physicist data scientists who you could sit in a room, and they would build this amazing model that by all of their quantitative metrics really was great: it's not overfit, it performed well out of sample, on paper it looks great. But the results aren't very sensical, or they're not optimising for the right criteria. So you really need that business input every step of the way to keep things in line. I remember one tangible example of where this was not done correctly. There was a data science team trying to build an internal search algorithm for the internal wiki. It was a massive organisation, and they said: alright, we have so much knowledge locked away in these internal pages, Confluence pages, things like that. It would be great if we could build some way for a user to search across all of it and surface the information they'd find useful. So the team set up a POC and got some super smart people working on it, but they were working completely in a silo. And they said: alright, we built it, it works great. It relies on the tags on these pages; that was the best signal. So, you know, everything is perfect. Then they present it to the board, everything's great, and someone goes: oh, where do these tags come from? And someone standing at the back of the room goes: well, actually, my team went through and tagged it all, end to end. So now, when you think about it: yeah, it worked for these 1,000 pages that were tagged, but when you look across the next 10,000 pages, are you really going to have someone sit down and tag every single one of them for this search algorithm to work? That's a great example of how, if they'd had someone from the business sitting next to them, they would have identified from the beginning that, oh, this was all done manually, it's not really going to be a valuable input for the model. Because the whole point was to reduce manual work; if you're going to have someone go through every single page anyway, that's already solving the problem, and you don't need any sort of search algorithm from there.

So I think that scenario really shows the importance of having that domain expertise. And, you know, going back to earlier in my career, you see the same thing with those traders. These are, you know, guys in their 50s, not technical at all, but they really understood the domain, the space, the financial markets. So it's about being able to leverage their experience and their understanding, and then the trick is: how can we quantify that, how can we transfer that knowledge into an algorithm? That's a lot of the key work.

Nathalie Post  
Yeah. And how do you deal with this now within suitsupply?

Tad Slaff  
Yeah, so I think Suitsupply is somewhat of a unique case, in the sense that a big part of the Suitsupply brand and the value offering is still having that personal connection. If you visit a store, which I highly recommend you do, a lot of it is getting the one-on-one attention from the sales professionals. They'll walk you through every step of the way in terms of finding the right fit for you, what's going to look good, have you tried this, here's a great style I think could fit, whether it's a suit you're wearing every day or one for a wedding, things like that. Part of the brand is having that personal touch. We see it on the customer service side too: if you call the customer service lines or use the chat, there's no "press one for this", no automated message; a lot of it is still having that connection, that personal touch, which is really important for the brand. So a lot of the work we do is: how can we augment, improve, or make more efficient the work of the sales professionals, the customer service agents, or some of the category teams when it comes to buying, the people who really understand the product and really understand the customer? It's giving them the right information they need to do their jobs better. A lot of the time I spend here is really trying to understand: what is the business problem? What are their intuitions about why they want to do it this way? What data is important to them? How do they want to look at it? Really understanding the problem on the business end, and then: how can we use some of these analytics techniques to solve it? So everything we do is driven by the business; it's not something we run with and then try to drag the business along and get buy-in for. It's up front: what are the problems you're having, and how can we solve those? And I think you'd be somewhat surprised that for a lot of these seemingly AI and machine learning problems, you can get maybe 80% of the way there with a well-designed dashboard, which is something a lot of people don't necessarily want to hear. And yeah, if you're Google or Facebook, that's not going to be the case. But a lot of the time, giving the right people the right information at the right time is going to have much better results than trying to automate the whole process. Once again, that comes back to which use case you're working on. A lot of the problems we face are open-ended; the constraints and the rules aren't well known up front. Trying to account for all those cases, whether it's certain events during the season, maybe a wedding, maybe a certain fashion trend that's coming out, you'll drive yourself crazy trying to account for all the edge cases. But give that knowledge to someone who really understands it, who has seen a similar case, you know, two years ago in this one store, and they can pick up on it. Then we try to create the feedback loop so it comes back to us to improve going forward.

Nathalie Post  
Yeah. So within the, let's say, fashion and retail space, where do you see that artificial intelligence or machine learning can make the difference?

Tad Slaff  
I think a lot, and going back to the specific case here at Suitsupply, is along the supply chain. We're a vertically integrated company, in the sense that we own the end-to-end supply chain, all the way from the fabrics and the mills, to the factories that produce the garments, to the warehouses. So there are a lot of opportunities for us to get smarter in terms of how we're planning our buys and how we're setting the budgets. That's one, and a lot of it happens behind the scenes where you don't necessarily see it. The second is around empowering the face of the business, whether it's the sales professionals or the customer service agents, with the most valuable information at the right time. That's providing 360-degree views into the customer; maybe not completely automated product recommendations, although that's one very interesting use case for the website, but giving them to the sales professionals: alright, here's what we think the customer would want. They're able to use that as an input, alongside their own expertise and knowledge of the customer. Those are the two that are most interesting to talk about. And then the other one that really adds a lot of value is just automating processes, whether you classify that as RPA or whatever it may be. If you look at what business problem you're trying to solve, a lot of the time the analytics piece is one small part of it, and there's a lot that needs to be done around it. Maybe it's just automating the data collection process: instead of someone receiving a file by email, setting up some FTP job that picks it up, processes it, and makes it available to them, so they're not having to do the work. So I think it's going to be different for each company, but at the end of the day it really comes down to: what is the business actually asking for, and what's the best solution for that? And remembering that a lot of these AI and machine learning techniques are a means to an end and not an end in themselves. I think that's one very important message when you're trying to figure out how to be successful in this space.

Nathalie Post  
Yeah. And to what extent does the cloud influence this, in terms of speed to move and so on? What are your experiences with that?

Tad Slaff  
It's really amazing how much, even in the last two years, some of these big cloud providers, with the tools, the frameworks, and the offerings and services they provide, can let companies do more for less. For instance, we're working in Azure, but I have experience with Google Cloud as well as AWS, and I won't get into the pros and cons of each one. But the amount that a small team of, call them data engineers, call them machine learning engineers, call them BI developers, whatever it may be, can do working greenfield, open-ended, able to develop cloud-native, is really amazing. I mean, if you look at something like ingesting real-time streaming data: even two or three years ago, that would take a team of five months to develop; now you can have a team of two set something up in a week or two and get it up and running, just by leveraging the offerings that these providers have. So from that perspective, you're able to do a lot more with a lot less. There's a lot of benefit in scalability too: when you hear "now we're running out of space, we need more performance", it's just upgrading to the next tier of offerings, which you can do from a dashboard in five minutes and instantaneously see the results. That's really been game-changing. And then also being able to offload the maintenance and the infrastructure to these providers really allows us to focus on our core competencies, the areas that are unique to our business, and not have to think about a lot of the headaches that come with managing our own infrastructure. Once again, even just a couple of years ago, you had to have system admins and infrastructure teams all working together just to keep the lights on. Now a lot of those resources can be helping push the envelope and take the next step: what can we do to drive the company forward?
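
To give a feel for how little code a managed streaming service can require, here is a minimal sketch of consuming a real-time stream with the azure-eventhub Python SDK (v5). The connection string and hub name are placeholders, and error handling and durable checkpointing are omitted for brevity; treat it as an assumption-laden sketch, not a production pattern.

```python
# Minimal sketch: consuming a real-time stream with azure-eventhub (v5).
# Placeholders must be replaced with real Event Hubs settings.
from azure.eventhub import EventHubConsumerClient

def on_event(partition_context, event):
    # Called for each event as it arrives on the hub.
    print(f"partition {partition_context.partition_id}: {event.body_as_str()}")

client = EventHubConsumerClient.from_connection_string(
    conn_str="<EVENT_HUBS_CONNECTION_STRING>",
    consumer_group="$Default",
    eventhub_name="<EVENT_HUB_NAME>",
)

with client:
    # Blocks and dispatches incoming events to on_event; "-1" = from the start.
    client.receive(on_event=on_event, starting_position="-1")
```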

Nathalie Post  
So you mentioned briefly the types of roles, like data engineering roles and machine learning engineer roles. I think there's a lot of, I don't know, I would almost call it noise around these roles and what they entail. So can you give an explanation of what you look for in people who work within that space?

Tad Slaff  
Good question. So given the nature of the space we're working in now, there are two things we look for in our new hires. One is a strong understanding of the underlying processes, a really good idea of how things work theoretically, whether that's data modelling, productionising models, or understanding how to scale up infrastructure. I don't think there's any way around that: you really need a strong, deep understanding of the theory, separate from the actual tools you're using to get there. And on top of that, and it may sound a little contradictory, we're looking for people who are able to learn and pick up new things fast. Because, you know, just yesterday Azure comes out with some new offering and, oh, this might be interesting for our team, it seems like it solves one of our problems. So we need someone who can pick it up from scratch and be willing and comfortable running with it and getting it into production, you know, within a single sprint. So those are the two things: that strong conceptual understanding, plus the ability to learn and keep pace with the amount of new offerings and products being released all the time, whether by the big cloud providers or other independent vendors. And then, I'm not sure if it was part of your question, but in terms of the scope of the roles, the way we look at it: I see data engineers as being more about building the ETL jobs and the data pipeline. That can encroach a little into BI developer territory, which I see as more focused on the data modelling side. And I still very much see a need for those types of roles, even at companies that see themselves as having machine learning and AI at their core, because having a well-thought-out, well-structured data model on the back end just solves so many problems if you're a data scientist. Machine learning engineers can be a little more niche: they need an understanding of how these models actually run, as well as the infrastructure side and the data pipeline side, and they need to be able to put the pieces together to get a model running in a production environment, which is no easy task. But going back to framing the initial scope as the end-to-end process: that makes it a lot easier than having something that only runs locally, or only on test data, and then trying to fit it into a production system somehow. That's a lot more challenging than if it's designed from the beginning to fit into a production system.

Nathalie Post  
Yeah. So when you look at those roles, for example the overlap between a machine learning engineer, a data engineer, and a data scientist, do you believe more in cross-functional profiles, or more in the specialist type of profiles?

Tad Slaff  
I am a big fan of cross-functional profiles. Needing really clean handoffs across teams is difficult. There's always going to be a spectrum of specialisation between the data scientist and the data engineer, with the machine learning engineer sitting a little bit in the middle. So I think the data engineers need some basic understanding of the data science side, and the data scientists need some basic understanding of the data engineering side. At the very least, I see them sitting on the same team, working very closely side by side, whether that sits in the business or in some sort of centre of excellence; it's important that they have a good understanding of what the other role does, how they work, what they're actually doing. There are countless examples of handoffs gone wrong: someone gets handed a model, is told to put it into production, and then has to spend three months rewriting the code from Python into C++ or something else that runs in the production environment; or data scientists have no idea how the data gets to them, and when they try to push into production they realise, oh, this data source only gets sent to us weekly whereas this one comes in daily, or this source requires some manual input, like my previous example, or whatever it may be. So while you're always going to have people with their own core competencies, I think it's really important that they at the very least sit side by side and work very closely with their counterparts across what are really three distinct roles: the data side, the modelling side, and then productionisation. And in a perfect scenario, you have people who understand the whole process.

Nathalie Post  
Thank you so much for the thorough explanation. Maybe a final question, just out of pure interest: what are you excited about right now, when it comes to new developments or trends within the space of, let's say, analytics and artificial intelligence?

Tad Slaff  
What am I excited about? So for me, I really like seeing the transition, to use another very consulting term, from insights to action. What I mean by that is not necessarily showing you that, oh, we're seeing increased revenue in this particular segment, but being able to get insights into why that's happening. Another way to say that would be trying to move from correlation to causality. That's a very difficult problem; I think a lot of people look at correlation and see causality. But there's some really interesting work being done on how you infer what the causality actually is, not just the correlation. That's something that was very difficult to do just a couple of years ago, but now we're starting to see some really cool techniques in this space that are able not only to infer it, but to present it in a digestible, consumable manner to someone with more of a business background. There are some companies doing really interesting stuff in the space, and I'm really excited to see how it goes forward. I think it's the next logical step, especially when it comes to more of the decision support: improving the efficiency of, and the information given to, the people who really have a domain understanding. So yeah, I'm excited to see how that progresses in the next couple of years.
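
As a minimal, self-contained illustration of the correlation-versus-causality gap, the sketch below simulates a confounder that makes two causally unrelated quantities correlate, then removes the confounded part. The scenario and numbers are invented, and real causal-inference tooling goes far beyond residualising on one known confounder.

```python
import numpy as np

# Invented scenario: hot weather drives both "promo views" and "linen sales".
# Promo views do NOT cause linen sales, yet the two correlate strongly.
rng = np.random.default_rng(0)
n = 10_000

weather = rng.normal(size=n)                       # the confounder
promo_views = 2.0 * weather + rng.normal(size=n)   # caused by weather only
linen_sales = 3.0 * weather + rng.normal(size=n)   # caused by weather only

# Naive view: the strong correlation wrongly suggests promos drive sales.
print("corr(promo_views, linen_sales) =",
      round(np.corrcoef(promo_views, linen_sales)[0, 1], 2))

# Adjust for the confounder: regress both on weather, correlate the residuals.
# The association vanishes, revealing no direct causal link.
resid_p = promo_views - np.polyval(np.polyfit(weather, promo_views, 1), weather)
resid_s = linen_sales - np.polyval(np.polyfit(weather, linen_sales, 1), weather)
print("partial corr given weather =",
      round(np.corrcoef(resid_p, resid_s)[0, 1], 2))
```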

Nathalie Post  
Great. Well, thank you so much, Tad, I really enjoyed this conversation with you. Any final message?

Tad Slaff  
Well, thanks, I enjoyed it as well. I think, if you take a couple of takeaways from this: one, always look at it from "what is the business problem you're trying to solve?", and is machine learning or AI really even the best way to solve it? Two, think MVPs over POCs, with the MVP defining "viable" as something that's generating business value from day one. Three, from the beginning, think about the framework, think about automation, think about the end-to-end process. And then always, always link and label: what is the thing you're working on generating in terms of dollars, in terms of cost savings, in terms of time savings? If you go through those, or at least frame the problem in that space, I think you have a much higher chance of success in getting these initiatives off the ground, and of not getting stuck endlessly releasing POCs that don't really go anywhere. So those would be my final takeaways. But generally, it's an exciting space to be in, and I'm excited, also from the Suitsupply side, to see how we're able to take the brand to the next level by leveraging some of these techniques.

Nathalie Post  
Great. Well, thanks for this amazing summary. And yeah, thanks again.
