AI Implementation: From Buzzwords to Real Business Value
In the ever-evolving world of AI, the key to success lies in setting realistic expectations, understanding the educational aspects, and reimagining AI beyond the buzzwords.
It's essential to focus on solving real problems and delivering tangible results.
This panel discussion, held at The AI Summit New York 2022, featured five distinguished speakers:
Joe McKendrick, Analyst & Contributor, Forbes
Sai Zeng, Head of Data Science, Distinguished Engineer, UBS
Neil Sanyal, Global Head of Research Marketing, Morgan Stanley
Rajeev Sambyal, Director, Digital Assets & Advanced Solutions (AI/ML), BNY Mellon
Sai Sharanya Naila, Principal Data Scientist, Nike
While AI has gained immense popularity and C-suites are eager to embrace it, the focus should be on solving specific challenges, not just technology. Managing realistic expectations and setting the right course is crucial.
Don't miss the opportunity to delve deeper into the world of AI this year. Secure your seat to gain valuable insights from our distinguished panelists and learn how to make AI work for your organization. Let's transform the future of AI together!
Okay, welcome back. I hope everyone had a great lunch and is having a great conference. It really is a great conference. We've covered a lot of ground, and we're going to cover more today with this panel. I'm really pleased to be moderating a panel with some industry luminaries here, representing some of the world's most prominent organizations, to talk about starting your AI process: what should the footprint of your AI efforts be in your organization? I'm pleased to be joined here by Neil Sanyal. Excuse me. Sai Zeng, another Sai, Sai Sharanya. Oh, my apologies, I know I didn't get that right. And Rajeev? Rajeev Sambyal. Okay, let's start off with some introductions. First of all, I'm Joe McKendrick. I'm a contributor to Forbes, and I've also written a number of AI pieces for Harvard Business Review. Why don't we start with you...
I'm happy to start this off. I'm Neil Sanyal, Global Head of Marketing for the research department at Morgan Stanley. Marketing operations and analytics is really my focus, so everything in the realm of matching and distributing content to our clients.
01:34 Sai Zeng
My name is Sai. I head the data science team for UBS group functions, focusing on using AI and cloud technology to help transform the core functions: HR, legal, finance, risk, you name it. Before that, I spent 17 years at IBM Research, where I held various roles, for instance Chief Technology Strategist, Principal Research Scientist, Chief Architect, and so on. Again, I was focusing on cloud and AI technology innovation to drive IBM's business. On the other side, I have over 50 patents to my name, and also over 50 technical publications, and I got a PhD in mechanical engineering from Georgia Tech. Thank you.
02:32 Sai Sharanya
That's a lot. That's amazing. People can probably just find me when they search for Sai. So I'm Sai Sharanya Naila. I'm currently working as a principal data scientist at Nike, where I lead the sport activities space. I know Nike has a big retail side, but I specifically focus on the fitness side of things, which is Nike Training Club and Nike Running Club. These two are Nike's fitness training apps, and anything related to AI/ML-driven features that has to be launched within these two commercial apps comes out of my team. So that's the commercial side. On the R&D side, I lead the machine learning vertical for connected fitness. This project is around how we can use data from wearable devices, like Garmin watches, health data coming from Apple Watches through HealthKit, and sensor data we can collect from underfoot sensors, and figure out the best metrics we can gather about an athlete's workout, so we can come up with experiences that make an athlete's workout more enjoyable, whether that's around running itself or around reducing injury while running. It's R&D work I'm currently leading. Pretty exciting work. Glad to have this opportunity to be here. Thank you, everyone.
My name is Rajeev Sambyal. I head up product development for our digital asset custody product and some of the work we're doing in digital assets at Bank of New York Mellon. I also lead our Advanced Solutions team, where we do a lot of work in AI and machine learning. We have a lot of traditional use cases in finance, but what we're also exploring, and are excited about, is the intersection between digital assets and machine learning. There are tons of data on blockchains, and lots to explore there, so we're doing a lot of work in that space.
Okay, great. We have three banks and a sportswear company, so it sounds like the setup for a movie, right? Okay, fantastic. Well, let's talk about, as mentioned in the title of this session, small wins with AI, the importance of small wins. What do we mean by small wins? Neil, let's start it off. What's your definition of a small win in AI, versus a big bang, or a great victory, or whatever you want to call it?
Yeah, I mean, when I think about it, it's about what you can do easily. And that depends a lot on your resources, where you're at, how much your company has bought into what you're trying to do, your seniority level, et cetera. So what's small for one group might not be small for another group, and what's big for one group may not be big for another. One of the things I've found over the years is the difference between what's big on the IT side versus what's big from a business-impact side. There may be a lot of things that are technically challenging but have hardly any surface value for the business. They may need to do a systems upgrade behind the scenes, or a server upgrade or something like that, but the business may not give credit for that. So when you're trying to pick your small wins, or what's attainable in your planning sessions, a lot of it is about what resources you have at your disposal, what you're realistically able to do in that timeline, and what will actually surprise and wow the people you're trying to impress, whether that's clients or managers. And usually, if you can get clients to be impressed, your managers will be too. So that's how I think about small wins: trying to grab bite-sized pieces that are realistic given what resources I have. Sometimes I have more resources and sometimes I have less, but it's about having a future vision of how you're going to connect those dots. Lots of small wins, put together, make for a good final product that wows people.
07:31 Sai Zeng
So I think, from my perspective, small maybe means an incremental improvement. It could be an advance on the technology side, for instance an algorithm: in the past there was one model, and a new model performs better. That could be a small win on the technical side. If you're starting something new, it could mean a smaller scope to begin with: say, right now I'm serving one division, and later on you can grow into something much bigger. So small can be defined in different ways. But at the end of the day, I'd say it's something where, technology-wise, you've implemented something small, some insight or value has been generated, and hopefully it can lead to a much bigger expansion in the future. Yeah.
08:31 Sai Sharanya
Yeah, for me, a small win would be, firstly, having the bigger vision in place. Then, within that bigger vision, we follow the strategy of identifying the different jobs to be done: making sure you know your customer segments, the customer needs, and the use cases you need to build out, starting from the big-picture needs down to the granular ones. Once you have that identified, it's all about implementing the first thing the right way. And to do it the right way, I think we should start with smaller steps: being able to implement and test, doing MVP testing with a segment of users, making sure you're able to get feedback, working on improvements based on the feedback you get from your customers, and then improving on top of it. So gradually, iteratively delivering the product is something I see as a small win. And once we gain trust from the customers and start learning things about them, I think we can start putting together the entire set of building blocks that's needed for our bigger vision. Yeah.
I think it's the same; Neil touched on it, right? It's bite-sized increments. And it depends on where you are in your AI and machine learning journey. For people who are starting small or starting new, you don't want to be putting in a $20 million budget and getting GPUs, CPUs, and a whole lot of infrastructure when you don't have business cases to solve for, right? And if you're far along in your journey: what is your business case? Do you want to show the stakeholders the benefit of the solution incrementally? You don't want to go in with a five-year plan for implementing something when the technology has shifted and the requirements have shifted too. So that's why I think it really depends on where you are in your journey. You take those bite-sized increments and deliver incrementally. I mean, agile is not just a buzzword, right; it's implemented across the board. So just deliver what is needed.
Great, great. If you're familiar with Tom Peters, who's written a lot of management works: in his initial work, In Search of Excellence, he talked about the Skunk Works. I think it was Lockheed Martin, the manufacturer, that had the Skunk Works, kind of independent from the main organization, a small group that was tasked with just going with it, taking their ideas and running with them independent of management interference, at least initially. Let's start with Sai Sharanya. You talked about some of the work at Nike a few moments ago. Did that start off independent of the mainstream business?
11:44 Sai Sharanya
Yeah, so, just trying to keep this as high level as possible without giving all the details: at Nike we have an internal startup called Valiant Labs. So Nike does encourage this; we have entrepreneurs in residence who are essentially like CEOs of that particular division, and they encourage employees to pitch ideas. Once an idea gets traction and has leadership alignment, there's budgeting associated with and given to that particular project. And this is one such initiative I was talking about. On a quarterly basis we have steering committee meetings; we wanted to make sure we delivered progress every quarter to earn the leadership's trust that we're going in the right direction. And a lot of the products you want to build out need alignment across multiple organizations. When I talk about building an experience for reducing injuries for a particular athlete, it's not just about the technical details; it's also about the sport science component of it: understanding and analyzing data that's coming from a bunch of these sensors. There's a lot of research backing needed to figure out how we can identify symmetry when somebody's running. What is symmetry? How is their balance? And we come up with baselining metrics: when somebody is doing a squat, versus running, versus doing a two-legged jump, can we identify balance from it, and what does that balance look like? And a lot of these things shouldn't be prescriptive, so you want to put it in a format that's helpful for the users without forcing them into a certain way of doing things. So there's not just the technical aspect but also the sport science component associated with it.
And then there's a hardware requirement, because we're trying to gather data from sensors in wearable devices, and that's a whole new set of partners we work with, so it's about getting buy-in and getting alignment across all of them. And given this is more of an R&D effort, we wanted to test it on a limited population. So it's about coming up with a whole new app where we can run our tests with a limited set of users and gather feedback. And when we're confident enough that this is a product that can be commercialized, that's when it goes into the commercial apps like NTC and NRC. But yeah, those are some of the considerations.
So it sounds like corporate culture plays a very important role, right, supporting and sustaining these efforts and helping the process move forward. Yeah. And our banking folks: does that happen in the banking world? Do you have little independent spin-offs that get the process rolling, that get the innovation rolling? Anybody want to tackle that?
I'm happy to, yeah, if others want to jump in, but absolutely. I work within the research department, on the business side, and we have very limited budgets compared to probably other parts of Morgan Stanley that are closer to clients and more well funded. So we're constantly trying to do more with less. I've found proof of concepts are a very impactful way of getting management's attention about what technology can bring to bear. For example, wanting to put advertisements on the sides of our research reports and being able to say, oh, if you liked this, you'll probably like this content as well: the sorts of recommendation engine problems we've all talked about, nearest-neighbor problems, et cetera. Those are the types of things I can put on a slide twenty times and present to senior people, saying, hey, you're missing all these opportunities for people to be more engaged with our content. But until you actually show them, hey, you clicked on this, and your page is different from hers and different from his, they don't see it. Then all of a sudden they're like, wow, we can do this. And yes, that's what's been in the slides for several quarters. So proof of concepts can be super bare bones, but they're way more effective than slide presentations, because showing what you're able to do, even at very small scale, is extremely impactful. That's been my way of trying to infiltrate and get people to listen and understand the opportunities that are available because of technological change. So Skunk Works is actually a term we throw around a lot in our group. I often say, if we're not doing it, no one's going to do it. We've got to create it within this group.
17:00 Sai Zeng
Talking about the UBS experience: we do have a hackathon, a firm-wide hackathon. You can think of it as a two-day effort where you form different teams around different topics, coming together just to hack on a small problem. As an outcome, some of those problems get rolled up into more concrete project initiatives. That's firm-wide, and it also runs in the different divisions. So that's just one format. But I personally think every organization has the opportunity to come up with new ideas, build their POC, their proof of concept, run a pilot, and convince management that this is the right thing to do and invest in, so the initiative becomes official long term and carries forward. So that's my experience. And of course, coming from IBM Research, I've never been shy of innovation opportunities.
Absolutely. And Rajeev, how about you?
Yeah, I can't say I have a team of skunks, right? But we do a lot of these hackathons and design thinking sessions, a lot of clubs, so all of this is prevalent. The one thing we make sure we do is time-box everything. It cannot be something that goes on and on for weeks or months with people just spinning their wheels. So we have a framework: you get four to six weeks of exploration. If there's an idea, come up with it, put together a business case, but you've got to limit that to this time period. And after that we see what comes out of it. If after four or six weeks you think there's an issue, either technically, or in the data, or in the business concept itself, then you shut it down and move on. So at least for us, we have innovation embedded in our overall business, and it helps generate new ideas and also keeps the startup culture: if you fail, fail fast and move on. Get to the next idea.
And we'll start with Rajeev for the next question. Can you cite an example of a small-win type of project that eventually bore fruit?
Yeah, most of the projects we do start small. One of the projects we recently delivered, and I was talking about it yesterday also, is fee prediction for our sec lending division, where our front desk actually lends out securities. People in finance might understand what sec lending, securities lending, is: basically, you have client positions or client holdings, and you can lend those securities out into the market, generate some income for the clients, and charge something on top of it. Long story short, you're optimizing the fee at which you lend. And it's a huge market, a global market. The way we started was to take a basket of securities and see how the modeling works. Does it work? Does it have impact? You back-test the last five, six, ten years, do some shock testing. That's how you start small. Then you incrementally increase the basket size and see if it's working, if it's actually generating the returns we want. You don't want to start with a basket of 10,000 securities and then, seven months later, say, okay, nothing works. So that's just one example of how we started small, but every project starts like that. At least the way we operate, we don't embark on a six-year journey and say, okay, we'll deliver it to you two years later. You take incremental steps and deliver them to the business, and that's how projects succeed most of the time. You get into this iterative, continuous loop and get feedback.
Great. And Sai, at Nike? Yeah, yeah.
21:06 Sai Sharanya
I'm just gonna give an example from when I was working at American Express. A couple of years back, Amex launched a product called Amex Personal Loans. When we were having the discussion about launching this particular product, I was sitting on the consumer data science team, and we were basically responsible for identifying customers who would be more likely to actually accept a personal loan. In order to do that, we had to go with the data we already had, the transaction data we were able to collect from our existing users. We were trying to identify patterns. Has somebody made a really huge purchase, above a threshold of some thousands of dollars? Has somebody been doing a lot of furniture shopping, in which case they might have just bought a home? We were looking for a bunch of very interesting signals in the data, trying to predict that this might be a transaction, or a person, more likely to accept a personal loan. So we first built a response model and tested it with a certain set of the population. When personal loans launched for that set of customers, we saw that the response modeling results were pretty accurate when we ran our campaigns. When we went back and looked at the post-campaign analysis, the results were pretty promising. So that's how I started off: building response models for one product, personal loans, within one channel, email. Later we expanded to several different channels, like online and phone.
And not only that: once we had seen the success and had an idea of how response modeling can be useful for targeting and marketing, we went across the board. Our team was responsible for consumer targeting: anybody who is already an Amex card member gets a bunch of offers, like upgrades, cross-sells, and supplementary cards. So we built out models covering all sorts of credit cards and charge cards that Amex has. We tested with personal loans, scaled up incrementally, and then were able to build a platform hosting 83 different response models. That was our very first next-gen framework. So yeah, getting the first one out the door was very difficult, because we had to go through a bunch of approvals, regulatory compliance, and a bunch of documentation, with a lot of questions to answer: why is the model behaving a certain way, what features went into it, why are you using feature XYZ, and how are you using it? Once we were able to cross that first hurdle, the next ones were pretty easy. Yeah, that's my experience.
24:37 Sai Zeng
So maybe I want to approach this from a different perspective, because a lot of initiatives start from small incremental steps, or a smaller scale, but I'd say that even at a very early stage, you should know the bigger opportunity ahead of you. Starting small doesn't mean your narrow focus is all there is, with no idea of what's outside the scope. I'll give one example. Last year we did IT optimization. As a financial firm, you typically spend something like 2% of your revenue just on your infrastructure. That's how much you spend. We did a data analysis across group functions and found the infrastructure was extremely underutilized. So when we started the journey, we said, hey, let's help our application owners drive down their cost, and we started with one application as the pilot. But we knew the opportunity ahead of us was many millions of dollars, even though we started with one small application. We made it successful, then broadened across different applications in the divisions, and the next journey we're embarking on is the whole firm. So always keep in mind: starting with something small doesn't mean you don't know there's a huge opportunity ahead of you.
Yeah, playing off of what Sai is saying, I think about it like a portfolio of experiments that we're running at any given time. I don't do this perfectly, but I'm always trying to have some big win that people are excited about and talking about. And then behind the scenes there are always other things that are the building blocks, right? Those building blocks, I'm not going to go tell the senior-most managers, oh, we've done this thing that I'm eventually going to pair with these four other things, and someday that's going to be a big win. Those I don't even shine a light on, because I know we're not ready yet. It's not primetime; it's going to take four more components and three more models and four more data attributes and a matching engine for this to actually work. So I'm not going to talk about that in the meeting, because that's not what's going to get people excited. I'm going to talk about the thing we rolled out recently that took three years to come to fruition. They don't really realize it took three years, because we only started talking about it six months ago, but we'd been building for three years. That's the kind of thing I mean by managing expectations. And one thing I'd add, because I talked earlier about doing everything on a shoestring budget and making do with what you get: the relationship with the IT department, and with those who are able to help you, is extremely important, because of their ability to go above and beyond for you.
And that willingness: if they have a free Friday afternoon where they don't have something else to do, and they're willing to put in that effort to help you get to the grand vision, eventually those Friday afternoons add up, or that incremental hour adds up. You'd be amazed what you're able to do with six months or a year of those. So that's one thing I'd say: those relationships are really important, getting people to buy into where you're headed. And then two, it's also really important to think about how you structure that map. Here's where I'm going; here are the incremental wins along the way. Maybe if some of the things are big enough, you bring them up to management and try to shine a light on where we're going, our increments, and how far we've gotten. But it's case by case, depending on how far along you are. It's almost never, "We're going to do this amazing thing from start to finish. Are you in? It's going to take four years," because no one wants to be in on that. But in reality, that's what's happening behind the scenes constantly. You're taking whatever you can scrap together to start the project, whether there's a ton of buy-in yet or not, and then iterating quickly, as was mentioned earlier. If you're going to fail, try to figure out early in year one: is this even going to work?
Yeah, and I imagine, especially over the past couple of years, AI has become the big hot thing. The CEOs and the CFOs and CIOs are reading the trade press, they're reading Forbes, and they say, we want to be this way too. We want to be data-driven, we want to go through this digital transformation and know what the customer is thinking before they even think it. Is there a case of expectations being too high, and is that putting pressure on your departments, your efforts?
29:50 Sai Zeng
I want to talk about this, because I joined UBS last year for exactly this reason. They knew AI and data are cool. They wanted to build a data science team. That doesn't mean they knew what a data science team should be doing. So I was hired for that purpose: really starting from scratch, building the data science mission, the team, the pipeline of use cases, and so on. That's one perspective: people know it's cool, but it takes time, and a journey, to figure out what kind of value it can bring back to you. I think another perspective is that when people look at AI, they always think it's just some kind of cool model you can run behind the curtain, and that you spend most of your time on the modeling. But the reality is that we spend the majority of our time just crunching the data. Modeling is the easier part, if the data is right. So that's another angle I just wanted to share. Yeah.
Look, I think the one thing I've learned is education. Whether it's senior execs, the business in general, the company in general, groups in general, the education aspect is constant. There's a perception that AI or machine learning is a panacea for every pain you have, and it's not. That's why the education is important. And that's what we do, whether it's through town halls or sessions; we have internal AI meetups and things like that, where you actually try to explain what is possible and what is not possible, where you apply AI and where you do not, where it's really an application development problem. You set the expectations and you constantly calibrate. The pressure is always there. As I said, the expectations from AI and machine learning are high, especially when every second day news comes out, like ChatGPT: okay, when am I getting ChatGPT and automatic responses? So that's how you start setting the expectations: what is possible, where the technology is, and what can we deliver. And that's where education, I think, is a crucial element.
And Rajeev, you just made a great point. Maybe it's advisable not to call it AI at all. Call it, you know, an initiative to predict such-and-such. Call it something else.
You're right. And the way we approach it, I approach it, and the company in general: you're not selling AI. You're trying to solve business problems; what you're selling is how you solve business problems. Sometimes it's a combination of machine learning, RPA, and application development. At least, I've never built a solution where I take the models, deploy them, and they run independently on their own. There's a whole solution architecture that goes along with it; there's a workflow component that goes along with it. That's where technology, InfoSec, data architecture, all of these come along. I can't think of a time I've gotten in front of a CIO or anyone and said, okay, here's the AI, it's going to solve all your problems. What you're solving for is the business problems. The moment you start selling AI, I think most of the time those projects fail, because you're not solving a business problem.
33:14 Sai Sharanya
Yeah, like Rajeev mentioned, it's about education as well. Oftentimes I've seen the data science team build out models, but when it comes down to putting them in production or getting buy-in from the product team, that's where they get stuck, because that's where we need to get alignment from everyone, make sure we've hit our targets, and make sure the ML solution is actually helping with the problem the product team is trying to solve. So that's something I've found has to be aligned, and education across organizations is necessary. And I just had another thought in my mind…
Yeah, I'm happy to jump in on that. You mentioned earlier: does the word "AI" make sense to the C-suite? Do they understand it? I've been told to use the word "algorithms," and that works better, supposedly. So I don't know if that helps, but I always come in with, "We've come up with an interesting algorithm to solve this." "Algorithms" is a term we use a lot internally, whether it's AI, machine learning, or matching engines: it's an algorithm. And then second, I think it's about figuring out what you can get them on board with, and that's going to vary depending on your manager, or the C-suite, or the board, whoever you're presenting to, and their appetite for wanting to understand the AI. Some people are super excited; they're really interested and want to understand the mechanism, how it's working. Others obviously have very little interest and just want to know that it's working. But to something that was said earlier, if you're solving the business need, that's how I've always found you can get the buy-in. And as I said earlier, if the client is happy, management will be happy. That's always been my end goal: can I make them excited about what we're doing? Can I show that they're going to interact with our content a lot more, engage with our analysts a lot more? If I can get to the end user and get them bought in, nobody between here and the end user is going to have much say, because they've already voted. They voted with their business, they voted with their feedback mechanisms. Your feedback mechanisms will be different; everybody's business is different.
But if you can get to that end user, get them excited about what you're doing, and get some of that to filter back in, it doesn't matter who's standing in the way; they'll eventually have to go along, because it's working.
Okay, great. We're actually running low on time, so let's see if there are any questions for our panelists from the audience. We've got three banks and a sportswear company here, ready to answer your questions about their implementations and what it took to sell AI to the business and bring AI to their organizations. Any questions? Anybody?
Very thorough, very thorough.
Hi, how's it going? Brooke here. I had a question for Neil, I think. You mentioned kind of hiding stuff until it's ready for primetime. How do you manage that while still building excitement, and without losing all your investment or all that build-up?
Yeah, "hiding" is the wrong word. You know, it's friends and family here. But I think it's a matter of, listen, it depends; different companies work different ways. A lot of times, at least where I work, there's a series of meetings you can have with management at different times. You can put your hand up and say, "I want to be in that meeting," or you can say, "Maybe not this week. How about next week? How about next month?" So I think it depends on how aggressively you put your hand up, how active you want to be in that, and how often you shine the light on the projects. Sometimes you shine the light aggressively, when you've got big wins, when something is finally getting close to fruition and you've been able to pull things off. Other times, the thing you've done in the last month is small, and the senior people won't be excited about it because it doesn't solve the need yet; it's five steps from solving the need, and they'll say, "That's great, keep working." It took me a long time to realize that I needed enough things in the portfolio that I always had something to shine the light on: "Hey, this went live." "Awesome." Okay, let's not talk about the one that's going to take another six months to build; we're working on it. If you want to talk through everything in the portfolio, we'll do that, but they never ask for that. They just keep getting excited about the win that just happened. I don't know if that makes sense, but it's about trying to keep things moving. I use the analogy of spinning plates: I keep all the plates spinning. Some plates are very close, so people are excited the plate is spinning, and with others they haven't even realized there's a plate yet, but I'm already trying to spin it. Does that make sense?
It does, thank you.
Anybody else? Okay, great, there are no more questions. Thank you to our esteemed panelists: Neil, Sai Zeng, Sai Sharanya, and Rajeev. It was a great job.