Proactive AI governance on the Power Platform with Sid Gundavarapu

FULL SHOW NOTES
https://podcast.nz365guy.com/602  

Ever wondered how Microsoft ensures ethical AI in its business applications? Join us as Sid Gundavarapu, a principal product manager at Microsoft, shares his remarkable journey and invaluable insights into proactive AI governance on the Power Platform. Sid brings a wealth of experience from his extensive career, starting from his early days on the CRM team to his current pivotal role in AI governance. This episode offers a unique behind-the-scenes look at how Microsoft maintains its edge in a rapidly evolving technological landscape, emphasizing the importance of ethical AI.

Beyond the tech talk, Sid gives us a glimpse into his life outside of Microsoft, revealing his passions for family, food, and fun. With candid anecdotes and memorable industry moments, Sid's narrative is both engaging and inspiring. He even shares the story behind his iconic ZZ Top-inspired beard! Whether you're a tech enthusiast, a business professional, or just curious about the human side of innovation, this episode has something for everyone. Tune in for a conversation that's as enlightening as it is entertaining.

90 Day Mentoring Challenge: 10% off with code MBAP at checkout. https://ako.nz365guy.com

Support the show

If you want to get in touch with me, you can message me here on LinkedIn.

Thanks for listening 🚀 - Mark Smith

Transcript
WEBVTT

00:00:01.826 --> 00:00:06.589
Welcome to the Copilot Show, where I interview Microsoft staff innovating with AI.

00:00:06.589 --> 00:00:12.910
I hope you will find this podcast educational and that it inspires you to do more with this great technology.

00:00:12.910 --> 00:00:15.887
Now let's get on with the show.

00:00:15.887 --> 00:00:24.827
In this episode, we'll be focusing on proactive AI governance on the Power Platform, a very interesting topic at this time.

00:00:24.827 --> 00:00:28.408
Today's guest is from Redmond, Washington, in the United States.

00:00:28.408 --> 00:00:40.283
He works at Microsoft, of course, as a principal product manager for the Power Platform and AI governance, and he loves helping build the best business application platform on the planet.

00:00:40.283 --> 00:00:47.201
And you can find, of course, links to his bio, show notes, social media, etc. for this episode.

00:00:47.201 --> 00:00:49.587
Welcome to the show, Sid.

00:00:49.627 --> 00:00:51.631
Hey, thanks for having me, Mark, I'm excited.

00:00:51.972 --> 00:00:57.500
Now, you see, I lifted that line from your LinkedIn: helping build the best business application platform on the planet.

00:00:57.500 --> 00:00:59.185
And this is a crazy thing, right?

00:00:59.185 --> 00:01:02.232
Is that I could never go and work for Salesforce?

00:01:02.232 --> 00:01:14.765
My DNA wouldn't allow me to work for Salesforce, and you know, I've done over 20 years now in biz apps, and you know, it surprises me; it's a different thing, right.

00:01:14.765 --> 00:01:29.768
But at the vendor level, when I see, you know, like we were just off air discussing Jujhar, you know, pre-James Phillips, and he went to Salesforce, I'm just like, treason! You know, that's how passionate I am about it. Like, even Bob went. But didn't Bob Stutz?

00:01:29.768 --> 00:01:31.269
Didn't he go to Salesforce as well?

00:01:31.811 --> 00:01:33.893
He did yes, I think he did right.

00:01:33.893 --> 00:01:37.076
And then Jujhar went, and James Phillips...

00:01:39.402 --> 00:01:39.683
Thank goodness.

00:01:40.224 --> 00:01:41.951
James Phillips didn't go there.

00:01:41.951 --> 00:01:42.680
Yeah, so sorry.

00:01:42.680 --> 00:01:42.900
We digress.

00:01:42.900 --> 00:01:44.481
Food, family and fun: what do they mean to you, Sid?

00:01:45.242 --> 00:01:52.951
I'm a family guy, so not like the Family Guy family guy, I'm not that crazy. Pretty crazy, but I'm a big family person.

00:01:52.951 --> 00:01:59.123
Most of my off-work time is with the family. Nice. So that's very big. Food?

00:01:59.123 --> 00:02:01.066
I used to be a big foodie.

00:02:01.066 --> 00:02:05.352
Now I am restraining myself as I grow older.

00:02:05.352 --> 00:02:12.326
I'm like saying no to more things than not, but I do like good food.

00:02:12.326 --> 00:02:14.790
I cook, sometimes decent.

00:02:14.790 --> 00:02:26.407
I mean, I'm not going to be opening any restaurants anytime soon, maybe not in this lifetime, but I certainly love food, love family and, yeah, all things.

00:02:26.407 --> 00:02:27.631
What's the third one?

00:02:31.259 --> 00:02:31.540
I love it.

00:02:31.540 --> 00:02:32.883
Now, those of you listening on audio, you can't see Sid.

00:02:32.883 --> 00:02:39.787
I can see Sid because he's on video here and he's got the most gangster ZZ Top-inspired beard you've ever seen.

00:02:39.787 --> 00:02:53.655
And why, you know, I respect that beard is because my daughter is now four years old, which means four years ago, after she was born, I shaved my beard off, which was the same length as Sid's.

00:02:53.655 --> 00:03:00.222
At that time. And now it's got me thinking: should I bring back the beard? Because Sid's bald,

00:03:00.222 --> 00:03:01.907
I'm bald and it's a good look.

00:03:01.907 --> 00:03:02.407
I like it.

00:03:02.407 --> 00:03:03.169
I like it, sir.

00:03:03.450 --> 00:03:04.132
Yeah thank you.

00:03:04.132 --> 00:03:08.070
You've been in the Microsoft Biz Apps team for a heck of a long time.

00:03:08.070 --> 00:03:09.847
When did you kind of first move into that team?

00:03:10.520 --> 00:03:14.730
I moved into Biz Apps in the 2013-14 timeframe.

00:03:15.072 --> 00:03:15.231
Yeah.

00:03:15.641 --> 00:03:16.445
It was CRM.

00:03:16.445 --> 00:03:28.294
I was in the CRM arc as a product manager Title was a program manager back then and I worked on some of the customer service focused areas.

00:03:28.775 --> 00:03:39.348
Yep. So, you know, they bought a product back then from the gaming industry. Heck, I can't even... Do you remember what it was? A lot of it got wrapped into customer service.

00:03:39.348 --> 00:03:41.709
It was like the knowledge management piece.

00:03:41.709 --> 00:03:42.481
It was the...

00:03:42.481 --> 00:03:44.389
Do you remember what that product was called?

00:03:44.389 --> 00:03:45.816
There were actually a couple of acquisitions back then.

00:03:45.816 --> 00:03:49.395
One, the knowledge management one: Parature.

00:03:49.599 --> 00:03:50.802
I don't think that's from Parature.

00:03:50.802 --> 00:03:52.039
That was the product I'm thinking of: Parature.

00:03:52.842 --> 00:04:00.480
I don't remember that being a gaming one, though, but there was another one which got acquired that was more from the marketing side, which had gamification, all of that.

00:04:00.941 --> 00:04:05.682
When I say gaming, parature's biggest client was the gaming industry.

00:04:05.682 --> 00:04:08.843
It was used as support in their products.

00:04:08.843 --> 00:04:33.175
I think there was a number bandied around back then that they had 10 million active users on that Parature product, but it was designed specifically for the gaming industry. Rather than... yeah, I know the gamification came up around the time that the field service product, FieldOne, was purchased, and I think in about that same three to six month period a gamification product was purchased as well.

00:04:33.696 --> 00:04:33.915
Yeah.

00:04:33.935 --> 00:04:35.255
It's a long time ago.

00:04:35.755 --> 00:04:36.557
It was a long time ago.

00:04:36.557 --> 00:04:40.218
Yeah, I would have stumbled on the name, but suddenly it clicked: Parature.

00:04:40.218 --> 00:04:42.725
Yeah, it's not a name we use much anymore.

00:04:42.927 --> 00:04:46.000
Yeah, yeah, yeah, it's so long ago.

00:04:46.000 --> 00:04:47.846
Why have you stuck with the biz apps team for so long?

00:04:47.846 --> 00:04:51.887
Because, you know, I've seen a lot of people come and go in biz apps over the time I've been involved with it.

00:04:51.887 --> 00:04:53.230
Why are you still there?

00:04:53.230 --> 00:04:56.822
I'll give a cliched answer, but also a different answer as well.

00:04:56.822 --> 00:05:04.872
I like the work I do, so that's the cliched answer. The more honest, more ground-reality answer is this:

00:05:04.872 --> 00:05:13.485
The BizApps world keeps evolving as everything around the world changes.

00:05:13.485 --> 00:05:29.000
One of the forerunners driving this change are the enterprises, and BizApps are fundamental for enterprises to drive this change, or these transformations, and being in that, you're never dull.

00:05:29.000 --> 00:05:37.752
You have something exciting to work on every single day and with the new AI thing coming up, it just gets accelerated.

00:05:37.752 --> 00:05:43.232
So that is why I've stuck around: I did not find a reason why I should not.

00:05:43.699 --> 00:05:44.759
Yeah, I love that.

00:05:44.759 --> 00:05:48.305
People often ask me you know, should I do a career in the power platform?

00:05:48.305 --> 00:05:59.185
And I was like, you know, I've been doing it for over 20 years, and if I'd just learned about it today, knowing what I know now, I would be in, boots and all.

00:05:59.185 --> 00:06:03.663
Because you're right, it keeps changing; I feel it doesn't get stale.

00:06:03.663 --> 00:06:11.266
It just... you know, James Phillips, of course, brought a massive amount of transformation to the business and that was an exciting time.

00:06:11.387 --> 00:06:19.752
And you know, I remember at a conference I was at in Florida, and I hit up Ryan Jones, who at that time, you know, owned Dataverse.

00:06:19.752 --> 00:06:25.951
I think it might have been still called CDS back then and I was starting to get into AI heavily then.

00:06:25.951 --> 00:06:31.629
And this is I don't know how long ago it was, but I was just like, what are you going to do in Dataverse for AI?

00:06:31.629 --> 00:06:33.745
Because isn't that where?

00:06:33.745 --> 00:06:37.047
Like, if you look at it from a data, like what will it look like?

00:06:37.047 --> 00:06:39.120
I didn't know, I didn't have ideas.

00:06:39.120 --> 00:06:43.872
I was just like, there must be something epic going to happen in this space.

00:06:43.872 --> 00:06:48.137
And of course now I feel like, oh, hello. Tell us about what you do now with AI and governance.

00:06:49.040 --> 00:06:50.201
So with AI.

00:06:50.201 --> 00:06:53.047
So it is transformational, the new age of AI.

00:06:53.047 --> 00:07:37.447
It's really powerful, and there's a lot of investigation going on to tap in and see what are the various applications of this power that we have, and quite a few of my colleagues are currently working on tapping into this power and figuring out what are the meaningful experiences and productivity gains that we can provide our users, and I and our team are focused on how to do that safely, so that it does not disrupt businesses. And you know very well, just because we have a new technology,

00:07:37.447 --> 00:07:42.529
Not everybody is going to jump in and then say, hey, I'm going to be pioneering this on day one.

00:07:42.529 --> 00:07:49.874
There are multiple levels of maturity within the enterprises and multiple adoption levels as well.

00:07:49.874 --> 00:08:07.189
So the question is how all of these new cutting-edge technologies can be applied across the spectrum of maturity levels and adoption levels, all of this in a safe, compliant manner that is going to yield the benefit that we hope it does.

00:08:07.839 --> 00:08:15.524
Yeah, and that's a massive responsibility, right, because things can go wrong, and so I can understand the need for that.

00:08:15.524 --> 00:08:44.669
Now, when you say acting responsibly, are you using phrases like red teaming as part of how you think about this, or what's your thought process around AI governance? And I suppose what I'm asking here is: do you see that this is going to help administrators, particularly, manage their tenants and environments more efficiently? In other words, reduce their workload around administrative functions?

00:08:44.669 --> 00:08:54.989
Work that needs to be done but isn't creative work, right? It just needs to be done, as we used to say, keeping the lights on: making sure it's optimized, safe, all that type of thing.

00:08:55.870 --> 00:09:04.964
We do multiple things actually, so the example that you gave is one aspect of it, like driving the productivity of one persona.

00:09:04.964 --> 00:09:08.826
The example you gave is from an administration point of view, or an administrator point of view.

00:09:08.826 --> 00:09:16.990
How can AI help me as an administrator to improve my own productivity or improve my day-to-day?

00:09:16.990 --> 00:09:23.089
And as an administrator, that is one gain I can get, and we are venturing to this area gradually.

00:09:23.089 --> 00:09:44.692
We don't have a significant capability yet for administrators to leverage AI for their day-to-day, but the other side of the administration is, as an administrator, I have a significant responsibility within my organization to ensure everything in my business functions properly and efficiently.

00:09:44.692 --> 00:10:06.961
So one of the things that, as an administrator, I need to do is make sure that these new AI features are available to the audience that most requires them, and I also do the necessary checks and balances on these AI features, or whatever the capabilities, not just AI.

00:10:06.961 --> 00:10:21.451
Any capability in general is allowed or not allowed for various reasons within my organization, within my domain or industry, whatever. And we are looking into how we can accelerate those aspects as well.

00:10:21.491 --> 00:10:30.203
From an administration standpoint, how can we get the administrators the right information very quickly so they can make these kinds of decisions?

00:10:30.203 --> 00:10:32.500
They can get educated, they can make decisions quickly.

00:10:32.500 --> 00:10:44.589
They can then scale these decisions across their tenant in a logical way, rather than a haphazard, ad hoc way. And then, once they roll it out or once they enable it...

00:10:44.589 --> 00:10:54.115
They also need to ensure that the decisions that were made days, weeks or months ago are actually yielding the results.

00:10:54.115 --> 00:10:56.889
So that means that adoption of this feature is going great.

00:10:57.460 --> 00:11:02.629
The results, the ROI is there, the value is there and the users are using it the right way.

00:11:02.629 --> 00:11:06.003
There's nothing goofy going on with this beast called AI.

00:11:06.003 --> 00:11:09.870
So, as an admin, I need to understand all of those as well.

00:11:09.870 --> 00:11:17.811
So the other area we're looking into and we're building on is giving this kind of visibility to the admins to see who are all using it.

00:11:17.811 --> 00:11:19.380
What are they using it for?

00:11:19.380 --> 00:11:22.006
What is the ROI for these features?

00:11:22.006 --> 00:11:25.559
Is it really worth the investment we are making with this?

00:11:25.559 --> 00:11:33.706
So it's a gamut of things that we are looking into to give the right capability for the administrators from the governance standpoint.
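The visibility Sid describes here, who is using the AI features and whether the investment is paying off, boils down to aggregating usage events into a simple report. The sketch below is purely illustrative: the event fields, feature names, and report shape are assumptions for the sake of the example, not the actual Power Platform admin telemetry.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class UsageEvent:
    user: str        # who invoked the AI feature
    feature: str     # hypothetical feature name, e.g. "copilot_app_build"
    succeeded: bool  # did the invocation complete usefully?

def adoption_report(events: list[UsageEvent]) -> dict:
    """Summarise AI feature usage for an admin view (illustrative only)."""
    users = {e.user for e in events}
    per_feature = Counter(e.feature for e in events)
    successes = sum(1 for e in events if e.succeeded)
    return {
        "active_users": len(users),
        "invocations_by_feature": dict(per_feature),
        # crude proxy for "is the ROI there": share of invocations that helped
        "success_rate": successes / len(events) if events else 0.0,
    }

events = [
    UsageEvent("alice", "copilot_app_build", True),
    UsageEvent("bob", "copilot_app_build", True),
    UsageEvent("bob", "copilot_flow_author", False),
]
report = adoption_report(events)
```

A real admin dashboard would draw on the platform's own audit and usage data; the point is only that "adoption, ROI, and who's using it" reduces to a handful of aggregations over events.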

00:11:33.725 --> 00:11:44.086
So my question on that was going to be: is what you're doing totally transparent? But you're saying no, you're going to light up stuff in the interface that an administrator can be proactive with.

00:11:44.086 --> 00:11:44.729
Is that right?

00:11:44.729 --> 00:11:53.631
So it's not so much that you're making sure that AI is being used safely; you're actually going to provide some insights to the admin around it.

00:11:53.631 --> 00:12:00.453
And are you looking at it from how is AI being used in the context of Dataverse, or is it broader than Dataverse?

00:12:00.860 --> 00:12:08.453
It is broader than Dataverse; it is across all of the biz apps, or the Business and Industry Copilots organization.

00:12:08.961 --> 00:12:11.629
There are various co-pilots, not just Dataverse.

00:12:11.659 --> 00:12:22.268
Dataverse is one significant copilot platform that we have that powers many of the copilots from the data point of view, but there are numerous other copilots as well, and the generative AI features.

00:12:22.788 --> 00:13:04.816
So this governance needs to be broad and wide enough to give that visibility across the board, not just from one specific copilot's point of view. And the visibility is, again, going to be as transparent as it can get, without including any information that the admin should not even know, like privacy information, information that may not be relevant for the administrator. All of the checks that we have, the responsibility filters that we have, are all functioning, and then we're giving that visibility for them to see that these checks actually work.

00:13:04.816 --> 00:13:07.764
Like, if there is something harmful going on, I should be able to see it.

00:13:07.764 --> 00:13:20.491
If there is something, I would not say illegal, but something that is potentially harmful going on, I should be able to see that, and to what degree I can see it should also be available as part of the administration.

00:13:20.491 --> 00:13:22.966
But again, this is going to be a journey.

00:13:22.966 --> 00:13:31.307
So all of what I said may not be available on day one, but that is the direction we are heading towards, and in the next few months we should be there. So, I might have got something wrong here.

00:13:31.639 --> 00:13:32.964
I thought so.

00:13:32.964 --> 00:13:39.589
When I see governance, right, I think managed environments. And so, do you not sit in the managed environments team?

00:13:39.589 --> 00:13:41.121
Are you in another team

00:13:41.142 --> 00:14:04.285
that's specifically more, what, like the Copilot Studio kind of area? Yeah, well, managed environments is a big area for us. So you are in Jones's org? Yeah. Let me demystify for folks, I don't know if this makes it to the cut, but for folks who are interested: managed environments is not a separate team. So we have what we call Scale.

00:14:04.346 --> 00:14:22.763
As an organization or group, we focus on governance for our platform and, as part of this, managed environments is this horizontal that everybody, on a need basis, needs to build towards, with managed environments being our approach to scale.

00:14:22.763 --> 00:14:27.893
Yes, everyone is actively working on managed environments.

00:14:27.893 --> 00:14:34.224
Everyone is actively working on non-managed environments because, yeah, we cannot have a feature which is managed environments only.

00:14:34.224 --> 00:14:59.592
There should still be ways for customers to be able to do many of the things that are required for their business and, as a feature product manager, I should think about both of these things. I like it. Coming back to the question: I am part of the organization which works heavily on managed environments, but this governance in itself is going to be managed environments plus.

00:14:59.860 --> 00:15:03.326
So a significant portion of this is going to be for all customers.

00:15:03.326 --> 00:15:09.427
There will be some aspects of this that are going to be managed environments. Again, for quick messaging,

00:15:09.427 --> 00:15:11.647
I don't know when this was last said:

00:15:11.647 --> 00:15:15.546
Managed environments is all about more visibility, more control and less effort.

00:15:15.546 --> 00:15:20.203
So anything I can do, I can do faster with managed environments.

00:15:20.203 --> 00:15:24.307
Anything I can see I can see quickly with managed environments, I can see more with managed environments.

00:15:24.307 --> 00:15:32.775
So in that sense, yes, so we do work on both aspects and we apply what goes in there as part of our prioritizations.

00:15:39.899 --> 00:15:43.731
So, if I want to see the AI governance piece, it's going to be in the admin interface? That's where you're going to light up or provide anything for whoever's adminning that system?

00:15:43.751 --> 00:15:44.033
That's correct.

00:15:44.033 --> 00:15:44.595
That's the current plan.

00:15:44.595 --> 00:16:00.923
Yeah, brilliant, that totally makes sense, and I'm glad you've clarified that thinking around managed environments. Because, you know, I had Ryan on a couple of weeks ago, and of course he talked about being in the Scale org, and you know, it's kind of funny how your mind just goes:

00:16:00.923 --> 00:16:01.745
Now, what part of the product is that?

00:16:01.745 --> 00:16:07.528
Rather than, no, it's almost a broader philosophy across a range of things, and I think that's brilliant.

00:16:08.149 --> 00:16:09.354
We work across the board.

00:16:09.354 --> 00:16:11.620
So, yeah, we talk to every single product.

00:16:11.620 --> 00:16:18.623
In fact, we are organizationally part of the Power Apps org and we work on quite a few of the Power Apps governance capabilities.

00:16:19.576 --> 00:16:25.591
But we work with the governance PMs for Power Automate, Copilot Studio and some of Dynamics as well.

00:16:25.591 --> 00:16:28.168
So that way we are horizontal.

00:16:28.168 --> 00:16:30.947
Yeah, you know, whether it's capabilities or the persona.

00:16:30.947 --> 00:16:35.510
We own the persona, but we also build some ourselves.

00:16:36.302 --> 00:16:40.783
So if we are thinking about proactive governance, what does that mean to you?

00:16:40.822 --> 00:16:43.892
There are multiple levels of proactive governance.

00:16:43.892 --> 00:16:55.355
The north star for proactive governance is: I create maybe a policy or something, and that will prevent a situation from happening.

00:16:55.355 --> 00:16:57.400
That is truly proactive.

00:16:57.400 --> 00:17:06.192
For example, one classic example, and I like this because it resonates with many of the customers: the apps that are built.

00:17:06.192 --> 00:17:10.076
Let's say, I built a few apps in Microsoft and I leave the organization.

00:17:10.076 --> 00:17:16.624
Now these apps are ownerless in a sense.

00:17:16.624 --> 00:17:29.270
There may be users for this app and if this app happens to be a business critical app, then all of a sudden it introduces a business risk as I leave the company because if something has to happen to this app, there's no single responsible person who will maintain this.

00:17:29.852 --> 00:17:32.557
So what do we have today?

00:17:32.557 --> 00:17:37.009
We have the CoE Starter Kit that can identify these apps that are ownerless.

00:17:37.009 --> 00:17:44.750
We have the platform advisor that can do this and I can go in and I can reassign these apps.

00:17:44.750 --> 00:17:48.064
I can see the list of apps and I can reassign them to somebody else.

00:17:48.064 --> 00:17:50.230
I can promote some of the co-owners to somebody else.

00:17:50.230 --> 00:17:51.421
I can assign them to managers.

00:17:52.162 --> 00:18:05.008
However, if I want to be truly proactive, then I can set my rules saying that, hey, before an app can become ownerless in the first place,

00:18:05.008 --> 00:18:06.593
I would prevent that from happening.

00:18:06.593 --> 00:18:16.003
So as a product, you should say you need at least a security group or something, which can give better cover, and even with that, the chances of getting into such situations are lower.

00:18:16.003 --> 00:18:17.686
But still there are possibilities.

00:18:17.686 --> 00:18:24.490
So what are my plan A, plan B, plan C? All of this is a plan that I can specify.

00:18:24.490 --> 00:18:31.866
And then I have to just say I'm going to watch for my reports coming in, because I know that this situation is going to be handled when it happens.

00:18:31.866 --> 00:18:37.301
Now all I need to see is, when it happens, I'm alerted and I get a report on it.

00:18:37.301 --> 00:18:41.893
So that, in a sense, is, in my opinion, a good proactive strategy.

00:18:41.893 --> 00:18:43.483
We are not there yet.

00:18:43.483 --> 00:18:44.426
We are working towards that.

00:18:45.009 --> 00:18:45.170
Yeah.

00:18:45.170 --> 00:18:58.303
So then, when we look at the difference between proactive governance and AI governance: proactive governance seems to me to be a lot around rule-based, if this happens, do that; if that happens, do that.

00:18:58.303 --> 00:19:15.130
You know, it's like that if-this-then-that type of scenario. Whereas AI governance, see, once again, when I saw AI governance I was like, you're using AI to govern the platform, but actually I just realized, no, it's: how do you govern AI?

00:19:15.130 --> 00:19:18.317
In context, it is both. Okay, so how do you govern AI in context of the platform? Oh, actually, yes, it is both. Okay.

00:19:18.357 --> 00:19:23.426
So how do you differentiate and how do you think about those two different nuances of the same phrase?

00:19:24.000 --> 00:19:48.983
Yeah. So, using AI to govern: like, for example, you actually brought this up earlier. However awesome we feel about a product, there are mundane tasks to be done, so we want our admins to automate all of these as much as possible, to a level of their comfort, of course, with some visibility and so on.

00:19:48.983 --> 00:19:54.224
So now, all of these automations, all of these mundane tasks, need to be automated.

00:19:54.224 --> 00:19:56.073
There are certain ways you can automate them today.

00:19:56.073 --> 00:20:00.596
Using Power Automate, you can create workflows on top of the admin connectors and so on.

00:20:00.596 --> 00:20:07.705
But an intelligent AI is going to help me further. Like, for example, one of the examples I suggested some time back:

00:20:08.046 --> 00:20:11.347
Let's say, I created a new environment in my tenant.

00:20:11.347 --> 00:20:21.510
Now, as an admin, I may have a checklist, or maybe a whiteboard with some to-dos, for whenever a new environment is created.

00:20:21.510 --> 00:20:53.162
These are the 10 things I need to do. Now, an intelligent AI, what it should be doing is, instead of having the admin look at the whiteboard for this checklist, the AI should be able to generate this checklist based on my existing setup or my existing landscape and say: hey, this is your landscape, you did this activity, this is your next best activity, and these are the further activities that you need to perform. And do you want me to do it? And do you want me to do it every time?

00:20:53.622 --> 00:20:53.843
Yeah.

00:20:54.522 --> 00:20:56.604
So do you want me to do it now for this time?

00:20:56.604 --> 00:20:57.703
Do you want me to do it every time?

00:21:00.505 --> 00:21:09.739
I'm going to trust AI enough that this is a task that I'm going to automate, and then, of course, with some reporting back, I can see what it all did, with the ability to roll back a step or everything else.

00:21:09.739 --> 00:21:18.624
So that is one example of AI assisting with the governance of the tenant.
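The whiteboard-checklist example can be pictured as diffing a desired baseline against what has already been done in the environment: whatever remains becomes the "next best activities" the AI would propose. The checklist items below are invented for illustration; they are not a Microsoft-recommended list.

```python
# Baseline steps an admin might want after every new environment
# (hypothetical list, purely for illustration).
NEW_ENVIRONMENT_CHECKLIST = [
    "assign_security_group",
    "apply_dlp_policy",
    "enable_auditing",
    "configure_backup",
]

def next_best_activities(completed: set[str]) -> list[str]:
    """Return the remaining checklist items, in order, given what's already done."""
    return [step for step in NEW_ENVIRONMENT_CHECKLIST if step not in completed]

# The AI's "this is your landscape, here is your next best activity" moment:
todo = next_best_activities({"assign_security_group"})
```

An agent layered on top of something like this would then ask the admin whether to execute the remaining steps once, or every time a new environment appears.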

00:21:18.624 --> 00:21:36.932
The other aspect of AI governance is what we talked about earlier, where I, as an admin, in addition to doing my tasks, also need to keep an eye on what's happening with my tenant, and with AI being the new major focus for everyone, at least within Microsoft and quite a few of our customers.

00:21:36.932 --> 00:21:43.997
As an admin, am I doing the right things with this new set of things that are coming up?

00:21:43.997 --> 00:21:47.654
Yes, so that is the second aspect of AI governance.

00:21:48.445 --> 00:21:48.866
I like it.

00:21:48.866 --> 00:21:51.452
I love that clarity between the two.

00:21:51.452 --> 00:22:04.019
And one thing you highlighted there with that checklist is very interesting, because I think it's the biggest growth area in the next two to three years in AI, which is really what I distilled down in my mind from that.

00:22:04.019 --> 00:22:21.173
This is an agent, basically. And I see that we've been using AI for the last couple of years, from the gen AI kind of starting point as it's come into the public consciousness, to generate things like video, images, text, that type of thing.

00:22:21.173 --> 00:22:37.553
But we haven't seen a lot of AI to action, right, using AI to go do something. And I think that's such an exciting space, in that it doesn't forget to do things; forgetting is part of human nature.

00:22:37.854 --> 00:23:03.694
And I read years ago this book called The Checklist Manifesto, and one of the case studies was that doctors were often the biggest source of secondary infection in patients, just because of a little thing: they didn't put their gloves on the correct way, because they'd been doing it millions of times, right. And so they introduced this concept of checklists for every single step in prepping to carry out an operation, and secondary infection rates just plummeted.

00:23:03.694 --> 00:23:12.891
And what was found is that the more somebody becomes familiar with the job, the more likely they are to forget something that is actually a critical step.

00:23:12.891 --> 00:23:14.476
That's going to have ramifications.

00:23:14.476 --> 00:23:34.008
What I like about this whole agentification kind of scenario, and what you're having in governance here, is that it's not going to forget those things, and you're going to get a consistent, I think, quality standard start to appear from the repeatability, whether it's that environment deployment, those checks, balances, reporting, et cetera, that's going to come into place.

00:23:34.670 --> 00:23:35.290
Absolutely, yeah.

00:23:35.290 --> 00:23:44.557
So one of the interesting things about this agentic framework that is coming up: I can train the agent, but the agent can also train me, right?

00:23:44.557 --> 00:23:47.173
So the other way is also equally important, right?

00:23:47.173 --> 00:24:07.134
So, as Microsoft, we believe there are some best practices, which probably are true to some degree, but the ground reality, you know very well, the situation is sometimes very different from what Microsoft believes is a best practice to the ground reality of that particular customer.

00:24:07.134 --> 00:24:27.973
So, in that sense, the agent, since we're using this agent term, may have some pre-knowledge with some best practices that can be applicable for the majority of the customers, but for the unique situations, I can train the agent so it can do these tasks for me again.

00:24:27.973 --> 00:24:36.333
So it's more customizable in a sense, and can help me solve these unique situations instead of always having to do these cookie-cutter jobs.

00:24:36.805 --> 00:24:39.974
Yeah, which I think is so important, and it is like human nature, right.

00:24:39.974 --> 00:24:46.989
You go out, you study to become an architect, for example, and you see what good looks like: industry best practice, all those types of things.

00:24:46.989 --> 00:24:47.951
It doesn't mean.

00:24:47.951 --> 00:24:54.491
It's kind of like when you know the rules, you know when you can break the rules, right? But you've got to understand what the rules are, the implications, all that type of stuff.

00:24:54.491 --> 00:24:59.159
And it's the same with, you know, the global learnings.

00:24:59.179 --> 00:25:10.892
What I love about Success by Design, as part of the FastTrack team, and what they have there, is they're looking at what success looked like over, I don't know, 30-odd thousand deployments that have happened, right? So there are patterns that come out of that.

00:25:10.892 --> 00:25:22.634
Now, it doesn't mean every pattern is going to apply to my situation, and that's where the personalization I love comes in.

00:25:22.634 --> 00:25:31.806
The two of those, you know, coming together will, of course, create a personalized experience in the context of best practice, but also understanding the way our organizational business is running and what's pertinent to us. Yeah, it's cool. Yes, it is cool, but again, it's a journey.

00:25:31.945 --> 00:25:34.430
So we have probably taken a quarter of a step.

00:25:34.430 --> 00:25:35.613
We're not even one step in.

00:25:35.613 --> 00:25:39.138
And it's a journey and the potential is there.

00:25:39.138 --> 00:25:52.921
We need to tap into the potential and we need to take the right baby steps so as to not scare people away and to make sure we first crawl safely before we stand up and walk.

00:25:53.505 --> 00:25:54.704
Yeah, that's so right.

00:25:54.704 --> 00:26:11.795
What I like, I suppose, about what you've said today is that marketing at Microsoft is always well ahead of product, and so we've had Copilot, Copilot for the last two years, da-da-da-da-da-da, and I'm like, yeah, there are some good demos in that, but where I'm thinking it should be, it ain't there yet.

00:26:11.795 --> 00:26:14.616
And then also, where are the practical use cases?

00:26:14.616 --> 00:26:25.442
And I think what we've discussed today is some real practical use cases where this AI infusion is going to, you know, make us better, able to do better, safer, that type of thing. Absolutely.

00:26:25.442 --> 00:26:38.395
And then that other part of what you've brought up, you know, the whole concept of crawl, walk, run, is that it's not just about whether you can do it now, it's whether people are with you on that journey.

00:26:38.596 --> 00:26:44.954
And I suppose it was Apple's kind of launch stuff around what they're doing with AI that really drove that home for me.

00:26:44.954 --> 00:26:55.147
I reckon they could be a lot farther ahead than where they were, but they know they've got to bring a consumer audience, which has all different types of beliefs and fears and all that type of stuff.

00:26:55.147 --> 00:26:58.876
You've got to take them on a journey and I think that totally makes sense.

00:26:58.876 --> 00:27:05.402
You know, the way we think about AI in 10 years will be nothing like today, because we've grown with it.

00:27:05.422 --> 00:27:16.454
Yeah, absolutely yeah. One of the things I just told somebody else last week: as the engineering organization, we make these iterative changes week over week.

00:27:19.480 --> 00:27:22.203
So we see this delta over a period of time.

00:27:22.203 --> 00:27:47.676
So when we take a step back and see the future, because we have seen every single moment of change in that, we forget the magnitude of the change for the end users. Somebody who is just watching it says, oh, we need to learn these ten different things, whereas from our point of view it's just like, oh, week over week we're just adding probably a point-one step, and we forget the magnitude of the change for people.

00:27:47.676 --> 00:27:59.711
So that's why it is very important to be iterative, make sure the change is not too much to consume, but also be able to learn and then course-correct along the way by making this incremental progress.

00:27:59.711 --> 00:28:00.434
I like it. Okay.

00:28:01.365 --> 00:28:02.971
Let's put your day job aside.

00:28:03.573 --> 00:28:03.753
Yes.

00:28:04.244 --> 00:28:07.593
As we close now, because this is my last podcast before Vegas.

00:28:07.593 --> 00:28:11.085
I won't do another recording until next month, so I'll see you in Vegas.

00:28:11.085 --> 00:28:26.637
But I always like to ask people that are working in the area of AI for business, and you know this is not Microsoft's opinions or anything like that, but how are you thinking about AI and perhaps practical applications to your day-to-day life?

00:28:26.637 --> 00:28:34.275
What excites you about it? Because, you know, you have a certain lens, being at the coalface of a lot of the innovation that's happening in the space.

00:28:34.275 --> 00:28:44.290
What are your thoughts, and I'm not really asking for philosophical thoughts, but more practical: how do you see it enhancing your life in the next 18 months?

00:28:44.873 --> 00:28:45.796
Oh, that's a wonderful question.

00:28:45.796 --> 00:28:50.432
I have not sat down and thought about this at any time, at least in the last year, for sure.

00:28:50.432 --> 00:29:05.960
One thing I'm super excited about, at least from the AI point of view, is how much it is going to improve the learnability, or learning capability, of people.

00:29:05.960 --> 00:29:13.765
I already see a plethora of apps which are now competing with Duolingo.

00:29:13.765 --> 00:29:16.012
I don't know if you're familiar with Duolingo. Yeah, yeah, yeah.

00:29:16.507 --> 00:29:17.711
For people who are not familiar.

00:29:17.711 --> 00:29:27.294
It's a language-learning app, and there are now so many of these AI-first apps where it is like a conversation.

00:29:27.294 --> 00:29:31.472
I'm conversing in my own tone and the AI is helping me.

00:29:31.472 --> 00:29:39.656
That is one application of AI which I'm super psyched about, because I'm trying to learn languages and I'm not super happy with the progress I'm making.

00:29:39.656 --> 00:29:42.384
I'm looking at that as a shy person.

00:29:42.384 --> 00:29:47.297
I don't talk to the native speakers, although we have native speakers in our own team.

00:29:47.297 --> 00:29:58.486
I don't speak beyond introductions, but I would love to; because of the shyness, or whatever reason, I cannot converse. Now, with this AI,

00:29:58.486 --> 00:30:00.410
I don't have to restrain myself.

00:30:00.410 --> 00:30:01.250
I can get it wrong.

00:30:01.250 --> 00:30:08.653
I can be very, very wrong and not have to worry about whether I made sense or whether I did something wrong.

00:30:09.034 --> 00:30:13.369
That is one application of the learnability or the learning capability of a person.

00:30:13.369 --> 00:30:14.932
There are numerous.

00:30:14.932 --> 00:30:23.204
I was blown away when I first saw the Khan Academy video. Khan Academy's AI application, again, is amazing, amazing. Totally.

00:30:23.204 --> 00:30:24.570
That's again one aspect of it.

00:30:24.570 --> 00:30:35.112
I want to see where the world goes with respect to millions of these books, how humanity can digest this information in a much more rapid way, and how AI can bring that about.

00:30:35.112 --> 00:30:46.953
The other application I believe is probably going to benefit from AI is this entire governance aspect of government, the governance of the general citizenry, right?

00:30:46.953 --> 00:30:54.392
Yes. I know governments move a little slower than the corporates, the enterprises.

00:30:54.392 --> 00:31:04.386
But when they get to leverage this AI for the governance aspect of it, I'm super pumped about how much radical change that is going to drive in society as a whole.

00:31:04.827 --> 00:31:06.592
Yeah, amazing, I love it.

00:31:06.592 --> 00:31:07.755
Love Khan Academy.

00:31:07.755 --> 00:31:15.349
I've got my four-year-old and two-year-old on the Khan Academy app because, you know, there's amazing stuff there, and you're so right about learning.

00:31:15.349 --> 00:31:17.204
And then all that language translation.

00:31:17.204 --> 00:31:34.575
It's going to create such a rich nuance in communication across cultures, right, that you've not been able to have, because you've not had a common language, or you've only had the rudimentary translators that we've had in the past.

00:31:34.724 --> 00:31:48.694
I do think, you know, these are absolutely exciting times. There was a video by one of the researchers quite some time back where a person is talking in English and, in real time, it is translated into Mandarin.

00:31:48.694 --> 00:31:51.452
Yeah, and that never got productized.

00:31:51.452 --> 00:31:55.684
It was a very interesting demo, yeah, but never got productized.

00:31:55.684 --> 00:32:12.265
I think now is the time we can make those kinds of things real, where, in real time, if you understand a different language, I can talk to you in my native language and you hear the language you want to hear, without any latency, and vice versa.

00:32:12.265 --> 00:32:16.953
So that way the language barriers are going to be broken.

00:32:17.596 --> 00:32:24.037
I know we are probably looking at more than 20 years from now for that to happen, but again, the potential is great.

00:32:24.037 --> 00:32:25.909
I think we can, you and I.

00:32:25.909 --> 00:32:26.430
We are in tech.

00:32:26.430 --> 00:32:41.006
So we see these changes happening on an incremental basis, but as an outsider, somebody of my parents' generation or somebody else who is not so much into tech, when they see this change after a couple of years, they're going to be blown away. It's like a generational shift for them.

00:32:41.426 --> 00:32:42.650
Yeah, totally Sid.

00:32:42.650 --> 00:32:44.034
Thank you so much for coming on the show.

00:32:44.034 --> 00:32:46.650
It's been so much fun talking to you, learning from you.

00:32:46.650 --> 00:32:48.174
I look forward to seeing you in Vegas shortly.

00:32:48.786 --> 00:32:49.910
Thanks, so much for having me on.

00:32:49.910 --> 00:32:50.792
Yes, we'll see you in Vegas.

00:32:56.865 --> 00:32:57.660
Hey, thanks for listening.

00:32:57.660 --> 00:32:59.113
I'm your host, Mark Smith, otherwise known as the NZ365 guy.

00:32:59.113 --> 00:33:00.147
Is there a guest you would like to see on the show from Microsoft?

00:33:00.147 --> 00:33:02.596
Please message me on LinkedIn and I'll see what I can do.

00:33:02.596 --> 00:33:09.071
Final question for you: how will you create with Copilot today? Ka kite.