
Data, Ethics, and the Future of AI in Europe



FULL SHOW NOTES
https://podcast.nz365guy.com/618  

Chris Huntingford joins us on the Ecosystem Show to dissect the rapidly evolving landscape of AI regulation in Europe, focusing on the anticipated launch of DSBs in February 2025. We promise you a deep understanding of the implications for high-risk AI solutions and how EU scrutiny might reshape the industry. From distinguishing between rules-based and autonomous AI agents to exploring Microsoft's latest offerings, we shine a light on AI's integration into the workforce and the crucial need for reliable data to avoid unintended consequences. Our discussion isn't just about technology; it's a thoughtful examination of AI's role in transforming how we work and live.

As we turn our attention to the future of Power Automate, we envision a world where intelligent agents operate in the background, enhancing productivity while maintaining the essential human touch. Ethical concerns surrounding digital equity and the traditional economic model challenge us to consider a more inclusive future. We also explore a future where humanoid robots efficiently maintain urban spaces, raising important questions about manufacturing and control. Join us as we emphasize the importance of embracing digital equity and adapting to technological change, while regulatory frameworks like the EU's AI regulations guide our global trajectory. This episode is a compelling blend of caution and opportunity, urging you to ride the wave of innovation wisely.

90 Day Mentoring Challenge: use code MBAP at checkout for 10% off. https://ako.nz365guy.com

Support the show

If you want to get in touch with me, you can message me here on LinkedIn.

Thanks for listening 🚀 - Mark Smith

Chapters

00:01 - Exploring AI and Autonomous Agents

09:35 - Future of Power Automate and Agents

17:51 - Navigating the Future of AI

Transcript
WEBVTT

00:00:01.080 --> 00:00:02.685
Welcome to the Ecosystem Show.

00:00:02.685 --> 00:00:05.160
We're thrilled to have you with us here.

00:00:05.160 --> 00:00:11.153
We challenge traditional mindsets and explore innovative approaches to maximizing the value of your software estate.

00:00:11.153 --> 00:00:13.523
We don't expect you to agree with everything.

00:00:13.523 --> 00:00:17.472
Challenge us, share your thoughts and let's grow together.

00:00:17.472 --> 00:00:19.525
Now let's dive in.

00:00:19.525 --> 00:00:20.548
It's showtime.

00:00:20.548 --> 00:00:22.864
Welcome back, welcome back.

00:00:22.864 --> 00:00:25.231
As you can see, it's no longer just me this time.

00:00:25.231 --> 00:00:31.152
I'm back with Mr Chris Huntingford in the house.

00:00:31.152 --> 00:00:45.527
Chris, I was meant to actually tell you, before we started the show, a bit of news offline, and I don't know whether I can say it now because it was a private message to me, literally just before you talked to me.

00:00:45.527 --> 00:00:47.847
But here's what I'm gonna do:

00:00:47.948 --> 00:01:01.026
At the end of this recording, I need to tell you what the message is, okay? This always scares me because, with you, if you're excited about something, it's either sketchy or hilarious or absolutely amazing. It's amazing.

00:01:01.686 --> 00:01:06.375
It involves Dynamics Minds and an absolutely epic speaker.

00:01:06.375 --> 00:01:13.884
If you could get an epic speaker out of Microsoft, I've been given three names and said which one would you go for, and one's like a no-brainer.

00:01:14.605 --> 00:01:15.487
Okay, I'm keen.

00:01:15.487 --> 00:01:16.370
I would love to know.

00:01:16.992 --> 00:01:18.802
Yeah, anyhow, that aside.

00:01:18.802 --> 00:01:30.591
That aside, sir, I just feel like the world's exploding at the moment around AI, and I mean I'm, for want of a better term, balls deep in it.

00:01:30.591 --> 00:01:31.182
It seems.

00:01:31.182 --> 00:01:45.448
At the moment, everything is about AI and what I'm doing, and I know that in Europe there's some big changes coming in this space.

00:01:45.448 --> 00:01:46.210
What do you know?

00:01:47.500 --> 00:01:53.810
So I am going to give a shout-out to Ioana; we should include her in the show notes.

00:01:53.810 --> 00:01:56.487
So, Ioana Tanasa, I hope I said her surname correctly.

00:01:56.487 --> 00:02:01.751
She is doing a lot of stuff with Microsoft on the DSB.

00:02:01.751 --> 00:02:06.727
Now, if you don't know what a DSB is, I'm not going to tell you, right, because this is going to be another episode.

00:02:06.727 --> 00:02:18.508
But the DSB is really important to them as it is one of the central mechanisms for the shipping room to allow a solution to go through into the public.

00:02:18.508 --> 00:02:26.250
So, anyway, what I found out is that this kicks off in Feb 25, right.

00:02:26.250 --> 00:02:29.569
And what's really interesting is that they're not playing games.

00:02:29.569 --> 00:02:31.145
This is not GDPR, right?

00:02:31.145 --> 00:02:34.289
This is not like, oh okay, you are kind of okay with your data.

00:02:34.289 --> 00:02:46.521
What I understood from the notes that were fired over to me is that a public-facing AI solution can actually get hauled out by people in the EU for testing, especially high-risk AI solutions.

00:02:47.544 --> 00:02:53.141
So imagine this: you create a, I'm going to call it a bot, because I don't think it's a copilot.

00:02:53.141 --> 00:02:54.265
You create a bot.

00:02:54.265 --> 00:02:56.551
The bot is both deterministic and non-deterministic.

00:02:56.551 --> 00:03:09.306
So you're looking at a solution that has both defined outputs and non-defined generative outputs, and you don't ground that solution in good data, like the Air Canada story where it was giving away free flights.

00:03:09.306 --> 00:03:11.091
So you don't ground that solution in good data.

00:03:11.091 --> 00:03:14.507
You build your own SLM/LLM.

00:03:14.507 --> 00:03:19.224
You don't wrap RAI or responsible AI standards around it, and then the bot starts going a little rogue.

00:03:19.344 --> 00:03:20.046
Right, like it doesn't.

00:03:20.046 --> 00:03:23.084
It doesn't go rogue, it just, remember, it is generative.

00:03:23.084 --> 00:03:28.402
It is like a very, very enthusiastic intern who has no ethics training.

00:03:28.402 --> 00:03:32.652
So it starts answering questions because, remember, all it does is vector matching.

00:03:32.652 --> 00:03:34.705
It's just giving you the next word in the sentence.

00:03:34.705 --> 00:03:36.290
It doesn't know.

00:03:36.290 --> 00:03:42.427
And what will happen is that that gets popped up and, as an example, if I say how do I steal a car?

00:03:42.427 --> 00:03:45.320
And then I go and steal a car, who is liable?

00:03:45.320 --> 00:03:48.603
Is it the customer whose things on the flight?

00:03:48.603 --> 00:03:49.984
Like what are the rules right?

00:03:49.984 --> 00:04:03.835
So we're starting to find that actually there's going to be some focus that needs to be had around how people build these things out and, as I said, if it is a high risk solution dealing in certain data, you can actually get hauled out by the EU for it.

00:04:03.835 --> 00:04:07.837
I don't know if there's like an EU hit squad or whatever, but what I do know is that it can happen.

00:04:08.419 --> 00:04:10.824
Yeah, well, with the what was it?

00:04:10.824 --> 00:04:26.810
7% of revenue is one of the fines, 30 mil or 7%. You could see how it could become a nice revenue generator, right, as in to put a police force in to track this stuff down, so to speak.

00:04:27.379 --> 00:04:37.791
Yeah, but AI is taking our jobs, Mark. Ha, honestly, I laugh in a way, and then in a way I'm like, for me...

00:04:37.791 --> 00:04:51.552
In the last week it's become crystal clear to me, the further I've drilled into the concepts of agentic AI.

00:04:51.552 --> 00:04:58.649
Oh yeah, and you know, Microsoft, of course, have made a move in public around agents.

00:04:58.649 --> 00:05:16.033
And when I look at the agents that Microsoft is touting, and I gave an example from McKinsey this week that's dropped, I'm like, that's the baby rattle of agents.

00:05:16.473 --> 00:05:34.427
It's the little toy of agents, and they're kind of excited about it and they're calling it autonomous, and I'm a bit like, okay, that's a stretch, right, it's a massive stretch, because in the research I've done, agents come in three different formats.

00:05:34.427 --> 00:05:43.149
You have a rules-based agent, which I feel is what this is, which ultimately, if you distill it down, is an if-then type agent.

00:05:43.509 --> 00:05:43.831
That's all.

00:05:43.831 --> 00:05:44.291
It is, dude.

00:05:44.291 --> 00:05:48.055
We're not close to actual autonomous agents.

00:05:48.055 --> 00:05:48.536
It's already.

00:05:48.536 --> 00:05:51.990
You know what it is? It's Power Automate with a text front end.

00:05:52.779 --> 00:05:53.884
Yeah, yep, yep.

00:05:53.884 --> 00:05:57.721
So you can say, do this, and it'll build out, which...

00:05:57.721 --> 00:06:09.266
I'm glad you said that, because I did a post last night, a video post, on this and what I see coming like a freight train towards us around Power Automate in a moment.

00:06:09.266 --> 00:06:12.589
So the second type of course is semi-autonomous.

00:06:12.589 --> 00:06:25.987
So this here requires a human in the loop, always as part of it, and it's not so much a human in the loop from a compliance perspective, but it needs human input.

00:06:26.427 --> 00:06:29.449
Right, it needs human input in some form, some way.

00:06:29.449 --> 00:06:33.591
It might be just a very final step, but it needs a nod.

00:06:33.591 --> 00:06:35.706
And then you've got autonomous.

00:06:35.706 --> 00:06:42.771
And if you look at autonomous, the definition of it or agentic, the definition of it is that it is self-learning.

00:06:42.771 --> 00:06:56.595
It will have parameters that it needs to operate in, but it will learn from errors, mistakes, unique nuances, and it will become more...

00:06:56.595 --> 00:06:59.548
In other words, it'll build its knowledge over time.

00:07:00.211 --> 00:07:05.848
So imagine, a call center gets a very unique request and that wasn't in the help or manual.

00:07:05.848 --> 00:07:15.468
It goes, okay, let's solve that for the future, let's get that help built out, let's do whatever needs to be done, and so, therefore, it becomes more intelligent over time.
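(To make the three agent formats concrete, here is a minimal illustrative sketch in Python; it is not from the episode, and every name, rule and helper in it is invented purely for illustration.)

# Illustrative sketch only: three toy "agent" styles the hosts describe, rules-based
# (if-then), semi-autonomous (human in the loop), and autonomous (keeps learning
# from cases it could not handle).

def rules_based_agent(request: str) -> str:
    """If-then logic only: every behaviour is spelled out in advance."""
    rules = {
        "reset password": "Send the self-service reset link.",
        "refund": "Route to the billing queue.",
    }
    for keyword, action in rules.items():
        if keyword in request.lower():
            return action
    return "Escalate to a human."

def semi_autonomous_agent(request: str, approve) -> str:
    """Drafts an answer, but a human must give the final nod."""
    draft = f"Proposed reply to: {request!r}"
    return draft if approve(draft) else "Held back pending human review."

class AutonomousAgent:
    """Starts from a knowledge base (think: a validated SOP) and grows it
    whenever it meets a request it has never seen before."""

    def __init__(self, knowledge: dict[str, str]):
        self.knowledge = dict(knowledge)

    def handle(self, request: str) -> str:
        if request in self.knowledge:
            return self.knowledge[request]
        answer = f"Resolved new case: {request}"   # a real agent would call a model here
        self.knowledge[request] = answer           # builds its knowledge over time
        return answer

if __name__ == "__main__":
    print(rules_based_agent("Please refund my order"))
    print(semi_autonomous_agent("Cancel my contract", approve=lambda d: True))
    agent = AutonomousAgent({"lost parcel": "Open a carrier investigation."})
    print(agent.handle("unique request not in the manual"))
    print(agent.knowledge)  # the new case is now part of its knowledge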

00:07:15.468 --> 00:07:37.026
What I see, and this is one of the goals that I'm aiming for myself, is how do I introduce agents into every part of my life, and how do I do it without waiting until mid-2025 or 2026, when this is just going to be everything that is going on in business?

00:07:37.026 --> 00:08:03.485
And so what I've started doing is writing standard operating procedures, down to the detail, for a thing that I do, right, and then what I am doing is handing that SOP off to a VA, a virtual assistant, so a real, live human person in another country, and for the next three months.

00:08:03.485 --> 00:08:06.271
They are running that every single day.

00:08:07.052 --> 00:08:14.922
They have to, and, at the end of the day, they tell me where they have updated the SOP, why they made the change and why they improved it.

00:08:14.922 --> 00:08:21.033
Now, in six months' time, let's say, agents are now at that point.

00:08:21.033 --> 00:08:33.471
I'm going to take that SOP, which has been tried and tested and validated with humans, and I'm going to feed it in, and all of a sudden, an agent's going to do that.

00:08:33.471 --> 00:08:44.812
And then OpenAI brought out the concept of a swarm API last week, and so I can say okay, agent one, now multiply yourself by 100.

00:08:44.812 --> 00:08:47.207
And I want you to all run in parallel.
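(Again purely illustrative and not from the episode: a minimal Python sketch of the "multiply yourself by 100 and run in parallel" idea using asyncio. The agent body is a stub standing in for whatever agent framework or model API you would actually call.)

# Illustrative sketch only: fan the same SOP out to many agent copies and run them concurrently.
import asyncio

async def run_agent(agent_id: int, sop: str) -> str:
    """One agent instance working through the same standard operating procedure."""
    await asyncio.sleep(0.01)          # stands in for real work (API calls, I/O)
    return f"agent {agent_id}: completed '{sop}'"

async def run_swarm(sop: str, copies: int = 100) -> list[str]:
    """Create N copies of the agent task and gather their results in parallel."""
    tasks = [run_agent(i, sop) for i in range(copies)]
    return await asyncio.gather(*tasks)

if __name__ == "__main__":
    results = asyncio.run(run_swarm("daily publishing checklist", copies=100))
    print(len(results), "agents finished;", results[0])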

00:08:47.940 --> 00:09:02.721
Do you remember that drawing that I did for you a few months ago when I was at Microsoft and I said to you this is what it's going to look like right, with the little colorful people and agents behaving badly, if you get the standard operating procedures right.

00:09:02.721 --> 00:09:09.144
So that agentic rules layer that we drew in the middle, I remember, had all the agents, but I'll see if we can put it in the chat for this episode.

00:09:09.144 --> 00:09:12.164
Yes, yes, yeah, that agentic rules layer.

00:09:12.164 --> 00:09:15.326
And I said this is the thing that people are going to need to invent.

00:09:15.326 --> 00:09:17.427
That's effectively your standard operating procedures.

00:09:17.427 --> 00:09:23.048
Yeah, but you digitize that and you make it the thing that governs the agents.

00:09:23.048 --> 00:09:25.110
So it's like an agent that governs agents.

00:09:25.809 --> 00:09:33.652
So, yeah, an orchestration agent is going to be needed as well, right, as in, it will be overseeing all those agents running.
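(One more illustrative sketch, with invented names and rules: an orchestration agent that governs worker agents by checking every proposed action against a digitised SOP, the "agentic rules layer" described above.)

# Illustrative sketch only: an agent that governs other agents by validating each
# proposed action against a digitised standard operating procedure.
SOP_RULES = {
    "send_email": {"requires_human_approval": False},
    "issue_refund": {"requires_human_approval": True},   # needs a human nod
}

def orchestrate(worker_actions: list, approve) -> list:
    """Oversee worker agents: allow, pause for approval, or block each action."""
    outcomes = []
    for action in worker_actions:
        rule = SOP_RULES.get(action["name"])
        if rule is None:
            outcomes.append(f"BLOCKED {action['name']}: not in the SOP")
        elif rule["requires_human_approval"] and not approve(action):
            outcomes.append(f"HELD {action['name']}: awaiting human approval")
        else:
            outcomes.append(f"RAN {action['name']} for {action['target']}")
    return outcomes

if __name__ == "__main__":
    proposed = [
        {"name": "send_email", "target": "customer-42"},
        {"name": "issue_refund", "target": "order-17"},
        {"name": "delete_database", "target": "prod"},   # nothing in the SOP allows this
    ]
    for line in orchestrate(proposed, approve=lambda a: False):
        print(line)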

00:09:33.652 --> 00:09:35.753
So I'm quite excited about what that means.

00:09:35.753 --> 00:09:37.734
But just back on the Power Automate.

00:09:37.734 --> 00:09:42.695
So I said in the post yesterday that I believe Power Automate is going to explode.

00:09:42.695 --> 00:09:51.738
Yeah, but for everybody looking at going, oh God, I'm going to build a career around Power Automate: in five years' time, I don't think we'll ever really talk about Power Automate.

00:09:51.738 --> 00:09:56.740
It will be a tool that runs in the background and it will be fully automated.

00:09:56.740 --> 00:10:03.073
All the automations will be built out by an agent that will act on our behalf.

00:10:03.073 --> 00:10:04.988
So I don't see Power Automate going away.

00:10:05.681 --> 00:10:11.532
I see it becoming hyperscale, but I don't see it being you and I building out automations.

00:10:11.532 --> 00:10:14.349
I see our agents who build out our automations on our behalf.

00:10:15.120 --> 00:10:17.789
Yeah, we'll need to contextualize the automations.

00:10:19.280 --> 00:10:23.265
So there'll be the concept of observability we know that and then there'll be the concept of context.

00:10:23.486 --> 00:10:32.107
Right, the context of contextualized data and contextualized Power Automate is really where I think it's going to be at, and it kind of does it now to an extent.

00:10:32.107 --> 00:10:40.826
So that's where, if you think about the history of what happened, right, you had these little templates that you can use in Power Automate, and then you go generate a template and provide context.

00:10:40.826 --> 00:10:45.010
Now you can use Copilot to generate a Power Automate, but you still have to provide context.

00:10:45.010 --> 00:10:52.405
You can have an agent that generates many chained Power Automate flows, but you still need to provide context.

00:10:52.405 --> 00:11:17.327
And I feel like, you know, Lewis Baybutt, actually, Lewis and I have been speaking about this a lot, the power of contextualized information. And when I say information, it could be data, metadata, whatever you want to call it, right? Yeah, that's going to be where we become really important, because although the copilot or the agent may have some context, we have all the context. Yeah, and I feel like that's what's going to make it different.

00:11:18.070 --> 00:11:27.206
Like your USP is being a human with context. Yeah, yeah, and I think the human element is going to be ever increasingly important.

00:11:27.206 --> 00:11:50.514
But one of the other nuances I've worked out and it's something that I riffed off what Satya said in London, and it was this that you'll have personal agents, you'll have team agents, you'll have company agents and you'll have industry agents, and each will be endowed with, you know, applicability.

00:11:50.514 --> 00:11:58.025
But here's the thing is that let's take that swarm scenario that all of a sudden I've got 100 agents doing this one thing for me.

00:11:58.025 --> 00:12:26.466
I can see businesses, and it goes back to a podcast that we probably had nine or so months ago where I said the capitalist concept needs to evolve because if it's just the corporation that benefits off a new tool that means we can pay workers less or reduce their hours and therefore make the profits off that, that's not going to be fair and equitable.

00:12:26.485 --> 00:12:51.711
Going forward, there needs to be a model where, if you can get the same output from a human but at 50% of the input time of that human, you should still compensate them for the full 100% and allow them the freedom to spend and do their time elsewhere. Which is a real funny thing, because what I'm noticing in the enterprise deals that have been done in the market...

00:12:51.711 --> 00:12:58.113
The term productivity is not a selling point for AI, right?

00:12:58.113 --> 00:13:12.489
Because one of the comments I heard from a CXO was this: so this person shaves 45 minutes off their day because they're using AI to do it; does that mean they're going to spend another 45 minutes around the water cooler in the office, right?

00:13:12.668 --> 00:13:13.130
Yeah.

00:13:13.660 --> 00:13:23.104
It's not benefiting me as an organization. Which, I can see it, but I'm also like, that's the capitalist talking right there, right? That's the, oh...

00:13:23.104 --> 00:13:27.607
We need to suck up all the money that we can make off our human resources as possible.

00:13:27.607 --> 00:13:52.316
Now this tells me that, in the future state world, if you can be a freelancer or independent consultant and you can understand agents and scale them for yourself, could I decide to reduce my work week to 20 hours but produce 40 hours of output?

00:13:52.316 --> 00:13:57.866
And when I mean output, I mean let's just say 40 hours of revenue equivalence.

00:13:58.389 --> 00:13:58.809
Yeah, you can.

00:13:58.809 --> 00:14:00.365
I know somebody that does it already.

00:14:01.181 --> 00:14:05.626
But then could I produce 500 hours of revenue equivalents a week?

00:14:05.947 --> 00:14:12.663
This is where the ethics slash blah, blah discussion comes in, and I think, okay.

00:14:12.663 --> 00:14:18.764
So somebody said to me the other day can I, can I generate a virtual staff force that gets paid?

00:14:18.764 --> 00:14:26.836
And I was like, well, ethically it's probably not a good idea because you're not going to receive that many paychecks.

00:14:26.836 --> 00:14:29.163
But it goes back to the story of the boats in the dock.

00:14:29.163 --> 00:14:32.712
Right, like they couldn't get this boat out of the dock.

00:14:32.712 --> 00:14:33.721
There was something wrong.

00:14:33.721 --> 00:14:35.888
They sent many consultants in.

00:14:35.888 --> 00:14:36.850
Nobody could fix it.

00:14:37.351 --> 00:14:38.041
They called Bob.

00:14:38.041 --> 00:14:46.470
He walked in with a spanner, whacked it twice, billed them for a hundred grand and they said but you only spent one minute doing it.

00:14:46.470 --> 00:14:48.544
And he said, well, but I have the expertise.

00:14:48.544 --> 00:14:51.107
And this is where I think you need to get smart right.

00:14:51.107 --> 00:14:55.660
I think that I actually did this to an extent when I was at Microsoft, like I automated my role there.

00:14:55.660 --> 00:14:56.981
Yeah, I really did.

00:14:56.981 --> 00:15:01.947
I did hackathons all the time because I basically automated my washing my face and brushing my teeth process.

00:15:01.947 --> 00:15:10.975
And I think if you can, as a smart person, automate the generic generative stuff, and if other people haven't thought to do it, that's their problem.

00:15:11.635 --> 00:15:12.356
Here's the problem.

00:15:12.356 --> 00:15:35.652
It shouldn't be just their problem, because what we're going to do is create a massive divide; the concept of knowledge as power is going to become even more of a thing, and you think there's a lot of people in the world that haven't had the digital exposure that we've had, and therefore, are we just like, oh well, you're screwed, like we can't do that?

00:15:35.652 --> 00:15:40.626
Right, we can't, as a human race, we can't take that stance.

00:15:40.626 --> 00:15:49.148
And of course, I can already hear the people going, oh, you're a commie or, uh, you're a socialist, in that type of thinking.

00:15:49.148 --> 00:16:03.192
But I tell you what, those people that say that will be bawling their eyes out when they realize it's going to sweep them away as well, the wave that is coming, and their ability to earn.

00:16:03.192 --> 00:16:05.745
Because you know, do you know what I don't think?

00:16:05.745 --> 00:16:06.868
AI is the big deal.

00:16:06.868 --> 00:16:27.032
It's not; robotics is the big deal, because, you think, if we all of a sudden have a robotic workforce, which is human labor, and I'm talking about a humanoid robot, and they get produced for, let's say, ten thousand dollars, you're not going to buy two or three of those, right? You're not going to, like...

00:16:27.032 --> 00:16:36.524
At the moment we go, I, when I open my browser, I have six tabs open every time and there's six different LLMs by different providers.

00:16:36.524 --> 00:16:37.765
Blah, blah, blah and I.

00:16:37.765 --> 00:16:47.265
I can switch because the cost of the premium access to them is negligible in my daily cost operating model.

00:16:47.265 --> 00:16:48.769
So I'm like I own them all.

00:16:48.769 --> 00:16:57.147
You're not going to own five different robots by five different providers, right, you're going to choose one and you're going to hold onto it for five to 10 years potentially.

00:16:57.147 --> 00:17:00.442
But I tell you what that robot is.

00:17:00.442 --> 00:17:23.354
Literally so much of the physical labor stuff in the world that we do is going to be able to be done by it, and therefore you just think of the implications. If you live in, I think of West Hampstead, where I used to live in London, and all the things that involve physical labor in West Hampstead: the rubbish truck goes down the road, the inspectors, all that kind of stuff.

00:17:23.500 --> 00:17:37.480
All of a sudden that all becomes robotic humanoids and they are tasked with, let's say, not picking up the rubbish on the street or not collecting the garbage in the garbage trucks.

00:17:37.480 --> 00:17:44.282
They're targeted with making sure the street stays immaculate, so when they knock over that rubbish bin.

00:17:44.282 --> 00:17:46.810
They don't leave it on the ground like you'd currently see now.

00:17:46.810 --> 00:17:48.546
They pick up every dot.

00:17:48.546 --> 00:17:51.035
They notice, you know...

00:17:51.035 --> 00:17:58.672
Their infrared scanner picks up that the street sign is about to fail because the bolt's out on it.

00:17:58.672 --> 00:17:59.834
And guess what?

00:17:59.834 --> 00:18:03.646
They automatically create a work order for somebody to come through after hours and fix that.

00:18:03.646 --> 00:18:08.728
They see that bit of graffiti and they've got the device to go clean that graffiti on the spot.

00:18:08.728 --> 00:18:18.929
Like all of a sudden you're in a different world, right of what is possible, and I think it's just you know.

00:18:18.929 --> 00:18:21.747
But then here's the other thing: who's going to create all those robots?

00:18:23.123 --> 00:18:24.247
Yeah, who's going to make the robots man?

00:18:24.539 --> 00:18:27.410
Is it going to be an American company or is it going to be a Chinese company?

00:18:27.410 --> 00:18:43.116
And then, if you had a labor force of, let's say, 1,000, let's say a million robots, is there going to be a concern about who the manufacturer is?

00:18:43.116 --> 00:18:50.724
And do they have a nefarious program instruction that could be onboarded to that robot force that is now in a foreign country?

00:18:50.724 --> 00:18:52.329
I don't know man.

00:18:52.329 --> 00:18:56.164
It raises some crazy stuff right when you talk about red teaming.

00:18:56.686 --> 00:18:59.326
Yeah, it's right there, but I don't know.

00:18:59.326 --> 00:19:05.894
This whole thing right now is moving at an incredible pace.

00:19:05.894 --> 00:19:10.108
Somebody asked me recently, actually asked me today: are you scared of it?

00:19:10.108 --> 00:19:13.983
I said, well, no, I'm not, because there's just.

00:19:13.983 --> 00:19:17.785
You know what, dude? Think of it like this, man.

00:19:17.785 --> 00:19:20.002
Media has done this, not anything else.

00:19:21.705 --> 00:19:24.752
It's just media has made this way more sensational than it really is.

00:19:24.752 --> 00:19:38.094
I mean, I agree AI is a little scary and the products of AI will be a little scary, but I think ultimately, it's just taking it in our stride man and remembering digital equity, remembering that we need to be looking after.

00:19:38.094 --> 00:19:41.510
I love what Trevor Noah said at the Power Platform Conference.

00:19:41.510 --> 00:19:44.628
He's like stop protecting jobs and start protecting people.

00:19:45.151 --> 00:19:47.607
Yeah, it's a different paradigm shift, isn't it?

00:19:48.079 --> 00:19:48.260
Mate.

00:19:48.260 --> 00:19:49.266
This is the thing.

00:19:49.266 --> 00:19:55.489
So I said on Twitter one and a half years ago.

00:19:55.489 --> 00:19:59.515
I said, if I was you all, I would start learning Power Virtual Agents.

00:19:59.515 --> 00:20:08.449
This is me telling you that this is going to happen, and I've gone to companies I've worked with and I'm like, I'm telling you that this is what you need to learn.

00:20:08.449 --> 00:20:15.528
These are the things you need to understand and, unfortunately, those who haven't listened are now scrambling, yeah.

00:20:15.567 --> 00:20:16.250
I did my best.

00:20:16.250 --> 00:20:17.211
I told them what to do.

00:20:17.211 --> 00:20:18.580
I gave them the information I had.

00:20:19.682 --> 00:20:32.595
And although, even, I'll bet the imposter syndrome was pretty big, going, oh shit, like, am I saying the right thing? Turns out yes, right. But I think that there's a very big gap between digital inequity and laziness.

00:20:33.095 --> 00:20:33.336
Yeah.

00:20:33.915 --> 00:20:47.548
Interesting, and refusing to understand or acknowledge what's happening in what's going on around us, whether it be robots or AI or steam cars turning into petrol cars.

00:20:47.548 --> 00:20:48.590
Right, like I don't to me.

00:20:48.590 --> 00:20:52.506
I think designing the defaults of the next bunch of people is up to us.

00:20:52.506 --> 00:21:08.321
Right, and all we have to do is educate, but if people won't listen, man, I'm not gonna sit there and bang my head against the rock. Like one of the orgs that I work with, and one of the people specifically in this organization that I work with, and you know the arrogant story I got told.

00:21:08.321 --> 00:21:17.614
I was arrogant for not wanting, because I refused, to sell the meeting; they didn't go through a DSB and responsible AI, and this person is now on stage presenting about responsible AI.

00:21:17.614 --> 00:21:22.124
I'm like, oh no, but I just find it interesting.

00:21:22.124 --> 00:21:26.705
So respect to them because they haven't been lazy, although I don't think they know what it means or how to do it.

00:21:27.500 --> 00:21:34.446
What I do think is important, though, is that these people are acknowledging it, and even at the Power Platform Conference, dude, I did not hear a single person talking about RAI.

00:21:34.446 --> 00:21:36.372
No, nothing.

00:21:36.372 --> 00:21:47.423
I was the only one in that entire crowd that... actually, you and I, we were the only people that got up and said something about it, and it's that type of thing that is removing the ambiguity of what's happening around us.

00:21:47.423 --> 00:22:07.829
So again I'll go back to the fact that, sure, I believe in the digital equity piece, but actually I think there's such a thing as refusing to learn, and laziness, and then I cannot be responsible for those people. Yeah, and that's why, honestly, I see a very different future and it's an exciting future.

00:22:08.151 --> 00:22:25.714
But I tell you what, you need to apply yourself now because, as I say, you know, I feel it is like a wave, and you know you can catch a wave at the right time and you're going to get an epic ride, or you can be too far in and you're just going to get that wave dumping on you, right, and you're going to go,

00:22:25.714 --> 00:22:26.695
What the fuck happened?

00:22:26.695 --> 00:22:32.300
I'm being tossed around like crazy, and that's why I think a wave is a good analogy.

00:22:32.300 --> 00:22:44.941
I think there's a window of time now to really, if you're in our space, apply yourself, folks, and get skilled, because there's a massive opportunity.

00:22:44.961 --> 00:22:45.909
I'm going to read this to you.

00:22:45.909 --> 00:22:47.372
Okay, I know you've read this, but I want people to read this, yeah.

00:22:48.936 --> 00:22:49.115
Yeah, yeah.

00:22:51.773 --> 00:22:58.505
The EU AI Act applies to any organization that puts an AI system into service in the European Union, regardless of where it is based.

00:22:58.505 --> 00:23:01.753
Any organization that uses an AI system in the EU.

00:23:01.753 --> 00:23:10.165
Providers and users outside of the EU where the output of the AI is intended to be used in the European Union.

00:23:10.165 --> 00:23:14.640
Importers and distributors of AI systems, and product manufacturers.

00:23:14.640 --> 00:23:18.176
Timeline: this blew my brain up.

00:23:18.176 --> 00:23:22.580
Feb '25, the sentiment analysis prohibition.

00:23:22.580 --> 00:23:32.891
That's four months away.

00:23:32.891 --> 00:23:33.192
Four months.

00:23:33.192 --> 00:23:38.163
All these people doing CSAT, with their surveys out and doing sentiment analysis, well, good luck.

00:23:38.163 --> 00:23:38.971
How do I know this?

00:23:38.971 --> 00:23:40.193
I choose to read things.

00:23:40.193 --> 00:23:44.031
And then enforcement of general-purpose AI models in August '25.

00:23:44.031 --> 00:23:47.984
Okay, so here's the thing: we chose to read this.

00:23:47.984 --> 00:23:50.092
We are connected into the people that understand this.

00:23:50.092 --> 00:23:51.015
We have chosen to read this.

00:23:51.015 --> 00:23:52.038
We understand this right.

00:23:52.038 --> 00:23:58.391
So the companies we work with are therefore affected by the knowledge that we have, because we are curious people and read things.

00:23:58.833 --> 00:23:59.032
Yeah.

00:23:59.494 --> 00:24:04.931
Mark, what happens to the people that don't read this and refuse to understand what's happening?

00:24:06.173 --> 00:24:15.798
Yeah, yeah, this is what I feel that companies, particularly companies that are... sorry.

00:24:15.798 --> 00:24:39.425
I'll give you a use case. A company that I heard about this week said they don't want to do AI, and the reason they don't want to do AI is because their IT department said they know that their data is in such a state that their employees would find out stuff that they didn't want them to know.

00:24:39.425 --> 00:24:44.362
So I was like okay, so this sounds like classic security by obscurity.

00:24:44.362 --> 00:24:47.814
It's all exposed there, you know it is, so what are we going to do?

00:24:47.814 --> 00:24:53.740
They're like listen, if we took on AI, we'd have an 18-month project of having to clean up our systems.

00:24:53.740 --> 00:24:56.858
Oh my gosh, people, do you think that's going away?

00:24:56.858 --> 00:25:09.380
Do you think that your staff are not going to BYOD AI into your organization and do you think that you're protected because of this?

00:25:10.141 --> 00:25:26.876
I'm just like, oh, crazy level of ignorance, and like, oh, let's just bury our head in the sand, it'll go away. The thing that makes me laugh, dude, number one, okay, and like a laugh in a, uh, no type of sense, is that you're just spot on.

00:25:26.876 --> 00:25:29.303
But the thing is, this is only going to get worse.

00:25:29.303 --> 00:25:35.077
Right, like the AI right now is optional, yeah, like it's optional in the things you're using.

00:25:35.077 --> 00:25:39.517
Like we're lucky we have a prompt box, okay, because that will go away eventually as well.

00:25:39.517 --> 00:25:41.936
Like you will just have agents behind the scenes doing everything.

00:25:41.936 --> 00:25:51.459
Okay, so we're lucky when these organizations are smashed with differential AI that lives under the hood and just does everything for them, but they have terrible data.

00:25:51.459 --> 00:25:55.898
You thought Power BI was a loud hailer for shit data? Yeah, yeah, yeah.

00:25:56.039 --> 00:26:11.041
So I just think, if I were to put this in a summary right now, and I had to say to people, you know, learn and get educated, I would say, well, sort your data out first, now, like, don't wait. You know, this was said by many people.

00:26:11.041 --> 00:26:11.603
We're in year one.

00:26:11.603 --> 00:26:13.574
Okay, we've.

00:26:13.574 --> 00:26:14.455
We had this year.

00:26:14.455 --> 00:26:17.063
We were given this year as a grace period to figure things out.

00:26:17.063 --> 00:26:24.871
Right, if you haven't figured it out after this year, you can have some problems next year. Yeah, yeah.

00:26:24.891 --> 00:26:37.364
So I can't believe the timing Microsoft had with Fabric, like really thinking of it in that broad sense, and how rapidly the story around Fabric has grown.

00:26:37.364 --> 00:26:40.227
That's pretty mind blowing to me.

00:26:41.490 --> 00:26:42.334
It's data centricity.

00:26:42.334 --> 00:26:43.676
Man, that's all they've been trying to do.

00:26:43.676 --> 00:26:44.779
OneLake, right?

00:26:44.779 --> 00:26:46.855
Yeah, yeah, like it's genius.

00:26:47.369 --> 00:26:49.769
Very smart, very smart. It is.

00:26:50.431 --> 00:26:53.320
The whole idea of the show, of course, wasn't to be uh, morbid.

00:26:53.320 --> 00:27:00.278
I'm bloody excited about this future, and the implications of it are mind-blowing.

00:27:00.278 --> 00:27:17.400
I mean, Sam Altman said in February this year that we are probably going to see, in our lifetime, a billionaire created that didn't employ any staff, so a billionaire of one that has built a billion-dollar company without any staff.

00:27:17.400 --> 00:27:22.182
And I believe absolutely that's going to be possible.

00:27:22.182 --> 00:27:30.880
And so, therefore, I think in front of us lies a whole new opportunity about how we earn, create money, how we create wealth.

00:27:31.650 --> 00:27:41.378
And I honestly believe that we're going into an extreme abundance period in history, abundance like we've never known before.

00:27:41.378 --> 00:27:55.218
And I find that exciting because, if you let your mind riff off, right, most people limit themselves in life because, oh, I can't afford it, right, it's one of the most limiting mindsets and behaviors.

00:27:55.218 --> 00:27:56.099
I can't afford that.

00:27:56.099 --> 00:28:01.323
Imagine you go to a scenario where you don't even think like that anymore.

00:28:01.323 --> 00:28:05.119
There's no need to think like that because of the abundance that's there.

00:28:05.119 --> 00:28:11.993
But you know, yeah, we've got an amazing opportunity right in front of us.

00:28:12.675 --> 00:28:13.057
I agree.

00:28:13.057 --> 00:28:18.535
I couldn't agree with you more and I, honestly, man, I think so.

00:28:18.535 --> 00:28:32.115
I feel very passionate with you about this, right? Because I know we have to wrap up, but I think I've never felt so driven to do something different as I have in tech before, right, and that's why I'm enjoying working with the customers.

00:28:32.115 --> 00:28:40.640
We get to work with them I'm not going to mention that, but we do get to work with some pretty damn amazing orgs and I'm seeing the results of what they're doing in society around me.

00:28:40.640 --> 00:28:41.790
So it's wild, great timing.

00:28:41.810 --> 00:28:43.311
Cheers, man, we'll see you on the next one.

00:28:43.311 --> 00:28:44.153
Rock and roll. Great timing.

00:28:44.153 --> 00:28:44.373
Cheers man.

00:28:44.373 --> 00:28:45.413
Thank you, we'll see you on the next one.

00:28:45.874 --> 00:28:46.253
Rock and roll.

00:28:47.234 --> 00:28:49.096
Thanks for tuning into the Ecosystem Show.

00:28:49.096 --> 00:28:55.162
We hope you found today's discussion insightful and thought-provoking, and maybe you had a laugh or two.

00:28:55.162 --> 00:29:01.166
Remember your feedback and challenges help us all grow, so don't hesitate to share your perspective.

00:29:01.166 --> 00:29:09.480
Stay connected with us for more innovative ideas and strategies to enhance your software estate.

00:29:09.480 --> 00:29:11.289
Until next time, keep pushing the boundaries and creating value.

00:29:11.289 --> 00:29:12.875
See you on the next episode.