Dispelling AI Myths
Ana Welch
Andrew Welch
Chris Huntingford
William Dorrington
FULL SHOW NOTES
https://podcast.nz365guy.com/570
Can AI truly understand and manipulate our realities without us noticing? Join us on this exploration of artificial intelligence as we challenge prevalent fears and misconceptions, often stoked by media sensationalism. Chris emphasizes the importance of educating organizations to dispel these myths and highlights the productivity-boosting capabilities of tools like Microsoft's Copilot. We also delve into the implications of a significant lawsuit involving Elon Musk and OpenAI, which raises crucial questions about the evolving landscape of artificial general intelligence (AGI).
Imagine a world where AGI operates behind the scenes, subtly influencing our decisions and actions. We discuss the skepticism around AGI's true capabilities and contrast Western notions of privacy with the Chinese acceptance of constant surveillance. The conversation takes an intriguing turn as we bring in Daniel Kahneman's insights, discussing how AI can significantly reduce human biases and improve decision-making processes. This chapter not only questions the reality of AGI but also brings to light the potential benefits and ethical considerations of an AI-driven future.
Are we on the brink of a technological revolution that could rival the advent of electricity? We explore the socio-economic impacts of AI, comparing it to past technological advancements that initially displaced jobs but eventually created new opportunities. The discussion underscores the importance of proactive change and continuous learning in adapting to rapid technological advancements. We also touch on the human aspect, emphasizing empathy and support for those struggling with these transitions. Through historical analogies and personal anecdotes, we illustrate the critical need for societal motivation and the pursuit of new opportunities in an increasingly AI-driven world.
90 Day Mentoring Challenge 10% off code use MBAP at checkout https://ako.nz365guy.com
If you want to get in touch with me, you can message me here on LinkedIn.
Thanks for listening 🚀 - Mark Smith
00:00 - The Impact of AI Education
08:05 - The Future Impact of AI
18:59 - The Impact of Technological Change
31:53 - Navigating Society Through Technological Change
Mark Smith: Welcome to the Power Platform Show. Thanks for joining me today. I hope today's guest inspires and educates you on the possibilities of the Microsoft Power Platform. Now let's get on with the show. Welcome back to the Ecosystem Podcast. We're excited to be here again for another episode. This is brought to you by Cloud Lighthouse. We really want to unpack, or revisit, if you like, the whole area of what's going on with AI, particularly in our landscape, in what we're doing in our day jobs, and how that's impacting things. We're not talking about, you know, what new startup has started up in AI, or the advancement of language models, or anything like that. It's mainly more around what we're seeing in the conversations we're having with customers and stakeholders. So with that, Chris, why don't I throw it to you to kick us off today?
Chris Huntingford: Yeah, thanks, man.
Chris Huntingford: I think what's been pretty apparent to me is that, well, obviously, the whole freak-out about AI has changed the way in which companies and customers see technology. There's a lot of noise that we've had to cut through, which is starting to get extremely interesting, and a lot of fear-mongering we've had to remove. I'm starting to find that, when talking to organizations, half the job is that, rather than the technology, right?
Chris Huntingford: In my honest opinion, actually, I think that anything that's going to make people big, shiny bucketloads of money in some way, shape or form, and the media get hold of it, it's extremely hard to remove that prejudice against that technology or that thing, right? So, actually, the education around artificial intelligence is incredibly important, and what I'm seeing today is, in every organization I'm talking to, we have to take them through that education layer before we do anything, absolutely anything. So I can walk you guys through later on some of the things that I'm seeing, but that's the trend and the pattern that's going on in my world.
Mark Smith: Why don't you pull that apart for a moment, Chris? What are you referring to? What are you seeing?
Chris Huntingford: When you look at some of the posts, when you look at some of the articles shared, like even 2018, 2017, this is not new. I mean, we were using embedded AI in Power Apps in the form of Ideas, like, four or five years ago. PowerPoint had this years ago as well.
Chris Huntingford: But the problem is that now that the media has surfaced it and this OpenAI thing has come out, everyone's like, oh my gosh, AI is going to make my toaster come to life and murder my family. And in actual fact, that's just the media, it's not the technology speaking for itself, right? So I find that when talking to organizations, even about Copilot, the pushback from people is pretty damn significant, right? Like them saying, no, I'm terrified of my data being out there, I'm terrified of my data being shared on the internet, I'm terrified of not being protected. And actually, you know what, when you dig into what the tool actually does, it doesn't actually do anything special from a Copilot perspective. All it does is enhance productivity to an extent. But it's remediating that fear that I'm struggling with.
Andrew Welch: Did you guys see the, and maybe I want to take this briefly in a direction that is not what Mark intended, but did you see the lawsuit that Elon Musk filed recently? And when I say recently, this might have been six weeks ago. Where he alleged... so, to cut a long story short: artificial general intelligence?
Mark Smith: Right, that's sort of the... yeah, AGI.
Andrew Welch: AGI is the kind of nebulous yardstick that is used to basically say, or maybe I should say the goalpost, right, that once we as a species achieve AGI, that's AI being able to act totally autonomously across the full range of human tasks. Now, I think that, arguably, the goalposts keep changing on what AGI is, because if you were to go back 10 years and tell someone 10 years ago what we have now, I think they'd be pretty damn sure that generative AI that is able to pass the bar exam comes pretty close to artificial general intelligence. But in any case... exactly, exactly. So I think that we don't fully know yet what AGI looks like. But to cut a long story short, part of Microsoft's agreement with OpenAI basically says that once OpenAI achieves artificial general intelligence, then the licensing to Microsoft is off, right? Like, Microsoft can't then take that and bake it into its products.
Andrew Welch: And, in an unfortunate move, someone at Microsoft made a claim that, oh, we've reached AGI now. That was an incorrect statement. It's an absolutely incorrect statement, right. But this, of course, inspired Elon Musk, who was involved with the founding group of OpenAI and then stormed off in a huff, as Elon Musk is wont to do. This, of course, led him to sue OpenAI, right. So, anyway, the reason that I brought this up is that I think my concerns about AI, and the fears that I have for AI, have shifted really dramatically, from, you know, Chris, what you were talking about a moment ago, the toaster is going to come alive and murder my family, or the AI is going to achieve this general intelligence and do something nefarious, you know, Matrix-like, to the human species. My fear has really shifted more into the realm of what can nefarious humans do with AI?
Chris Huntingford: Yeah, yeah, but dude, we come from a fucking race of people who weaponized a walking stick. Yes, like, that's what we do. That's what humans do. We're super good at weaponizing or doing dumb shit with everything we can find. I mean, if there's a pine cone on the floor, somebody's going to throw it at somebody. If there's a stick, somebody's going to whack somebody with it. It's just another thing that people are going to find to do dumb stuff with.
Andrew Welch: So definitely not where Mark wanted this to go. I'm sorry.
Mark Smith: No, but what an awesome pivot, because here's the thing, right: I've spent some time reading the actual court document, because the US is so beautiful at publishing all that stuff.
Andrew Welch: Well, we take pride in our litigiousness, Mark.
Mark Smith: Yes, here's the thing, my observation: Elon did it as a meme at a large scale, because he can, right?
Mark Smith: Because here's the thing: power move. This week, Microsoft, what do they do? They announce a new startup company that they're going to put a hundred million, I think it's billion, dollars into. And what is it to do? It's to do quantum computing and AI. And who are the two partners in it? Microsoft and OpenAI. So, okay, yes, that was the relationship that exists, that Elon's suing over, which, as I say, I'm pretty sure is a meme case. And what's happening is that they just start a new company and people invest in that. It's like forking code, right? They've taken it on a different path, and I think that's a power move for Microsoft. Back on the topic of some knob saying that they reached AGI: let's be real, people. When AGI is reached, we will never, never know about it until about five to ten years later.
Ana Welch: Weren't we talking about the Panopticon in a different episode? Right, it's the...
Andrew Welch: The Panopticon, yes, thank you, Ana.
Mark Smith: Anybody that says they have AGI is lying, it's marketing, it's bluster. The thing is, the people that reach AGI first... we will find out that we are in a simulation by the time we realize that AGI was reached, right, because they will have basically taken over and manipulated everything. And hopefully in a utopian way, and I definitely believe that our future is going to be brighter, not, you know, negative. As for privacy, back to your point, Chris: I think we delude ourselves in thinking that we have privacy, and this is one thing. When I was in China, I was in Beijing walking the Great Wall of China. You know, we had a guide that took us 100k outside Beijing, and Meg and I went on this private tour of the wall that's not commercialized. And, of course, as I do, right, I ask questions about what I've heard. Tiananmen Square, you know, everybody's spying. Oh, Mark.
Mark Smith: Because I'm in this great wilderness location, right, my drone wouldn't even fly there, right, everything's locked down. And anyhow, he goes, you've got to understand, Chinese people have no expectation of privacy. We don't delude ourselves with thinking that we have privacy. So it changes the way you think. The problem is, and I'll point the finger at the US here, folks in the US think they have privacy. They're in this false sense of, we have privacy. It's the Panopticon, and what you have is a device that is leaking stuff constantly.
Mark Smith: I don't know much about Tucker Carlson, right, I don't know much about that dude, but I listened to the podcast episode with Lex Fridman after he had interviewed Putin and stuff, and he was talking about how often he, as a US citizen, knew that he had been bugged, like, just through the software on his phone, and he's really got to that opinion that the US government are always monitoring everything he does. And this is the thing: I think we have this false sense of data privacy, and I just don't believe it exists anymore.
Andrew Welch: Mark is peddling every conspiracy theory.
Chris Huntingford: I'm so happy right now. Best thing I've ever heard.
Ana Welch: But all of these conspiracy theories, honestly, they don't really relate to someone's, I don't know, beer-tracking app from, like, I don't know what company, you know. I've had many chats with customers or prospective customers about AI within Microsoft, and they're all like, oh my God, our data, it's going to leak everywhere. I'm like, you've got a bunch of data in an Excel spreadsheet with a lot of macros.
Chris Huntingford: It's your own fault.
Ana Welch: And in order for you to be able to actually work with it, you need to do, like, a Save As on your own machine, because otherwise it won't open. Like, what are you even talking about?
Chris Huntingford: So this, this was exactly...
Ana Welch: Oh no exactly like you're saying like it's a walking stick. I love the analogy. That's exactly exactly what it is right like it's. It accelerates our pace a little bit. I've also been, coincidentally, reading about AI this week from a different perspective. I really used to like a lot Daniel Kahneman. If you know him, he wrote Thinking Fast and Slow.
Mark Smith: Very, very famous book, brilliant book.
Ana Welch: Yeah, right. So, I think he was Israeli, I want to say, I'm not super sure, but he was, like, super into human psychology, and he was talking about biases and how people, even though they know an outcome isn't going to be achieved, like, for example, we've got an 18-month project to completely redesign our enterprise architecture, right, and they know it normally takes like three to five years. You know, they know from experience, they know from other examples, they know from the size of their organization or the problems their data have. However, they cannot start the process if they feel like this is going to happen, like, five years from now. Therefore, they lie to themselves. And, you know, the professor comes and says, look, this is where AI is actually going to help us a lot. We're going to be able to make so much better decisions.
Ana Welch: Our answers to our customers are going to be so much more clear, concise and precise. Even though AI does have a limitation on, maybe, human psychology, and it doesn't have emotions, et cetera, et cetera, and we do need, you know, a level of oversight, and that's why we have all of these guardrails and, you know, security methodology and governance, et cetera, and we do need to look at, like, ethical and social implications, responsible AI. The reality is, it's still going to accelerate our work, our thinking, our day-to-day lives, by eliminating some of the bias, or some of the information that we forget in our human brains. I just loved the way he saw things. This was an article in the Economist.
Mark Smith: So it's not a new book he's read or anything like that. It was in the Economist, was it?
Ana Welch: Yes, it was in the Economist. He passed away last month, I think, so I think that's why they did a little review of his work. But I thought it was just incredible. Like, this person was sitting there, and he was quite elderly, I think he was like 96 or something like that, and saying, AI is going to be fabulous, and I wish I could be here to see it.
Mark Smith: I totally agree that that, I think, is going to be our future world.
Chris Huntingford: Yes, exactly. They say that ASI, so artificial superintelligence, which is stage three, which is the, like, smarter-than-humans one, I think the date that I heard was 2060. So hopefully, if I do my push-ups and eat my beans, I'll be around to see that.
Andrew Welch: Well, and we do now know that you are at the gym at 5 o'clock in the morning.
Mark Smith: Oh yeah, man.
Chris Huntingford: Yeah, yeah, somebody's got to keep this world alive, bro, you know.
Andrew Welch: Sure, Chris Huntingford is the one keeping the world alive, for Christ's sake.

Chris Huntingford: Yes, AI's coming, Andrew. Somebody's got to defend you lot against the machines. That's why I'm gymming.

Mark Smith: I reckon it'll be in the next 10 years.

Chris Huntingford: I think so too, but you know, it's the comfort blanket, right?
Chris Huntingford: Like, yeah, there was this ridiculous theory back in the day. It was a conspiracy site called BOFM, I don't think it's still alive. It was Black Ops and Fucking Magic. They were talking about what they were doing with all the tech in World War I and World War II, and what NASA were doing, and things.
Chris Huntingford: I'm not a conspiracy theorist but I like the idea that there is a 10-year progression on tech before we have it and touch it.
Chris Huntingford: It's interesting that you say that, because I'll give you the example, right: there's a movie called The Men Who Stare at Goats. That's a really cool movie with George Clooney, but they were talking about using, like, all these psychic powers to solve problems. And what's interesting is the correlation: years and years ago, when you watch movies like Demolition Man and all that, AI is there, and we're like, oh my gosh, it's science fiction that will turn into science fact. Yeah, and even at school we talk about that. Like at the University of Birkbeck, when we talk about science fiction, we're like, the propensity for this to happen is actually quite high. Now, being in tech and looking at what actually is happening in the sciences, and looking at the movements of narrow artificial intelligence through to general artificial intelligence, through to super artificial intelligence, I don't think they're far off, man. I don't think that a lot of this stuff is far off at all.

Andrew Welch: And, you know, Chris, speaking of conspiracy theorists and animals, I heard one recently. Ready for this? Have you heard this? I heard that Microsoft MVPs everywhere were turning into T-Rexes.
Chris Huntingford: I heard this as well. It's an epidemic, dude.
Andrew Welch: It's an epidemic, wow.
Mark Smith: Shots fired.
Andrew Welch: Shots fired. Oh my gosh.
Ana Welch: I have a question: what possessed you to see a movie called The Men Who Stare at Goats? How? Yeah.
Chris Huntingford: Because of what they talk about. It's really funny. They were trying to mix psychology, psychic powers and all sorts of things, trying to weaponize it and trying to find, like, ways to fuck with people. It's sort of what Hitler did with the occult, I guess. But it's interesting because, like, I wanted to see it just because I like psychology and I like the way that people think. Why would you want to do that with something?
Chris Huntingford: Like I said, we'll turn a walking stick into a weapon any day of the week. Yeah. So I don't like the concept of weaponizing things, but I like learning about why people want to do it.

Andrew Welch: So I think that this is a very... I mean, I think people really know where we are coming from, but I'd be curious: this thing, artificial intelligence, that we're all, for the most part, very optimistic about. What are each of your... what are our fears? What do we think reasonable, measured fears are? Where is this going in a less positive way?
Mark Smith: My fear is that religious nuts will want to take us back to the Dark Ages. You know, the people that are right out on... I don't know, I don't understand left and right wing, but right out on the edge of that.
Andrew Welch: You got the direction right on that yeah, yeah.
Mark Smith: So, you know, that kind will want to go, listen, let's take us back ten thousand or a thousand years or whatever, into the good old days type thing, because they fear it so much. So I don't fear the tech at all. I fear how out of control people's fear can get, and they'll get a messiah that will start taking them and poking them on that journey, and then there'll be a following. That would be the only thing. I don't fear anything in the technology whatsoever.

Ana Welch: Fear is probably the most powerful feeling that someone can get. It goes against our very first instinct, right?
Mark Smith: I don't know if it's the most powerful feeling, I've had other, more powerful ones.
Chris Huntingford: Apparently it is. I call it a telescope.
Mark Smith: A different derivative.
Chris Huntingford: I'm extremely excited about digging into this one. Go for it, Ana.
Mark Smith: Go for it.
Ana Welch: Stop mocking me. There's actually a...

Chris Huntingford: I'm not mocking you. We're just talking, more than anything else.
Ana Welch: There's actually a good, maybe you've seen it, there's a film on Netflix on the Brexit campaign and how they manipulated people through fear, and how, like, that was a strategy, and how they were able to, you know, target that fear, and, you know, how it all worked. So maybe it wasn't, like, the best choice of words here from my side, but definitely, can you imagine? Like, we can see right now, people are just afraid that AI is going to take their jobs, and AI is going to disseminate their data, and AI is going to, I don't know, control us.
Ana Welch: But the reality is that right now, if we go back to Microsoft technology or the Copilots, they're really cool, but they create a Power App or they write an email.
Chris Huntingford: It's the spellcheck of the AI world right now. It really is. I mean, I'm not trying to be rude, but look, the tech is fine. But, you know, I don't know if you'll remember, but the first example I'll use is being on your shitty Nokia 3310 at 2am, trying to message your partner, you know, hey, I love you so much, after like 25 beers, and typing the word, you know, duck, instead of another word.
Andrew Welch: And it's interesting, because did you just self-edit for the first time?

Chris Huntingford: I did. I did, yes. But yeah, it's weird because, like, when you're going through that process, like that clickety-click thing, that autocorrect just automatically happened. Like, we all just figured it out, right? And then spellcheck: you didn't click the button in Word anymore, like, it just naturally happened. In a year's time we're going to be like, oh well, you know, I don't really remember writing Word documents from scratch. And I don't think that that's AI. I think that's just... well, it's AI, but it's more...
Mark Smith: Yeah, a lot of it's automation as well, right? It's automating that stuff that we would do manual clicks for in the past. Tell me, what are the thoughts on this? Meg and I have been discussing this a bit lately: how do you have discussions with C-levels around AI that are not so technical that they glaze over, but are really tangible?
Mark Smith: One of the things I came up with is this idea, because it deals with fears, right, and one of the big ones is, it's going to take my job. I think it's like electricity. In the past I've said it's like, you know, when the car replaced the horse and dray. I don't think it is that. I think it's electricity. Electricity can be extremely dangerous if you, you know, grab the wire wrong. It's a powerful energy that has been tapped and made usable, and it gives us ice in our refrigerator, it gives us hot water in our shower. We can't imagine life without it, and all these uses for electricity are being created, and I feel that that's kind of where we are with AI. It's a new thing, but, like, how many jobs has electricity created around the world? An unbelievable amount of jobs, right? Did the wax makers or the whale blubber processing plants, did they lose their jobs because nobody needed whale oil to light a lamp anymore? Yeah, sure, a few jobs got lost, but how many more jobs were created because of it?
Mark Smith: And I think that deals with that kind of worry about, it's going to take my job. Yeah, the job that you have today. But heck, you know, the job I started my career with was mowing lawns, and I'm quite happy that I'm not in that career anymore, that I updated my skills to move into a different area, and I'm quite happy that a robot now mows my lawns. More power to it. Did I enjoy mowing lawns? Absolutely. Do I want to do it as a career? No, thank you. I think that everyone has to learn new skills and keep adapting and evolving, but I think there's going to be more jobs created from AI than necessarily taken away.
Andrew Welch: I think that history, and particularly the history of technological innovation, definitely bears that out, and I think that that is almost certainly likely to be true. But I think that the challenge is that what is true at a macro level can be absolutely life-destroying at an individual level, or at a community level. And let's take this away from AI. A while back, when, well, not when we all thought, but when some people wanted us to think that self-driving cars were closer on the horizon than they have proven to be.
Andrew Welch: One of the thoughts that I had at that time was that this is going to be an absolute catastrophe, at least in the United States, for the long-haul trucking industry. And the reason that that would be such an immense social problem is that being a long-haul trucker is quite lucrative in a lot of cases. It does not require an advanced technical degree, and many, many really hardworking folks with families who rely on them rely on the income from that particular profession, which, like I said, can be quite lucrative. So I've intentionally tried to separate that analogy from AI, because it's this general pace of technological change. As the pace of technological change increases, right, and this is not just happening in AI, I think that the displacement effect is going to be much stronger than in any previous generation of massive and rapid technological change, and that's going to be a big problem for society in the short run.

Mark Smith: How is it addressable? You know, one thing, why I've always been a big proponent of business applications, is because you don't need a degree to get into the space and become very good in this space. You don't need to be really well educated, right? You know, we've seen use cases of receptionists, we've seen use cases of hairdressers, we've seen use cases of people that, you know, mow lawns ending up doing really well in this industry. It allows that transfer.
Mark Smith: Do you think there's going to be, like, a transition path for people that, you know, might have very, let's say, labor-intensive or manual skills, that are going to be left out in the dark? Or do you see it's like the, you know, first, second tier of the legal industry, legal secretaries, things like that, you know, they're going to get wiped out? And so I would have thought, if you're in that skill frame, surely you could upskill. But then, if you were in a role where you're just purely a laborer, you know, like, you know, roadwork signs, that type of thing, which could easily be replaced by an Android bot in the future, do they have the wherewithal to upskill into a technical role in the future? What's your thoughts, Andrew?
Ana Welch: I mean, I don't know if in a technical role, but we do have machinery that makes cars and things, and big pieces of, I don't know, metal, engines. Everything is done automatically, right, that used to be done by hand as well, and somehow people have still... so industrial revolutions have been there before. My only question here would be, and I know that Andrew's wanting to share his opinions, how quickly will this happen? Because the difference between this being a truly 96% beneficial thing and it creating a big recession lies in how long it will take for those folks to get upskilled, maybe not in a highly technical job, maybe in something else. You just used the electricity example. Not everyone's an electrician, but some people are electricians, others measure how strong the electricity is, others simply make the wire. There are many possibilities here as well.
Andrew Welch: Yeah, I think a lot about my great-grandfather in relation to this topic. My great-grandfather left our family's village of Grunson, Sweden, and he went to America. He went to Boston, and he left with a couple of his brothers. Some of the brothers stayed behind. And he left because there were no more fish. They were fishermen, and the seas in the Skagerrak were out of fish, and he left, and he never had any hope of going back, and he never did. He died far too young in the Boston area, with very strong roots in the country where my people came from, Sweden.
Andrew Welch: I think about this a lot, because the country where I grew up, the United States, is filled with many, many millions of stories like my grandfather's and like mine, and one of the things that troubles me, that it feels to me we have lost, at least in Western society, is a sense of being willing to say, there are no fish here, so I'm going to go find something new. And I compare this to my experience. For about 10 years, my parents moved us to West Virginia, and West Virginia, for people who don't know, is, you know, a beautiful place filled with wonderful people, but it scores very low on almost every measure of prosperity that you can imagine. It is one of the least educated, one of the poorest, one of the lowest GDP per capita, one of the most obese, one of the least healthy. So, in almost every way...
Mark Smith: But what a great song. Almost heaven, West Virginia, right?
Andrew Welch: What a great song that we love to sing at Microsoft karaoke nights. But in any case, I look at that state and the hard times that that state has had, and I see people who are very, very unwilling to leave, and they're often very unwilling to explore new professional opportunities, right, because what they want is the prosperity of the mid-century to return, and it won't. I worry that a lack of desire to get on the boat and go find a place where there are some fish, combined with the speed at which this technology is evolving and, I think, the speed at which displacement may occur in the next 10 years or so, is going to be a very difficult combination for society, and for the economics of society and the social fabric of society, to overcome. That is by far my biggest fear for AI. We'll get through it in the end, but I worry about the pain that society will have to undergo, and individuals will have to take on, as we navigate this.
Chris Huntingford: We've done this, though. We've done this before. Like, this is not a new thing, it's just faster. That's the part that terrifies the crap out of me. And, Mark, you asked the question earlier, like, what's the fear?
Chris Huntingford: The fear for me is the speed. And what I've started to realize is, you know, I know we've all had an analogy, right, I'm going to give you an analogy, and it's around farming. Years ago, we used to, you know, water our crops by going to the water, picking up a bucket or something, walking to the crops and chucking water on them. Now we have, like, canals and pipes and all sorts of really cool things, and that took a long time to get right. And even the Industrial Revolution, and the way that digital transformation started, where, I think it was Capgemini that coined the term in around 2014, like, we're going to digitally transform, we still had around six or seven years until the era of enablement. Now, Andrew, I remember you showed me a graph: we're talking like months in between hype cycles.
Chris Huntingford: Now, the speed is terrifying, and I think sitting on your hands is the problem that we're going to have, because it's become so easy for people to not learn, if that makes sense. I mean, the example I'll give you is from a socio-economic perspective. Look at the UK. We actually are okay with people not doing anything. We're super fine with it. We're like, you know what, you don't have to do anything, we'll just give you a free house, and have a lot of babies, right? Now, that's not everyone. There are some people that actually need more help, but, like, society has become okay with being lazy. That's what I think.
Mark Smith: Yeah, it's a hard one. You know, I do look at the big picture of it, and, Andrew, I think you're highlighting there the individuals that are going to get impacted, and how much larger is that individual group? I remember a movie years ago, and I've never been able to find, I've searched for this quote everywhere. Who is the US general that, I think, became the US president after the Second World War? Eisenhower, yeah. So I think Eisenhower said this, but I've not been able to find the quote. They had the storming of the beaches, maybe Normandy, whatever it was, and, you know, the way it was being acted there, they said, like, man, our loss of life was so low, we only lost like 5,000 people. And he turned and said to the person telling him, you've got to realize, for their family it's 100% casualty.
Chris Huntingford: Yes, exactly.
Mark Smith: And I'm like, wow, that was so impactful. Yes, when we look at the big broad picture, it wasn't many, but he said, for their families, that's a hundred percent the actual impact. And I think, Andrew, it summarizes what you're talking about there, which is, for the individuals that can't move, or don't move, or whatever. And it reminded me of Hem and Haw from Who Moved My Cheese?, when you purposely don't see things are changing and you don't proactively move. Someone messaged me yesterday and said, I see change afoot, after I said that I'd resigned, and I said, don't wait until the change happens to you. Get moving, get moving. Find your next lot of cheese. Andrew, how do you deal with people that don't know how to go find their cheese, or how do we as a society protect those people?
Andrew Welch: So I think, first, I want us all to keep in mind, right, we, the four of us here, and probably 98% of the people listening to the show, right, we all work in technology, and, for the most part, we love the work that we do, and we feel that it's important work to be done, and I think we love that we're on the forefront of whatever it is that is coming next. I think we need to not underestimate what pricks we can sound like to people who are not in tech, you know, people who are in a perfectly honorable profession, a perfectly necessary and important one, and arguably many professions that are more important than ours to the actual functioning of society. And I think that that story that you told about General Eisenhower is important. We can prognosticate all we want about the brighter future ahead, but to the person or the family who lost a job or who lost a home, that is a 100% loss.
Andrew Welch: So I think that, first and foremost, folks like us who work in this space, we need to really kind of fundamentally reorient our thinking, right, and remember that we might get to that bright future, but there's going to be loss along the way. So then I think we start to look at how to help those who are in some way put out, displaced by this technological change, to help them weather it. And in my experience, and in my observation, here too we have some significant challenges, because many people are not looking for a handout. I think that there are plenty of people in the world who are looking for a handout and who are looking to do the easiest thing possible, but I also just believe that the vast majority of people want to contribute, and they want to support their family, and they want to support themselves.
Andrew Welch: So whatever is provided to help those folks who find themselves in an unfortunate situation, due to whatever it is, whether it's AI, or whether it's autonomous vehicles, or whatever it might be, it needs to come, to the greatest extent it can, in the form of tools to help yourself. Not, I, society, or I, the government, am here to take care of you.
Andrew Welch: And that's not because we can't, right? The profit and the prosperity and the productivity that I believe will be generated by this era of technological change could very, very well fund the caretaking of large swaths of the population that are displaced by this. But I think we have to acknowledge that large swaths of the population don't want to be cared for. They want to be empowered to care for themselves and for their families.
Chris Huntingford: Yeah, I agree and I don't, and the reason I tell you that is because I categorically know that there are some people who do not want to get off their hands.
Andrew Welch: I acknowledge that fully. Yeah.
Mark Smith: There will always be both, but I think that the ability to earn money has in it more than just the money: your identity, who you are, independence, empowerment. It's something that defines who you are, really, for many people. And so I think that, you know, having a cost of living, et cetera, covered or handed out by the government, where, you know, the AI becomes taxed and therefore creates this living wage scenario, has other implications with it, right? If, all of a sudden, there's no requirement to work 40 hours a week, because your money...
Chris Huntingford: But then, if there's no requirement to work and you get given free money, like, why would you work 40 hours a week?
Mark Smith: Exactly, but then where do you get fulfillment?

Andrew Welch: I would love to just go write my book, honestly.

Mark Smith: But the thing is, that's where we're different, right, is that I would never be in a position... I couldn't imagine, just because I have so many projects and hobbies and things that I don't get paid for, that I'd ever be in a position of not having stuff to do that I enjoy doing.
Ana Welch: But you'd be paid to survive, not to buy that fancy microphone, not to have, like, pretty lights behind you. I'm with Andrew here, actually. I think the majority of individuals do not want to be in that situation, and it's not a pleasure to, I don't know, eat just pre-made supermarket food and drink cheap beer and not be able to do what you want to do. So I think, for the most part, people do want to thrive through this, but it's still scary, because it's very hard.
Chris Huntingford: It's terrifying. Can I tell you, I think, without bringing this back to Power Platform... I think that they were...
Andrew Welch: Chris will always bring it back to Power Platform.
Chris Huntingford: I will, actually. I do love Power Platform. But I don't think that they were far wrong with what they were doing. I actually really liked the message that they put out. You know, like, I don't know if every street sweeper or person X on the street got a Power Platform job, but it certainly did transform a number of our friends' lives.
Ana Welch: Yeah, 100%, definitely yeah.
Chris Huntingford: Hopefully. And when I look at some of our pals, I just think, holy shit, these people are way smarter than me, way smarter, and it's just down to, like, timing and opportunity. I was just lucky because I was in the right place at the right time, okay, but sometimes people aren't. So I get that, I really do. And I think, with this movement, somehow there has to be some mechanism to drive that same ethos at a wider scale.
Mark Smith: I love it, folks. We're well over time, so let's wrap this up. It definitely went a lot different than the path I thought we were going down, but it's good. It's good to riff, and it's good to get perspective. Once again, if you're listening or watching this, we want to hear from you. We'd love your opinion. We don't want everyone to agree with us. We want people to call bullshit on what we say and go, you know, you're in an ivory tower, or whatever. Challenge us, call us out if we've got shit wrong, right? Because collectively we all learn and grow from each other. So, yeah, any which way, any channel, you should be able to reach us somehow, some way. We'll be there, and we value your opinion, your insight. And if there's a guest that you think we should have on because you think this person has something insightful to say, let us know also. And with that, Ana, why don't you wrap things up for us?
Ana Welch: You caught me by surprise. I thought you were wrapping us up right now.
Andrew Welch: That was the wrap.
Ana Welch: That was the wrap. Hey, look, just let us know: what do you think about AI? I actually loved this episode, because we really went into the real world and our day-to-day lives. Thanks for watching us and thanks for being with us. Yeah, absolutely. Thanks, everyone.
Mark Smith: Hey, thanks for listening. I'm your host, Business Applications MVP Mark Smith, otherwise known as the NZ365 Guy. If there's a guest you'd like to see on the show, please message me on LinkedIn. If you want to be a supporter of the show, please check out buymeacoffee.com/nz365guy. Stay safe out there and shoot for the stars.
Andrew Welch is a Microsoft MVP for Business Applications serving as Vice President and Director, Cloud Application Platform practice at HSO. His technical focus is on cloud technology in large global organizations and on adoption, management, governance, and scaled development with Power Platform. He’s the published author of the novel “Field Blends” and the forthcoming novel “Flickan”, co-author of the “Power Platform Adoption Framework”, and writer on topics such as “Power Platform in a Modern Data Platform Architecture”.
Chris Huntingford is a geek and is proud to admit it! He is also a rather large, talkative South African who plays the drums, wears horrendous Hawaiian shirts, and has an affinity for engaging in as many social gatherings as humanly possible because, well… Chris wants to experience as much as possible and connect with as many different people as he can! He is, unapologetically, himself! His zest for interaction and collaboration has led to a fixation on community and an understanding that ANYTHING can be achieved by bringing people together in the right environment.
William Dorrington is the Chief Technology Officer at Kerv Digital. He has been part of the Power Platform community since the platform's release and has evangelized it ever since – through doing this he has also earned the title of Microsoft MVP.
Partner CTO and Senior Cloud Architect with Microsoft, Ana Demeny guides partners in creating their digital and app innovation, data, AI, and automation practices. In this role, she has built technical capabilities around Azure, Power Platform, Dynamics 365, and, most recently, Fabric, which have resulted in multi-million wins for partners in new practice areas. She applies this experience as a frequent speaker at technical conferences across Europe and the United States and as a collaborator with other cloud technology leaders on market-making topics such as enterprise architecture for cloud ecosystems, strategies to integrate business applications and the Azure data platform, and future-ready AI strategies. Most recently, she launched the "Ecosystems" podcast alongside Will Dorrington (CTO @ Kerv Digital), Andrew Welch (CTO @ HSO), Chris Huntingford (Low Code Lead @ ANS), and Mark Smith (Cloud Strategist @ IBM). Before joining Microsoft, she served as the Engineering Lead for strategic programs at Vanquis Bank in London, where she led teams driving technical transformation and navigating regulatory challenges across affordability, loans, and open banking domains. Her prior experience includes service as a senior technical consultant and engineer at Hitachi, FelineSoft, and Ipsos, among others.