Crafting a Digital Future with Personal Stories and Advanced AI
Jon Russell
Mike Gowland
FULL SHOW NOTES
https://podcast.nz365guy.com/527
Imagine intertwining the rhythms of personal passion with the beats of cutting-edge technology. That's the melody we play in our latest discussion with Jon Russell and Mike Gowland from A&S Group, where we harmonize the tunes of AI, the Power Platform, and life's candid moments. Our guests don't just bring their tech expertise to the table; they also share slices of their personal worlds, from Jon's artistic forays into photography and music to Mike's heartfelt family tales, setting the stage for a symphony of stories where technology meets the heart.
'Just Ask It' isn't just a tool; it's a testament to the spirit of innovation that thrives at the intersection of AI and human ingenuity. We wade into the narrative of this game-changing asset for Dataverse schema generation, born from a high-stakes challenge to assist a water utility company. Witness how a blend of technical prowess and AI assistance gave rise to a digital copilot that now empowers consultants to demonstrate Dataverse's versatility across diverse industries, all while fostering a unique dialogue between technology and user needs.
As we peer into the horizon, we ponder the future of personal language models and their transformative potential for businesses and individuals alike. The conversation takes flight as we discuss the impact of shrinking chip sizes on AI development, the allure of personal digital avatars, and the ethical framework that must scaffold the AI landscape. Join us as we explore these trails, guided by Jon and Mike's insights, and consider how the burgeoning growth of AI technology could shape a future where your tech understands not just your commands, but your story.
RESOURCES MENTIONED:
Just Ask It - https://github.com/sgtsnacks-64/JustAskIt/tree/main
AgileXRM
AgileXRm - The integrated BPM for Microsoft Power Platform
If you want to get in touch with me, you can message me here on LinkedIn.
Thanks for listening 🚀 - Mark Smith
00:31 - Power Platform and Personal Hobbies
11:54 - Exploring Practical AI for Data Modeling
20:55 - Dynamic Interfaces and Data Processing Potential
37:00 - The Future of Personal Language Models
Mark Smith: Welcome to the Power Platform Show. Thanks for joining me today. I hope today's guest inspires and educates you on the possibilities of the Microsoft Power Platform. Now let's get on with the show. Today's guests are from the United Kingdom. We're going to be discussing AI, Copilot and AI orchestration, as well as some amazing stuff that they are doing in AI. They both work at A&S Group in Power Platform consulting. Jon is a regular blogger; he talks about mental health, community and his passion for the Power Platform. Mike has experience in a variety of programming languages and technologies across low-code environments and has good working experience with developing automation workflow solutions on the Power Platform. You can find links in the show notes for this episode to their bios, social media, etc., as well as any resources (I know they've got video content and the like) that you can go take a look at. We'll put those in the show notes so you can get a good understanding of the amazing stuff they have done this year in the space. With that, Jon, Mike, welcome to the show.
Mike Gowland : Hi Mark.
Mark Smith: Hello, good to have you gentlemen on. Before we start, like always, I like our audience to get to know you. So, Jon, let's start with you. Tell us a bit about food, family and fun. What do they mean to you? What do you do when you're not working?
Jon Russell : What do I do when I'm not working? That's a great question. So, family: I've got my lovely wife, Katie, who pretty much keeps me on the straight and narrow and, yeah, looks after me in many, many different ways really. And then I've got two daughters, Esme, who is seven, and Tilly, who is five, both quite different characters. Esme is my shadow pretty much, and probably vice versa, although I don't like to admit it, and Tilly is the second child and she's completely crackers.
Mark Smith: Awesome, awesome. And Jon, what are you doing for fun?
Jon Russell : Fun? Weirdly, fun actually does probably involve a bit of the Power Platform. We'll get into it in a bit, but it's kind of how me and Mike got stuck into the AI side of things. But I also love photography. Before COVID I was doing that as a bit of a side hustle and it was going quite well, and then COVID hit and it kind of all died. But yeah, that's my main thing. Then there's the music as well. I used to DJ a lot a long time ago, played loads of different clubs in London, played in Ibiza. So music is a big thing for me as well.
Mark Smith: I love it. We're gonna just dwell on this for a second. First of all, what's your hardware when it comes to photography?
Jon Russell : So I have a Canon DSLR 7D Mark II with various lenses, but I've also you know, I hate to admit it, I've also got an iPhone 14, and the camera's probably better now on the phone than it is on the DSLR.
Mark Smith: Isn't it crazy just how far camera technology has come? I was up at 11 o'clock the other night because I have a robotic mower and it had gone over a little embankment, no damage, but anyhow, I got notified on my app, so I go out in my undies to go sort it out, and the moon is out, it's just beautiful. I've got the same phone, and I take a photo and it's like a daylight photo, and it's 11 o'clock at night, the moon's out, the landscape just looked magic. The other thing then: DJing. Do you have your own turntables, and what's your setup there?
Jon Russell : I have. Well, I used to. I've now got a Numark mixing deck that I plug into my Mac, and then I can actually stream through Tidal, so I can play pretty much any music I want, which is pretty wicked. So during lockdown I used to do kind of garden sessions, where I'd have a barbecue in the back garden and also play some music at the same time. But yeah, I was just talking about this with my wife, I think my DJing days might be over. There's possibly a New Year's Eve thing, which is local, so we'll see what happens, but yeah.
Mark Smith: And are you more the you know the 60s to today type music, or are you more EDM?
Jon Russell : So yeah, back between about 2005 and 2010, 2011, it was very much predominantly house music; we kind of ran our own parties as well.
Mark Smith: Nice.
Jon Russell : But now I would say my, my taste is much more eclectic and I quite like to dabble with different genres and see which songs kind of mix together quite nicely.
Mark Smith: I love it.
Jon Russell : And I'm always looking out for that.
Mark Smith: Yeah, very cool, very cool, Mike food family fun.
Mike Gowland : Jon's really shown me up because he's so interesting and I'm not. Come on. I've got my wife Jess; we've been married for about a year and a half now. We were originally going to get married in 2020, but that didn't happen, thanks to COVID, then it was going to be 2021, and that didn't happen, so we got to 2022 and finally it did happen. It was quite stressful, but in the grand scheme of things it wasn't actually too bad. In that time before we got married, we ended up having our first child, so I've got Jack, he's two and a half. I think he's determined to injure himself in any way he can at the moment. He's discovered climbing, he's discovered jumping. He's a bit of a handful at the moment, but he's great. What do I do for fun? I mean, I'm just really into tech, so my idea of fun is literally sitting down in my own thoughts, playing about with coding or a new bit of tech. I've really got into AI, like Jon, just went down that rabbit hole very recently, and that's brought me and Jon closer together, because I think we are basically brothers at this point. Fun fact about the wedding venue: we don't live near each other at all, me and Jon. Jon probably lives about two hours away from me, but we both got married at the same wedding venue.
Jon Russell : Wow.
Mike Gowland : Obviously a few years apart, before we knew each other, so we had no idea. And that wedding venue is sort of, I guess, in between where we both live as well.
Mark Smith: That's incredible.
Mike Gowland : We also have very similar-looking cats, don't we? And we're convinced they're brothers. Crazy. Mine just eats everything, and it's not fun having an obese cat.
Mark Smith: It's not fun.
Mike Gowland : It seems like it's fun on the internet when you see them on the phone.
Mark Smith: Yeah, yeah, yeah.
Mike Gowland : They're not fun at all, because nothing is safe from that cat eating it, you know.
Mark Smith: That's crazy. I've got two cats, but they're very skinny, because we live in a rural area and I don't like to feed them too much because they've got to do their job and keep the mice away, so they've got to earn a bit of their food as well. Interesting you say that about wedding venues. What surprised me in the UK is how much rigour there is around the legal part of getting married, from the venue having a sign on it that says this venue is allowed to be used for a marriage ceremony, and then before the wedding, like, I was there for my brother-in-law's wedding and, you know, getting on the hip flask. But we couldn't get too fucked up, because the person that came along has to make sure that everybody's sober enough to make their vows, that they're not doing it in an intoxicated state, because that's a breach. Yeah, I was quite surprised.
Mike Gowland : Yeah, I hadn't come across that. Your wedding venue, where you actually get married, also has to be under a roof. So if you have an outdoor wedding, it has to be under some sort of pagoda or even a little gazebo; it has to be underneath something. I think it's different for something like a vow renewal, where it's not a legal registry thing.
Mark Smith: Yeah, I got married at night, so we're talking about 10, 11 o'clock at night, and my mother was like, is this even legal? Are you allowed to do it? Because I'd never, ever been to a night wedding before, and it's something that my wife and I, when we chose a venue, it was on a cliff top looking out over the ocean and we knew that it was going to be a full moon. So as long as no clouds were there, and it was just like that: no clouds, full moon, reflection right across the water behind us. It was absolutely amazing. But I was surprised, you know, that the older generation once again didn't think it was legal because it wasn't done in daylight, where that's every wedding they'd ever been to their entire life. Okay, let's talk about what you guys have been up to, and I find the subject so interesting, the area of AI, because we've just had the birthday of ChatGPT. It's only one year old as of, I think, a week and a half, maybe two weeks ago, and so much has changed. And of course, added to this change, where we all work with Microsoft technology, we've heard so much about Copilots, and of course, if you've been in the Microsoft ecosystem more than five minutes, you know that there's marketing, and then there's reality, and there's v1 of products, and then there's battle-hardened, tried and tested three, four year old technology that you know works. What I find interesting is that you guys have come up with some very practical use cases and built something out, and I'll unpack that in a second. One of the questions I get so much is: I don't know where to start with AI. Here's another one, and in fact this was on a call literally five minutes ago with somebody else in the UK: I'd never pay for it, I don't see the value of it. And I bit my tongue, because for me, I do. I've paid for ChatGPT from the day it was purchasable, so when it was $40 US a month, not the $20 it is now. And I challenge myself every day to do a minimum of five prompts. That goes from using Midjourney, which I'm about to move away from (I'm going to go to Stable Diffusion's tools in the space), to ChatGPT, its evolution and the changes I've seen in the past year, and the ability to pre-prompt it so it knows a whole bunch of stuff about me before I even get into whatever my prompt is. And I've bought a bunch of other third-party tools that build out personas, because I'm always looking for how I can become more efficient. The last thing I'll say on this: when I was living in London, and this is 2018, I really dedicated that year to studying and researching five technologies. Power Platform was one of them, but AI was one of them back then, right before all the craze. One of the things I really liked about Microsoft is that they always took a view of the world around what they call practical AI, where Google is like, oh, we've got the chess master of AI, we've got Go and it can beat everything. And actually I've been in the DeepMind offices in Paddington, because my wife used to work for Google back then, so I saw all this kind of abstract AI. And then I saw Microsoft, with their Cognitive Services-type products, taking this practical application approach. And that brings us to today.
So tell us: what have you guys built? What is Just Ask It, and how did you get to Just Ask It? What was the journey to that point? And then let's unpack what Just Ask It is.
Jon Russell : Yeah, so I'll probably take the first bit. It started off with one of the solution architects on our team asking me to create a lab in Dataverse for a domain-specific client. There are various people on our team with different domain knowledge and experience; I'm more on the HR and payroll side, the expenses and invoices side, that kind of thing, I understand that process. But the client that came to us asking for that was a water company, and this was like half four on a Friday. So this solution architect asked me: could you think of a domain-specific example for a water utility company to build something in Dataverse? And that's basically all the information I had. I'd been tinkering around a little bit with AI for most of that day, and I thought, well, maybe I could use this to try and come up with an idea that is domain specific. So that's when I thought of the idea of giving it a prompt, telling the AI system that it is a Power Platform consultant that specialises in Dataverse and in entity relationship diagrams, and its purpose would be to take a domain-specific example and come up with a Dataverse schema including tables, columns, relationships and potentially an entity relationship diagram. But that kind of came second. One of the coolest things about it really, and this is when me and Mike started talking about it, is that I was playing around with a Power Automate cloud flow that was getting triggered by certain text inside a Teams channel, and the response came back as just one paragraph of text, and I thought, oh no, that looks horrible, I need to format it. So I started looking in Power Automate thinking, right, maybe I could format it as a table and all this kind of stuff. I spoke to Mike and Mike goes, just ask it: add another rule, tell it to put the response in HTML format. That was quite a game-changing moment for me, because I realised, actually, yeah, you can just ask it to do stuff. Obviously there are implications with that, which we'll probably come on to in more detail, especially after a chat I had with someone today. But that's kind of where we got to, and then it started evolving into a model-driven app. I think the key to all of this really is that, being a consultant, I have the technical knowledge around Dataverse, relationships and entity relationship diagrams, but I don't necessarily have the domain-specific knowledge that the client needs. I've already demoed this to our pre-sales guys, and I've said, maybe you're on a call with, I don't know, a company that sells horseshoes, and you don't know that domain, do you? They might say to you, well, tell us what the benefits of using Dataverse are. They can use Just Ask It, put that in, and a minute later they've got the Dataverse schema and they can explain that to the client.
So I thought of it as a kind of starting point and a talking point for that initial conversation, to build rapport and trust with the client, and then you expand on that, et cetera. And then the entity relationship diagram stuff happened more and more on Mike's side, so, Mike, if you want to talk about that bit?
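Before Mike picks up the story, here is a minimal sketch of the kind of system prompt and chat completion call Jon describes; it is not the actual Just Ask It code. The resource name, deployment name, API version and key below are placeholders, and in the real solution this call sits inside a Power Automate cloud flow triggered from a Teams channel rather than a standalone script.

```typescript
// A sketch, assuming an Azure OpenAI resource with a chat model deployment already exists.
const SYSTEM_PROMPT = [
  "You are a Power Platform consultant who specialises in Dataverse and entity relationship diagrams.",
  "When given a domain-specific scenario, propose a Dataverse schema: tables, columns with data types,",
  "and the relationships between tables.",
  "Return the response as HTML so it renders cleanly in Teams.", // Mike's "just ask it" rule
].join(" ");

const API_KEY = "YOUR-API-KEY"; // placeholder; store this securely in practice

export async function suggestSchema(domainScenario: string): Promise<string> {
  // Placeholder resource, deployment and api-version; swap in your own values.
  const endpoint =
    "https://YOUR-RESOURCE.openai.azure.com/openai/deployments/YOUR-DEPLOYMENT/chat/completions" +
    "?api-version=2023-07-01-preview";

  const response = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json", "api-key": API_KEY },
    body: JSON.stringify({
      messages: [
        { role: "system", content: SYSTEM_PROMPT },
        { role: "user", content: `Suggest a Dataverse schema for: ${domainScenario}` },
      ],
      temperature: 0.2, // keep the schema suggestions fairly consistent between runs
    }),
  });

  const data = await response.json();
  // The chat completions payload returns the generated HTML in choices[0].message.content.
  return data.choices[0].message.content as string;
}
```

Calling something like `suggestSchema("a water utility company")` would return HTML-formatted table, column and relationship suggestions, roughly the shape of output the flow then posts back into the Teams channel.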
Mike Gowland : So, mike, if you want to talk about that bit, yeah, so I follow up from the conversation with John about actually, you know, just ask it to format it in HTML, that will do it for you. At the time I was sort of going on my own little journey with AI. I was actually looking more. This is, I mean, this is before Copiant Studio was generally available. I was looking at actually, you know, can I get? Can I get it to just give me some JSON? and tell it what the JSON should be and actually, if I give it some context of what I could ask it, it could define the process and then that process could be converted into a JSON array of these different objects and parameters and then can I actually just iterate through that array in Power Automate and say you know what this is going to trigger? A child phone that does this particular action. These are the parameters in the string format. Just split that string up when the flow starts and then take that, ingest that all in, do the action and then rinse and repeat. And I was kind of like I was talking to Chris Huntington about it at the time. I was like this is basically just a copilot, right, but I'm just probably going about it a long way and we've got Copiant Studio coming out when John was showing me just to ask it, which I can't remember what the name of it was originally we were looking at.
Jon Russell : It didn't really have a name; I think it was just Model Driven App 1, wasn't it?
Mike Gowland : And we were thinking about how we could actually display that schema, because at the time it was just a long block of text, wasn't it: these are the tables, these are the columns, these are the data types. And I already knew about Mermaid.js; it's a markup-ish language that allows you to build things like flowcharts, swim lanes, entity relationship diagrams, that sort of thing. And I was like, I wonder if we can just get it to give us that Mermaid schema of that table, or all those tables, and map the relationships out that way. So we just asked it, and it did it. We fire off a second prompt once we get the first one, because there's a flag in the way the flow works: if it's going to generate an entity relationship diagram, do the second prompt and say, this is the schema, can you convert this into a Mermaid.js diagram, or the text for a Mermaid.js diagram? We save that into an attribute in the table, and then we just do some customisation on the model-driven app form with some JavaScript, using the Mermaid.js library, to convert that schema into an SVG graphic, and that's what we display to the end user.
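For readers who want to picture that last step, here is a minimal sketch of the Mermaid-to-SVG rendering Mike describes, not the team's actual JavaScript. The column name new_mermaiddiagram and the erdContainer element are hypothetical, and it assumes Mermaid v10 or later (the promise-based render API) is bundled with a web resource on the model-driven form.

```typescript
// A sketch only: the real customisation may wire this up differently.
import mermaid from "mermaid";

mermaid.initialize({ startOnLoad: false });

export async function renderErd(formContext: any): Promise<void> {
  // Read the Mermaid "erDiagram" definition the flow saved against the record.
  // "new_mermaiddiagram" is a hypothetical column name used for illustration.
  const definition: string | null = formContext.getAttribute("new_mermaiddiagram")?.getValue();
  const container = document.getElementById("erdContainer");
  if (!definition || !container) {
    return; // nothing to render yet
  }
  // mermaid.render compiles the text definition into an SVG string...
  const { svg } = await mermaid.render("erdGraphic", definition);
  // ...which is dropped straight into the page: the "schema text to SVG graphic" step.
  container.innerHTML = svg;
}
```

The design point is simply that the large language model only ever produces text; turning that text into a diagram is ordinary client-side rendering, which is why a second, narrowly scoped prompt plus a small script is enough.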
Jon Russell : And that was a proper wow moment, really, because Mike and I got stuck into this on a Friday evening and then worked on it together on the Saturday and the Sunday, and I think it was the Sunday night Mike's like, oh, you need to come and have a look at this, come and have a look at how it's creating the entity relationship diagram. And even for me... I don't think our wives were too happy, were they?
Mike Gowland : No, no, no no.
Jon Russell : But it was a wow moment, and the last time I got a wow moment was watching The Prestige. It was a great film and I was really blown away by that. Because that's the thing about AI, and that's why Copilot is successful: it's, and I hate to use this term because it's a bit of an Elon Musk term, going back to first principles. And first principles here are being able to have a conversation with someone and getting what you need from natural language. That's the power of this. And, like you said earlier on, it's only the very start of what I think is going to happen. There's that whole thing to do with Project Sophia, which was really under the radar at Ignite, but actually probably one of the more important parts of all of that. I think we're only really touching the very surface of what is potentially coming.
Mark Smith: Yeah. Just on the Project Sophia side of things, do you want to explain what you understand it is, just because there'll be a bunch of people that won't be aware?
Jon Russell : I don't know a lot about it, but as I understand it, it's kind of like a layer above Fabric that you can basically talk to and interrogate data, and get it to display charts and visuals, etc. So yeah, basically Power BI on acid, really, with a bit of natural language programming.
Mike Gowland : Yeah, I've seen it described as, oh God, I've lost what I was going to say. It's described as a business application built AI-first, rather than, I guess, what we've built, which is a business application with a layer of AI on top; it's actually baked into the entire product.
Mark Smith: I just saw a demo recently of a competing product, and it literally built the UI based on the domain that was being discussed with the AI, as well as the context of the person, etc. So, in other words, it might have started off as a chatbot conversation, but the interface morphed into a layout grid of content that could then, as you drill through it, be like a web interface, and then it continued to evolve based on how the discussion was changing, to present the information in the most consumable format for the person engaged with it. And I saw this in the context of being aware of Project Sophia, and I'm thinking about the concept of apps, automation, chatbots, Power Pages, the tooling that we have in the Power Platform: what will they be in five years? Will every interface actually be dynamically constructed? Imagine an interface that was dynamically constructed to present the data in the most optimal format based on whether I had any visual impairment, hearing impairment, tactile, et cetera, but in the context of the data I was interrogating. I often see this even in app design: is a dropdown or a multi-select list the best way to represent a billion records when you're trying to filter? Probably not, I would say. So it's going to be interesting to see how these interfaces literally become liquid in their ability to evolve and engage.
Jon Russell : So, talking about interfaces, there's a guy on our team, Reese Murphy, who has basically helped us start using an application called Lucid, and we use that for a sprint zero engagement, where we will have three or four sessions with the client, a couple of hours each, talking about the risks and the problems that they're facing, the personas that are going to be using the application, their user journeys, et cetera. So I started looking into Lucid's API to see whether or not you can create those diagrams automatically, which you can do. That then got me thinking: hold on a second, we're already having a conversation with the client, i.e. from the horse's mouth (other animals are available). We're recording that on Teams and the transcript is being captured. Plug that into a large language model, then, with Mike's help, orchestrate that to create the Lucid board. I don't know how far away we are from it, but I think there are ways to go literally from a conversation to a proof of concept with a large language model. I think that's possible.
Mark Smith: This is amazing. So Just Ask It, is it up on GitHub?
Mike Gowland : Not yet, no, sorry. We need to really get our asses in gear and sort that out, don't we, Jon? We keep talking about it.
Mark Smith: I'm just wondering, if you can get other people in the community with specialist skill sets to start contributing to this, what this could be. Dare I say it, this could be the future of the XrmToolBox. This could be a whole new way of thinking. The XrmToolBox was built in a time when it was only Dynamics, and it's been amazing tooling, but what you're describing here is a phenomenal game changer. Because if I look across my career: I remember I went to Hong Kong some years ago and I had to pitch to, I think it was, the seventh largest shipping company in the world, a concept to manage all their ships around the world, all their ports of call, all the scheduling, that type of thing. And guess what I knew about shipping? I knew ships go on water, cargo maybe. I knew nothing. So what did I do back then? I jumped on Google, I searched, and then I literally got on a whiteboard, wrote shipping company and did a mind map of all the kinds of things: they obviously have ports that are part of what they do, they have shipping containers, they have different ships which have different sizing, they have crew. That's how I would mind-map out a concept. I'd chuck it into Visio back then, and that would be the start of my ERD, and then I would take any one of those nodes, build out everything I could, and go pitch to the customer. I'm not even showing an interface design or anything, I'm just showing, this is how I understand it, and straight away they're like, wow, and you can do this in your software? And this was, of course, Dynamics back in those days. Fast forward: Hong Kong Jockey Club, dealing with thoroughbred racing, horses, gambling and breeding. Same deal: I'd take a horse at the centre, they're going to have vets, they're going to have health, they're going to have jockeys, and I would do these diagrams out, and the customer would go, wow, you totally understand us, when I knew nothing about racehorses a day before, type thing. Then you tell me about this tool and I'm just going, oh my gosh, this is a game changer. Does it allow you to go down to, you talk about a domain level like water utilities, but could it go even more specific, to the exact water utility, and interrogate their web assets, maybe their directors' reports, maybe any filings they've had to do from a compliance perspective? Could it feed that into the mix, around what you were saying about having those conversations with pre-sales?
Jon Russell : I think that's the area where it's a bit of a bone of contention, because it's their data, right, it's their proprietary data. So there are two...
Mark Smith: In the public domain. In the public domain that's different.
Jon Russell : That's different, yeah. But if it's their private data...
Mark Smith: I'm talking about the public domain, so I'll give you an example. The other day I had a customer in Australia, a government agency, that had, I won't say their name, but they had what's called a royal inquiry into them. In other words, they had been doing some things not too good with public money. Now, all I did was hand their website to AI and say, tell me the top five issues that this organisation is facing. And of course it basically read that Royal Commission report, and it came back and said, they have been told to address this, this and this. So when I jump on the call, I don't say I've done any of that; I'm just like, so what are you doing in this area, and what are you doing in that area? I'm asking these questions which I know a lot of the answers to, but of course they go, oh my gosh, you've really thought about us, you've really understood what we're doing. And then I ultimately had my team model that into a solution around stakeholder relationship management, and they're like, wow, this is exactly what we're looking for. And I'm like, yeah, it didn't exist a week ago.
Jon Russell : So it comes back to this thing where you only know what you know, right. And one of my major things with all of this, and it's definitely an imposter syndrome thing, is that the thing we've built, no offence, it's not hard. I feel like it's not difficult. I think the main issue is, and like you said earlier on, Mark, about having this solution on GitHub, actually the hard work is the grounding of the large language model and coming up with a consensus that this is how we ground all models when we are using them for a certain application, whether that's consulting, or childcare, or a doctor's surgery, or anything like that. That is the difficult part, and that's where a conversation I had today comes in: we were talking about diversity, equality and inclusion, and the large language model doesn't know that. So you have to baseline it and ground it with these prompts where you're telling it what diversity is, you're telling it about race, ethnicity, people who have different kinds of impairments, et cetera. All of that needs to be factored in, and I think that is where the work needs to be done.
Mark Smith: But is that a challenge only of December 2023? What I mean is that, let's say we fast forward one year from today and we're on GPT-5, let's say, as one of the most predominant language models out there. I think we will get to the point where, for things like inclusion and diversity, AI will define them based on everything it's consumed around these scenarios, and it would potentially write a better model, more bias-removed, et cetera, than any human, because it'll understand what bias looks like, it'll understand what racism looks like, and therefore it will, if you like, create a self-healing process around that to make sure it doesn't creep in. You know, you've seen the example where CVs are read and they're like, how come we only employ males in the organisation? Because historically, all the CV data they have given it has been male-centric CVs, and it would go, hang on a second, I understand how the bias crept in, now I'm going to put countermeasures in place against such bias.
Mike Gowland : I mean, one thing to note is that prompt engineering is an iterative process, isn't it? It's not a one-time thing where I've written the perfect prompt and I can now leave it alone. It's something you have to go back and review, and that's something we're actually looking at putting into Just Ask It, aren't we, Jon, where we can actually have feedback on prompts, and that feedback can help a prompt engineer to then iterate on what that prompt actually does.
Mark Smith: Yeah, it's interesting you say that, because what I've noticed in Midjourney is that Midjourney's versions have changed. You can take a prompt that you used nine months ago and, word for word, put that prompt in, and the output is just a night and day difference, right. So you're so right: a prompt that might have worked six weeks ago, if the models have been updated, might not be as relevant or accurate today.
Mike Gowland : And also, with generative AI, it's never the same answer twice, is it? If you start to introduce things like, in this case, temperature, you don't get the same answer every single time, and anybody who uses ChatGPT will know that. If you ask it how to make something, you might actually get a few different recipes if you ask it over and over again. So prompt engineering isn't just about when the model updates; even with the model in its current state, you might need to review that prompt over and over again.
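As a small illustration of Mike's point, here are two hypothetical request bodies (not tied to any particular SDK or to the Just Ask It solution) showing how the same prompt might be sampled for exploration versus repeatability.

```typescript
// A sketch only; the field names mirror common chat completion request shapes.
const prompt = "Suggest a Dataverse table design for recording water meter readings.";

// Higher temperature widens the sampling distribution, so repeated runs will drift:
// different table names, extra columns, reordered relationships.
export const exploratoryRequest = {
  messages: [{ role: "user", content: prompt }],
  temperature: 0.9,
};

// Temperature 0 makes runs far more repeatable (though most hosted models still don't
// guarantee byte-identical output), which suits a prompt that has been reviewed and locked in.
export const lockedInRequest = {
  messages: [{ role: "user", content: prompt }],
  temperature: 0,
};
```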
Mark Smith: Yeah, the grounding discussion, and your feeling around that: how are you proposing to address that?
Jon Russell : So I think something that's come to light today, really, is, so at A&S we have a lot of EDI groups. We have the squirrel club, which is...
Mark Smith: What does EDI stand for?
Jon Russell : Equality, diversity and inclusion. We have the squirrel group, which is for people who are neurodiverse. We have the parent and child group. We have the mental health first aiders group, which I lead, various different groups. And today Mike showed us on a demo a kind of prompts catalogue, and we were talking about that and realised that Mike had split it out by department, so you might have a slightly different rule set for people in the IT department, people in HR, and actually it's bigger than that: it's not really department-based, it's domain-specific. So what we've been thinking, and I said this today, we had an EDI meeting this afternoon with the lead in our people team, is that what we actually need to do is involve all of these groups to provide the grounding rules for the large language model. There's a big shift now, I think, towards people who might not consider themselves tech savvy. We don't actually need the tech-savvy people for this, because we've kind of worked that bit out; we now need the domain-specific expertise that me and Mike, or other tech-based people, might not have. We would like to speak to someone who knows more about dyslexia, for example: what rules do you want to put into the model to ground it, which would improve the response for someone who is dyslexic? That kind of thing. That's where we are with that, and I think that's going to be very powerful. I also think, not so much at the moment, but we know how technology goes, Moore's law, capacitors and transistors getting smaller and smaller, which means a smaller chip that can handle more data, I think there will come a point where we will be able to build our own large language models. Now, I don't think that's close, because of the cost of it, and there are lots of other considerations, but I think that is going to be where the gold is. You could have a consultancy speaking to a client who wants to use AI but is a little bit protective about their data, fair enough, PII data, whatever. Okay, let's help you spin up your own large language model, trained on your proprietary data and secured inside the Microsoft ecosystem within your region. That's gold level, I think. I don't think it's close, because of the GPU issues you're going to have, you've only got Arm and Nvidia, maybe, producing those, and the sustainability aspects of how much it actually costs to run an AI-dense data centre, for example. But in a couple of years' time, maybe, that's where I think it's going to go.
Mark Smith: So the CEO of Stability AI, the Stable Diffusion company, right, he's UK-based, I don't know if you've looked at any of his stuff. I saw a presentation he did three months ago, and he said that within 12 months, so therefore in 2024, everybody will have a large language model on the iOS platform. In other words, the latest version of an LLM will be able to be condensed to under two gigs in size, and everybody will have a personal LLM on their device that you then control all your private data against. And he had some kind of concept that you would have an API into it, but you would control what you give out of your personal LLM on your device, so the advertisers etc. wouldn't be able to get at it. That's why, my guess is, Apple is working on something, because they've been very quiet about AI, right?
Jon Russell : Oh yeah, they're super quiet yeah.
Mark Smith: They are. It wouldn't surprise me, put it that way, if they all of a sudden came out and said, listen, we've got our hardware, etc., and whatever version of phone comes out in the next period can now run that full GPU-type capacity, but on your own device. I've been storing, for over 10 years, all my email, all my receipts; I've got gigs of stuff, photos, my electronic footprint, waiting for this day. Now I'm going to be able to have it all, but not owned by anybody out in the public arena who's going to make money off it, and go, let's talk, and let's understand Mark as a digital version of me, one that understands that 15 years ago I went to a doctor and had this medical treatment, and now I've got a medical file of stuff that I've totally forgotten about. When was my last eye check? When I went to get my X-rays from the dentist, that's fed in there. And now I've got this model of me and the stuff around me that I've collected over my life, and it operates on an LLM, my personal LLM. So I think you're right. I don't know if it's going to be two years away; I think it might be less.
Jon Russell : Yeah, on what you said about Apple there being quiet, I kind of respect them for that, because what's really pissed me off this week especially, because I'm trying to think about AI in an ethical way, is the whole triple-H thing: honest, not harmful, and what was the last one? I can't remember what the other one is. Is it harmless?
Mark Smith: No hallucination.
Jon Russell : No, yeah. But Google, the stuff that Google put out, I'm sorry, as soon as I saw it I was like, this is too good to be true. And then there was that TechCrunch article where they said that it's fake, that Google had faked it, and actually, no, it's not real. Stuff like that just really annoys me, because you've got people who are quite skeptical about AI.
Mark Smith: It's so wrong.
Jon Russell : Yeah, you've got people who are really skeptical about AI and don't want to use it, who think it's going to take over everything, and then you have a company like Google, who are a top two, top three company in the world, and it's bullshit. It's wrong, it's really wrong.
Mark Smith: Totally, totally. And you know, when I first saw those videos that they had produced, what happened is that I listened to a podcast called the All-In podcast, it's, I think, four billionaires that chat about stuff, and they were like, this is the OpenAI killer, and they had the link to that video. And I'm like, okay, that's an unresearched, knee-jerk take, because of course that TechCrunch article showed quite clearly that it's a concept. But it was not pitched as a concept.
Jon Russell : Right, it was pitched as, this is where we are with Gemini right now. I mean, it was basically clickbait, really. That's what it was: clickbait, pure and simple.
Mark Smith: Yeah, 100 percent. And it was interesting that it was done on the birthday of ChatGPT, right, as in within that week, so it was kind of like they were obviously chasing the limelight, and yeah, it's a bit frustrating. You're exactly right, because of the distrust that then creates, the less-than-honest look of the world. Guys, this is one of the longer podcasts I have done; having the two of you on, we're at 45-plus minutes. Any resources that people can look at or go see, any final words you want to add before we close out?
Mike Gowland : So I did do a blog post, a few weeks ago now, about writing your first prompt and doing it in the Power Platform, which actually goes into how to set up an OpenAI developer account, how to get your key token, use a connector and build it into a canvas app. I do need to follow that up and do it with a Power Automate flow as well. If you've not got into it yet, it's a way to go step by step and understand how to write a prompt, what you need to include in it, and how to actually interface that with the Power Platform, specifically canvas apps in this case, but it's a connector that also works with Power Automate. It's just a little starter, really, to get into it and understand what the parameters do, so it's maybe a good starting point. We've also got our video of the AI solution where you can actually see Just Ask It in action, and GitHub coming soon.
Mark Smith: Awesome, I'll get you guys to flick me those links. Just one question I have, actually: you highlighted getting the key from the Azure OpenAI service. Is the OpenAI version 4 model in there, or is it still just 3.5?
Mike Gowland : In Azure OpenAI they've got GPT-4.
Jon Russell : They have yeah.
Mark Smith: Cool, yeah, yeah, I just couldn't remember if I'd seen that.
Jon Russell : Yeah, it's GPT-4 1106. And that's something I kind of learned as well: 1106 is the month and day that that model came out, so the 6th of November, which was the OpenAI Dev Day when they released that version.
Mark Smith: Yeah, gotcha, gotcha, fantastic. Well, gentlemen, it's been fantastic to have you on. I would love to repeat this in 12 months' time and just see where things are. I think it would be awesome to see this on GitHub. I'd love to learn more about it, but thank you for sharing your story.
Jon Russell : Thank you. Thanks for your time, Mark.
Mark Smith: Hey, thanks for listening. I'm your host, business application MVP Mark Smith, otherwise known as the nz365guy. If there's a guest you'd like to see on the show, please message me on LinkedIn. If you want to be a supporter of the show, please check out buymeacoffee.com/nz365guy. Stay safe out there and shoot for the stars.
Jon Russell is a Power Automate geek and lover of all things automation. I am currently employed as a Power Platform Consultant where I am realizing my dream of working inside the Power Platform space as well as contributing to the community and building relationships with the people inside it.
Mike Gowland, a Power Platform Consultant with an unhealthy obsession with Crabs. A tech enthusiast for as long as I could walk. I'm passionate about all things tech. I've long had a passion for development and have finally made this a reality, specializing in Microsoft Power Platform.