Dynamics Corner

Episode 322: In the Dynamics Corner Chair: Business Central AI Hackathon

May 30, 2024 Dmitry, Stefano, and Jeremy Season 3 Episode 322


Unlock the future of AI in ERP systems with our latest episode of Dynamics Corner, where we promise to deliver groundbreaking insights into the integration of artificial intelligence within Microsoft Dynamics 365 Business Central. Join us as we reminisce with our special guests, Jeremy Vyska, Dmitry Katson, and Stefano Demiliani, about the exhilarating AI for Microsoft Dynamics 365 Business Central Hackathon hosted by Microsoft in February 2024. From exploring AI vector semantic search to sharing a love for F1 telemetry and Ferrari customizations, our guests bring their diverse expertise and infectious passion for innovation to the table.

Discover the riveting journey of a dynamic team that set out to revolutionize the sales line copilot during the hackathon. By harnessing semantic search and advanced AI technology, they crafted a powerful sales copilot, seamlessly integrated into Azure SQL, and even developed a Power App for natural language ordering. This episode delves into their collaborative process, the technical hurdles they overcame, and the creative strategies they employed to succeed, offering you a treasure trove of practical insights and inspiration.

But that's not all; we also dissect the nuts and bolts of efficient team communication and collaboration, highlighting the pivotal role of GitHub for code management. You'll gain a deep understanding of how semantic search and vector calculations can elevate search functionalities in Business Central, with practical examples and expert advice from Stefano on SQL optimization. Plus, we explore the myriad ways to engage with AI within the Power Platform, ensuring that whether you're an AI enthusiast or a seasoned Dynamics 365 professional, you'll leave this episode brimming with new ideas and knowledge. Tune in and transform your approach to AI in Business Central!

#MSDyn365BC #BusinessCentral #BC #DynamicsCorner

Follow Kris and Brad for more content:
https://matalino.io/bio
https://bprendergast.bio.link/

YouTube:
https://www.youtube.com/channel/UCiC0ZMYcrfBCUIicN1DwbJQ

Website:
https://dynamicscorner.com


Disclaimer: This podcast episode may contain affiliate links, which means we may receive a small commission, at no cost to you, if you make a purchase through a link. This helps support our podcast.

Speaker 1:

Welcome everyone to another episode of Dynamics Corner, the podcast where we dive deep into all things Microsoft Dynamics, whether you're a seasoned expert or just starting your journey into the world of Dynamics 365. In this episode: innovative strategies, a hackathon, and a peek into the future with AI integration. I'm your co-host, Chris.

Speaker 2:

And this is Brad. This episode was recorded on February 22nd, 2024. Chris, Chris, Chris, I learned something today. Me too, man, I didn't know this existed.

Speaker 1:

I learned a lot.

Speaker 2:

I finally understand this whole AI vector semantic search, and I'm happy we had this conversation. Back in February, Microsoft hosted the AI for Microsoft Dynamics 365 Business Central Hackathon, and with us today we had the opportunity to speak with Jeremy Vyska, Dmitry Katson and Stefano Demiliani about their contribution to the hackathon. Good morning everyone. Good morning. Good afternoon.

Speaker 3:

Hello, good afternoon.

Speaker 2:

And also good future, because Dmitry is with us from the future.

Speaker 1:

From tomorrow.

Speaker 5:

So it's evening, it's a warm evening.

Speaker 2:

Oh, warm. So it's not tomorrow, it's today, but in the evening. Yep, yes. I get all this time stuff mixed up. I guess this is night. It feels like nighttime. Yeah, it feels dark.

Speaker 1:

It's still dark out there, right?

Speaker 5:

Exactly, it's already dark here, so we are on the dark side. So Stefano and Brad will be on the light side today. Yes, exactly.

Speaker 2:

Yes, exactly. Two on the dark side, and I think, well, Jeremy will be on the light side too. So you know. But, um, Forza, Forza, Forza.

Speaker 1:

Forza.

Speaker 2:

Sorry, I said Forza Ferrari. Ah, sorry, yes, Forza. Sorry.

Speaker 3:

Yes, Forza Ferrari, but I'm not a huge fan of car sports. So, yes, Ferrari for Italy is like pizza, but I'm not a big follower of Formula 1 and stuff like that.

Speaker 5:

You don't have so much money yet.

Speaker 3:

Exactly. I'm working for that.

Speaker 2:

You don't have to have a lot of money to watch.

Speaker 1:

To watch. No.

Speaker 2:

To go or to race, you have to have a lot of money, exactly. And I didn't realize how much money was in that sport.

Speaker 3:

To drive you need a lot of money.

Speaker 2:

The drivers make a lot of money. The vehicles are millions of US dollars, I think it's $16 million per, and then what they put on for the tracks is crazy. I don't know if I'll like Ferrari next year, with them changing up the team, but we'll see. Right now, this year, I like Ferrari and McLaren. Those are my two favorites.

Speaker 3:

But, Jeremy, welcome. We worked with Ferrari, I think seven, eight years ago. It was our customer for some parts, and I had a live meeting with them and I saw how they create the interiors of their cars. It's absolutely incredible. When you buy a Ferrari for yourself, if you have the money, you can choose the type of interior you want, the colors, the way the interior is created. You can choose every minimal detail in the interior. It's incredible. They had more than 100 interior colors and options that you can choose from in order to customize your car. So it's incredible.

Speaker 2:

Those cars are incredible.

Speaker 5:

I think if Microsoft would like to talk about Business Central performance during these conferences, they need to have Ferrari as a customer.

Speaker 2:

I was going to mention that telemetry. I never really understood telemetry as much as I did with what they do with F1 racing, because all of the tracking that they do on those vehicles is incredible if you take a look at all of the telemetry and the monitoring that they do.

Speaker 2:

With the slight adjustments, because you're talking fractions of a second in performance gains for them. And I was fortunate enough to go to the F1 race in Miami recently, and you never appreciate how fast those vehicles are moving until you see one zoom by you. It was absolutely incredible. But, Dmitry, Stefano, Jeremy, thank you for taking the time to speak with us this morning. I have a lot of questions for you all and I'm happy that you are all here, because it's a topic that interests me and I was able to follow from afar, not as closely or intimately as you. But before we get into the conversation, would each of you tell us a little bit about yourself? Dmitry?

Speaker 5:

Yeah, hi everyone, I'm Dmitry. I'm an MVP for Business Central and the Central Q creator, and a big fan of AI in Business Central and in general. So, yeah, I would love to talk about the AI hackathon that our team hacked this February.

Speaker 3:

Stefano. My name is Stefano, from Italy. I'm an MVP on Business Central and on Azure. I have worked with Business Central for a lot of years, and with NAV in the past. My work is actually mainly divided between Business Central and Azure stuff. Like Dmitry, I'm quite passionate about AI in general. I have done some AI projects in the past and am currently doing some now. I'm happy that Dmitry was the creator of this hackathon idea, and with Jeremy and Dmitry I think we have done quite nice scenarios that I hope in the future will also be embedded into the standard product. Excellent.

Speaker 2:

One question that I have to ask: did you have pineapple pizza yet with Giulio?

Speaker 5:

Yes, you did.

Speaker 2:

Yes, and did you like the pineapple pizza?

Speaker 3:

No, not too much. I honestly prefer the Italian way. But, yes, we have done the experiment. I think that we will propose it at some future events, but I can confirm that the Italian style is better.

Speaker 2:

Excellent, excellent. We'll have to follow up and have a conversation with the two of you on that. Jeremy?

Speaker 4:

Hey, just rounding out the gang of BC MVPs on the call, since now you two are as well as Christopher and Brad, so there's five of us in the room, a good hand in poker. I've been doing things for BC for a very long time and I'm very much a generalist in lots of areas. I love to dive deep briefly into each thing and find all the different ways that things can be leveraged and used and brought together, mixing and matching different things. So it made me laugh watching you guys talk pizza shenanigans from afar, because Swedish pizza shops have the same attitude of mix and match and see what works. So you know the horror that people fight over with pineapple; I would love to introduce them to some of the ones I love and some of the ones I fear here. Like they put shawarma on pizza, which is a nice Turkish meat, very good. But you can also get pizzas here with bananas and peanuts and things like that. So it's a whole different culinary world.

Speaker 2:

That takes it to a whole new level.

Speaker 4:

Bananas and peanuts Shrimp tuna.

Speaker 3:

Bananas and peas.

Speaker 2:

I think we have to stop there, because now I'm afraid.

Speaker 5:

Yeah, I blame myself. I ate the pizza with the apples just three days ago, oh wow.

Speaker 1:

That's wild.

Speaker 3:

You can write us where you will be in Venice.

Speaker 5:

Exactly. But yeah, don't unfollow me. Yeah, it was just one.

Speaker 2:

I think just the conversation with the thought of apples, bananas and peanuts on pizza totally threw off my focus for a few moments. But, as you had alluded to in the introductions, back in February, February 20th to February 23rd, Microsoft hosted the AI Hackathon for Microsoft Dynamics 365 Business Central to explore the realm of AI within Business Central, and the three of you were on a team, I guess, if you call it that, is that the official word, the team that submitted a contribution to this hackathon. I was able to watch from afar afterwards as you had discussions about your contribution or submission, and I wanted to speak with you about it, to learn more about your contribution, and I have several other questions about that. So, before we start, would one of you tell us what a hackathon is in general, just on a general basis, and then we can branch into what the hackathon for Microsoft Dynamics 365 Business Central was? Not everybody at once.

Speaker 5:

So I, frankly speaking, got an email from the Microsoft team in the beginning of December.

Speaker 5:

So it was their idea to organize this event, which they called a hackathon. The idea was that they had just introduced the Copilot toolkit in Business Central two months before, at Directions EMEA, and they wanted to, you know, spread the word and show how different people, different partners, can use it to create something cool. Not necessarily something that will be used in the product, but some ideas: what can we do with that, what scenarios can it cover? So I got this email in the beginning of December and they asked me how I saw this hackathon going, how they should prepare materials and so on. So I just gave the advice and that's it.

Speaker 5:

So they organized everything by themselves. Maybe afterwards I will describe the idea and how we got to it. But in general the hackathon was intended for someone who was not involved in the AI world at all. So any partner who had not built any AI solutions before could jump into this event, you know, join a team or create their own team, and invent something cool that will work inside of Business Central with the help of AI.

Speaker 2:

Okay, great. So with that hackathon, Microsoft's intent was for teams to get together and create something with AI within Business Central, to explore or come up with ideas for how you could use AI within Business Central, and so the three of you had formed a team. How did you come up with the team? How did you form the team? The three of you are geographically dispersed. I know Jeremy and Stefano, you are a bit closer in time, and then Dmitry, you are several hours ahead of them. So to have a dispersed team like that is not uncommon in the world today, but it's also unique. So how did you come up with the team?

Speaker 5:

I would start about how we come up with the idea. So, because first was the idea and then it was a team.

Speaker 2:

Okay, so how did you come up with the idea?

Speaker 5:

Yeah, so I think it's not NDA to say that we have internal meetings with Microsoft as MVPs, right, I think it's not NDA. We discuss future things that Microsoft is working on, they gather ideas from the MVPs, and we have some, you know, discussions. So we had an internal meeting with Microsoft and they showed us the sales line Copilot, I think it's called.

Speaker 2:

Yes, the Copilot sales lines feature was released in 2024 release wave 1.

Speaker 5:

Yes. So it was released this wave, and we discussed this some months before; they presented the way they did it and we had some hot discussions about their approach. So I proposed one idea, and we talked about that. They stayed with their approach, but we decided to prove to ourselves that the way we see it is possible. So it was a challenge for ourselves. We got this idea because when you ask to create sales lines, internally they get the intent from the user and then search for these items in the database, but they search in a classical way, like a keyword search, and it's not always the best way. That's the way that we have available, so they use that. What we proposed is a semantic search: why wouldn't we use embedding vectors and embedding search for that? We didn't know if it was really possible in Business Central; we had never tried. So we decided to try and use this idea for the BC hackathon. That was the idea. I had the chance to have sessions before with Stefano and with Jeremy at different conferences, and I know that Stefano is very good in Azure and Jeremy is very good at generating ideas for how to optimize business processes, and I had this platform idea.

Speaker 5:

So we decided to gather together and see how we could take my platform idea of semantic search, really challenge ourselves, and simulate how Microsoft could do this in Azure SQL, so simulate platform support. And then we asked Jeremy: okay, we have this cool thing, where can we use it? So Jeremy generated many ideas for how we can use it from the business process point of view. Jeremy proposed to create a sales copilot, which we actually built as well. So we built a semantic search, we implemented it in the platform, in Azure SQL, and we also created a sales copilot. We also exposed it as an API, and Jeremy created a Power App that allows any external user to order anything in Business Central just using natural language, and it was very good at creating sales orders with items that really exist.

Speaker 4:

And I'll not lie, I was a little nervous joining these two. Your technical skill levels with Azure and AI were well beyond mine and very sophisticated, so I was a little nervous being invited to the team. So, Brad and Christopher, I was probably more like you, going, I don't even understand how this works under the hood. Because, you know, the minute I get into the first meeting they're talking about doing, you know, calculus and trigonometry, and I'm going, how does that apply to search?

Speaker 4:

So it was definitely a learning curve, for sure. But, you know, as Dmitry said, one of the things we were thinking about is use case scenarios. Like, I like what Microsoft is doing. I don't know folks who have this experience, but my wife is very AI-skeptic. She is going, what are you doing, putting Copilot in your accounting systems? Are you mad? You know it's generating stuff, that's crazy.

Speaker 4:

But if you think about what AI is for users, it's just a new kind of interface between the computer and the person. Once upon a time you would describe moving your hand to move a virtual thing as magic and weird. So AI is just a new way of talking to Business Central. Copilot is just a new means for it to understand what your linguistic gestures mean. So by the very complicated things that I will stay out of, the how did we do it, by going into all of those very complicated areas of building that up, it meant that it was a new interface where you could just describe naturally what you wanted. And being able to describe naturally what you want means you can do things like hook up Power Virtual Agents to have chatbots that are accessing that interface, that language processing, which means you can expose those contacts and connections to an API and allow users, via virtual chat, to talk to a sales order, or to harvest that information out of an email without having to interpolate it over.

Speaker 4:

You know, have a pre-filled table. Our demonstration example covered things like, if you were selling clothing, you know, I want something warm and green. Well, look through the database and go, okay, well, we've got a sweatshirt and it has an item variant that's green, so that's probably what you meant. And you can't do that with keyword searching. So the wizards in the room here figured out how do we make the AI understand, well, this roughly means this, so now I can interpret that for BC. And that was a sight to behold.

Speaker 2:

Wow, that's incredible. And you were right: at the very beginning, when Dmitry started explaining this, I was starting to scratch my head going, what?

Speaker 2:

So I understand the AI, but the under-the-hood background of it is, um, you know, something I'm still learning, I guess you could say. So your idea for the hackathon, and I'm simplifying it because I'd like to learn a little bit more about it, was to have a sales order entry process that was accessible internally and externally using natural language, just as if somebody were to call up, as you had mentioned, a virtual agent, or call up a customer service or sales representative and place an order over the telephone or ask for what they were looking for. That is impressive.

Speaker 5:

I would say that the initial goal for us here was to implement a new technology in Business Central which is called semantic search, and the sales copilot was an example of usage of this technology.

Speaker 5:

But in general, semantic search will revolutionize the way you do search in Business Central. It is really fast and you can search not with a keyword but with the whole semantics. Like we showed in the hackathon, if you have a list of items, like a table and maybe other things from the furniture category, and you search for "furniture" in the classic search, you will not find anything, right, because you have different items that mean furniture but without that exact word in the name or description. But semantic search allows you to search for furniture just like that: you just search "furniture" and you get everything that belongs to that. You can search for "black furniture" or "dark furniture" and you will see black tables. The technology is really amazing, and it's crazy how easy it actually is to implement from a mathematical point of view. I was blown away by how easy it is to do that.

Speaker 2:

I have to chuckle because it sounds so easy just to put together a search that says dark furniture and it will go through all of my items in my item table and pull up the tables.

Speaker 4:

Yeah, we can discuss this later.

Speaker 2:

Yes, no, I'm interested in discussing that and hearing it. So the hackathon was the opportunity for you to come up with an idea, then create that idea and then submit it. You selected the team, and I think you have a very strong team. As you had mentioned, Stefano's great with Azure and performance, among other things; Jeremy's also great with, as you mentioned, the business processes as well as other Business Central development; as well as you, I know, with your AI, with Central Q, and from the platform point of view. So you have a strong team where each of you had a specific role to contribute to this project. How did you, with everybody being in different countries and different time zones, manage the project? And also, you could only work on it from the 20th to the 23rd. Did you work for three days straight? How did you manage working through this project?

Speaker 5:

So we met actually twice. That's it. Yeah, so we met with the meetings only twice. Okay, three times. The first one to have initial conversations.

Speaker 3:

On the topics.

Speaker 5:

Yeah, so we agreed that we would do this together, and I described the idea and the role of everyone, and then we just had separate work and communication through email. Yeah, we had a WhatsApp group as well, just for very quick conversations, but you know, everyone in this team was doing a separate job, and everyone was really independent in what he was doing. We gathered a second time when everyone presented the result of their part, and then we managed the process of how we would join everything together into one puzzle. And then, the third time, we got together to record the pitch.

Speaker 2:

That's great. See, that right there shows how teams can work together remotely across time zones where you don't need a lot of meetings to get things done. In my opinion, sometimes meetings can slow you down, whereas you have the meetings when they're necessary, they're effective, you know, not just to talk, and then you can all do your work or your tasks and bring them together to meet, as you had mentioned, to record it. So that's a good understanding of how you worked together. Where did you manage all this? You had mentioned you used the WhatsApp group for communication; as far as the management of the code that you were writing or whatever you were setting up, how did you manage all of that? Did you have a GitHub repo, or did you have some other means of managing it?

Speaker 5:

Yeah, we had one GitHub repo for that, but we worked in different branches, and anyway, we saw the result of everyone. So it was not very difficult to manage, because in this team everyone is a professional, and as everyone understood their role and knew what the inputs and outputs should be, I wouldn't say that we ever really managed this; we were just collaborating very easily, because each of us was working kind of independently of the others.

Speaker 4:

A lot of times, when you're working on a tight timetable, you have to work very, very closely together. So in our initial meeting we instead took the approach of knowing that we needed to work separately. Everyone kind of said, this is where my part of it ends and this is what I'll need to give you, or this is what I'll need from you. And that way, you know, I talk all the time about APIs, it's an obsession apparently, but that was kind of our internal API amongst ourselves: where does the responsibility end, what do you need from me as an output? You know, there are great little competitions out there where people set up marble races and each person has a segment of the race, and then you end up with a warehouse full of this beautiful chaos, and the way they do that is everyone agrees to have their part and what the agreed input and output is. So that was a key part to this.

Speaker 1:

It just shows that you guys are professionals, been in this space for a long time To work like that. That's awesome.

Speaker 3:

I think, yeah, we used the communication between us efficiently, so not strictly having fixed meetings or something like that, but we were quite flexible in assigning roles. And then obviously we exchanged emails or something like that in order to have maybe much more information about each other's work. But I think it was a nice experience.

Speaker 2:

It shows a lot, and you are professionals, as Jeremy was talking about. It's what I like to say: everyone stays in their lane. You have a role and a responsibility; you know what, like you said, your input and your output is. You're not really concerned with what goes on in that black box, as they say. It's just, I know I need to have this, give this over to someone else, they will process, do whatever they need to do, and then, with the output, pass it along to where it needs to go. So, Stefano, you had worked with the Azure side of this, correct? Is that what I heard?

Speaker 3:

Yes, mainly because, as Dmitry said before, our idea in the hackathon was not to do something that is already in the product, but to do something that is not present in the product. The product now has features like the sales line search in sales orders, or the search inside Business Central data, that are based on full-text search, and full-text search has lots of limitations. Like Dmitry said, one of the limitations is that you need to search for keywords that are in the data, not for the semantic meaning of the search, and the other is performance: full-text search scans the entire database, so performance, if you implement full-text, is not so great. This was one of the limitations that we flagged to Microsoft when they started with the idea to implement, for example, sales order creation; sales order creation is actually limited by this.

Speaker 3:

So we wanted to show Microsoft two things in this hackathon. The first thing, as explained by Dmitry: we wanted to show that if you implement semantic search, you have better results, because you can search for the meaning of the words, not only for the keyword. That was the first goal. The second goal that we wanted to show, and this was my responsibility in this project, was to show Microsoft that if you do something not only at the AL level but also under the hood, so in the platform, in the database, the performance of the semantic search will fly.

Speaker 3:

So my task on this project was working on that. Starting from a solution that Dmitry implemented fully in AL code in Business Central, I moved this part, at least part of it, into the platform, so into the Azure SQL database, in order to implement vector search similarity inside the Azure SQL database behind Business Central. We can talk in more detail about that if you want, but the result was that we were able to show that the full semantic search implemented at the platform level was extremely fast, faster than what is implemented in the AL language.

Speaker 5:

Brad, can I show the slide?

Speaker 2:

Yes, yes, you may show the slide, and I do have a question as you're sharing it. There should be a share button at the bottom of your screen. What fascinates me with AI, as you're showing your slide, is you're mentioning the semantic search, that we have a table and the table is furniture, but nowhere in the database do we say that a table is furniture within our data. How does the search, or the AI, know that a table is furniture?

Speaker 5:

Right. So you see two implementations. The first one, on the left, is an implementation of semantic search in pure AL, so just a normal cloud extension. And on the right is what Stefano built in Azure SQL; it's like 20 times faster, and you see that it takes, in the SQL implementation, half a second to search through 6,500 entries. So now, actually, how do you search? You see these numbers, 0.31, 1.2 and so on. These numbers actually represent one item, one item in Business Central. One record converts to these numbers, and these numbers are called embeddings. We get these numbers from the Azure OpenAI model and then we save these numbers in a database. We had two different ways of saving this: I saved it as an AL table and Stefano saved it with a SQL columnstore index.

Speaker 3:

Columnstore index, yes.

Speaker 5:

Columnstore index, right. And now, to find similar items, you actually need to find similar numbers, compare one number to another number, and to compare one item with another item you need to do 1,536 calculations, multiplying these numbers.

Speaker 1:

I'm with you, Jeremy. Where does the math come from?

Speaker 3:

I'm following, I promise.

Speaker 2:

Where does the math come into this?

Speaker 3:

Yes, so, very simply speaking... sorry, very simply speaking. So, I want only to continue.

Speaker 1:

Yeah, no, I close the...

Speaker 3:

I close the.

Speaker 5:

I close the...

Speaker 2:

I close the... I want only to continue.

Speaker 3:

Yeah, I close mine. Go, Dmitry, go.

Speaker 5:

Okay, so actually to compare search text with all items that you have, you need to do 15 billion calculations.

Speaker 2:

You see this. That's a lot.

Speaker 5:

Yeah, that's a lot. And in AL it takes 10 seconds, and in SQL it takes half a second. Yeah, I'm done. Wow.

Speaker 2:

Can I play?

Speaker 3:

I want to just quickly explain what that is. Without going too much in depth into the mathematical things: when you convert in AI, you have a vector. A vector is a numerical representation of an entity; you convert an entity to a vector, which is a set of numbers. A vector is like a representation of a piece of information in space. So imagine having two axes, and the vector lies between these axes. Two vectors are more similar when the distance between the two vectors is smallest. So if I have two vectors like this, they are different from two vectors like this.

Speaker 3:

We implemented this calculation that is called cosine similarity, and cosine similarity is a quite standard algorithm in the AI world to calculate the semantic similarity between, for example, two entities. So this is the big part of the calculation. First you create the vectors and store the vectors in a database, and then, when you type "I want to create an order with three desks", the AI calculation needs to translate your sentence into a vector representation, then go through the entire list of vectors that you have stored in the database, comparing the cosine similarity and extracting what you mean by writing this sentence. This was the big part of the calculation. I know that it's not easy.
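
To make the calculation concrete, here is a minimal sketch in Python of the cosine similarity Stefano describes. The short vectors are stand-ins for the 1,536-dimension embeddings discussed in the episode; this is illustrative only, not the team's hackathon code.

```python
# Cosine similarity: a number between 0 (unrelated directions) and 1 (same direction).
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Return the cosine of the angle between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy example: an embedded search text compared against two stored item vectors.
query = [0.31, 1.2, 0.5]
item_desk = [0.30, 1.1, 0.6]
item_chair = [1.4, 0.1, 0.0]

print(cosine_similarity(query, item_desk))   # close to 1.0 -> semantically similar
print(cosine_similarity(query, item_chair))  # lower value  -> less similar
```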

Speaker 2:

No, it's definitely not easy. I'm still stuck on this: so every word is converted into a vector, and then you're looking for the similarities of that vector to determine what somebody means?

Speaker 3:

Yes. Cosine similarity is a value bound between 0 and 1. Zero means that two vectors are orthogonal, so not similar; one means two vectors are extremely similar. So you calculate this similarity between vectors, and the similarity number gives you the result. When you type a search keyword, for example, calculating the similarity between vectors gives you the top results that match your keywords: a semantic match, not a full-text match.

Speaker 4:

If I may try to distill this down, because it blew my mind when I started looking at what they were explaining. I had never heard of vectors at all in the context of language processing, so I was completely confused. So I'm not going to share my screen; I'm going to share a doodle, which is dodgy on whether or not it'll work.

Speaker 2:

I see the doodle. You have a graph. It looks like Okay.

Speaker 4:

We might do two axes of squishy to crunchy and sweet to sour. And you know, the language model knows that peanut butter is sweet and squishy, and, you know, sour is pickles, and that's also crunchy. So when you go to the language model, "I'd like something like a banana, what do you have?", the language model also has in its reference database that a banana is a squishy, sweet treat. So when it looks at the graph to go from the point of origin, how do I go out from there, what is the closest angle, and what do I have in my info database that matches that angle of squishy and sweet? I've got peanut butter, that's a pretty close match, so it's 0.87. And it does that, but it does that on so many different measures, beyond our ability to even remotely visualize in our head those different dimensions.

Speaker 2:

You put that in words that I understand.

Speaker 4:

And hopefully, Stefano, Dmitry, hopefully I just summarized that real simply, but, right?

Speaker 5:

Yeah, exactly. So if it's like a two-dimensional thing, let's say you have one dimension which is fruit; fruit is a dimension, right, and along this dimension you can have bananas, you can have oranges, you can have pineapples. But definitely, um, oh, is an apple not a fruit? An apple is a fruit, yeah, okay. So let's say something like sneakers. Sneakers, yes, definitely not a fruit. Not a fruit.

Speaker 5:

So actually, you can think about how similar sneakers are to a banana, how similar a banana is to a pineapple in terms of fruit, and how similar sneakers are to a banana in terms of fruit. So fruit is one dimension. We can actually do these calculations in a mathematical way, and we find this cosine similarity, which Stefano mentioned. It's like a distance. Distance, you know, it's a number: how close a banana is to an orange, or how close a banana is to sneakers. So we can do this, and this is just one dimension. This vector is 1,536 dimensions; it's what AI gives us. So AI gives us these numbers, in terms of, like, we don't actually know, and nobody knows, but it just gives us these dimensions, and we just do the very simple math on top of that.

Speaker 2:

Very simple math. But to go back, and to bring it back to what you had created, and I have some questions, Jeremy, I may need to go back to the doodle. In Business Central, you created an AI extension or application that has a semantic search to allow you to create sales orders based upon the products and customers, I'm assuming, within your database. Now, Jeremy, on your doodle you had peanut butter, pickles and carrots: peanut butter was soft and squishy and sweet, carrots were crunchy and sweet, pickles were sour and crunchy. If I enter an item into Business Central, that's a table, I just enter, you know, ATHENS Desk, right, and you go back to some of these, you know, and then some of the other standard data that you have. Nowhere in the database do I put that a banana is squishy and sweet. Where does that part come from? That's where I'm stuck. I understand now the comparison and the vector of the numbers, but how do we get the numbers for all of these items if no one in the database tells us what they are on one of those 1,500-odd dimensions?

Speaker 4:

I had been very much confused by that myself, so it's a fair question. And that's where the OpenAI usage comes into play, the Azure AI elements, because that language model is that giant book. Think of it old-style: it's a giant Yellow Pages, where you go, I need to know, what's the number for table? Give me that vector.

Speaker 4:

So when you're creating the ATHENS table and you're creating all those attributes that go along with it, because item attributes are great, we love them, when you create that item with all those attributes, those are the things that the solution Stefano and Dmitry built packages up, all of that BC information about that item and the variants and the attributes, and says, hey, already-established language model, my Yellow Pages.

Speaker 4:

I need the vector for this set of information. And it goes, here's your giant string of, what was it, 1,500 data points. It gives us that. And then what was being done by Stefano's solution, for example, is on the BC side it would store that vector string, and now we just have it, because we've looked it up once. And in an ideal way, what we were demonstrating is that if it could be stored in SQL natively, it would be even faster, because now we've got that set of values; we asked for it once and we stored it. But where those numbers all come from is that when we enter that item info, it goes out to the language model and says, what's the number?

Speaker 2:

Okay.

Speaker 5:

Yeah, so technically, I think you're all familiar with the chat API from Azure OpenAI. The chat API is very simple: you have an input as text, you send it to the chat API, and you receive back AI-generated text. Right, so text to text. There is another model in Azure OpenAI which is called the embeddings model. You send as input the text you are interested in, and the model returns you an array of numbers. So this is the array of numbers, like 1,536 numbers every time. It's an array of a fixed length; you always receive back this fixed quantity of numbers. And then you need to store this and then do the math on top of that.
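
As a rough sketch of the embeddings call Dmitry describes, text in, a fixed-length array of numbers out, here is a minimal Python example against the Azure OpenAI REST endpoint. The endpoint, deployment name, API version and key below are placeholders, not the team's actual configuration.

```python
import requests

ENDPOINT = "https://<your-resource>.openai.azure.com"   # assumption: your Azure OpenAI resource
DEPLOYMENT = "text-embedding-ada-002"                    # assumption: an embeddings model deployment
API_VERSION = "2023-05-15"                               # assumption: any embeddings-capable API version
API_KEY = "<your-api-key>"

def get_embedding(text: str) -> list[float]:
    """Send text to the Azure OpenAI embeddings model and return its vector."""
    url = f"{ENDPOINT}/openai/deployments/{DEPLOYMENT}/embeddings?api-version={API_VERSION}"
    response = requests.post(url, headers={"api-key": API_KEY}, json={"input": text})
    response.raise_for_status()
    return response.json()["data"][0]["embedding"]

vector = get_embedding("ATHENS Desk, black, office furniture")
print(len(vector))  # e.g. 1536 numbers for the ada-002 embeddings model
```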

Speaker 2:

So when you had stored the vector numbers from the central yellow pages, the Azure OpenAI model, in the SQL database for the item, did you build it just on the description? Because items will have an item number, they may have item attributes, they may have item categories, they have all of the fields that we have, right? You have the posting group. I mean, we are all familiar with many of the fields that are on the item, as well as some of the ancillary information, such as bills of materials. How do you build the information for an item when you have all of those data points, I'm trying to sound like I know what I'm talking about here, the data points within the database?

Speaker 5:

It's a really great question, because it's the first question that you need to ask yourself before you start implementing semantic search: what data do I want to search in? In our case we made it very similar to the standard search that we have in Business Central now, where you can bookmark a table and the fields which you want to include in this index. But we also did another experiment: we included in the index not only the item with some of the fields, like description, units of measure, I don't remember the other fields, but also whatever else you want.

Speaker 5:

Yeah, so categories, attributes, replacements. I think it was actually Jeremy's task to provide us the input on what information the index should include, so his app would benefit from that.

Speaker 4:

Yep. So, for example, for our demonstration search we used clothing, because there are lots of apparel companies and you have lots of different sizes and variants, and you might need to know if it's ecological cotton or those sorts of things. So all of that is information that you would want to add to the informational context of how you search. You know, "I need 12 large eco green sweaters." Well, we know that eco roughly means ecological, and one of our attributes was named that. We know that large is a variant. We had some of that in the description. We know that, you know, there was a green item; it wasn't called a sweater, it was like a varsity jersey or whatever. So there are things that made it very similar, because it's using as its comparison point all of that composed information, not only the description but all of the underlying attributes, much in a similar way to how the marketing text kind of pulls from all of the underlying information to build something richer.
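
Here is a hedged sketch of the indexing idea Jeremy and Dmitry describe: compose one text per item from its description, attributes and variants, embed it once, then rank the stored vectors against an embedded query. The field names are illustrative assumptions, and the get_embedding and cosine_similarity helpers come from the earlier sketches; this is not the team's actual code.

```python
items = [
    {"no": "VJ-001", "description": "Varsity jersey",
     "attributes": ["ecological cotton", "green"], "variants": ["S", "M", "L"]},
    {"no": "WS-002", "description": "Wool sweatshirt",
     "attributes": ["grey"], "variants": ["M", "L"]},
]

def compose_item_text(item: dict) -> str:
    """Flatten the fields chosen for the semantic index into a single string."""
    return " | ".join([
        item["description"],
        "attributes: " + ", ".join(item["attributes"]),
        "variants: " + ", ".join(item["variants"]),
    ])

# Build the index once: one embedding per item (stored in a table in a real solution).
index = {item["no"]: get_embedding(compose_item_text(item)) for item in items}

# Query time: embed the request and rank items by cosine similarity.
query_vector = get_embedding("12 large eco green sweaters")
ranked = sorted(index.items(),
                key=lambda pair: cosine_similarity(query_vector, pair[1]),
                reverse=True)
print(ranked[0][0])  # best semantic match, even without the keyword "sweater"
```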

Speaker 2:

Okay, it's all coming to me now. I think you're bringing it down to where I can understand the pieces. So, to do this, you used Azure, you used the platform and you used Business Central, and then you used Power Platform to also consume an API that was in Business Central to be able to create an order. When you were writing this, did you use the AI library? Recently they added what I call the AI library within Business Central; they have the AI codeunits and functions. Did you use that, or did you create your own library?

Speaker 5:

No, so in the AL implementation we used the Azure OpenAI libraries that are part of the system app to get the embeddings. There is also the opportunity in this kind of AI module to do that. But Stefano's implementation was different; it was at the SQL level. In SQL we actually also have the opportunity to call the Azure OpenAI APIs directly, so we used a stored-procedure approach to do that.

Speaker 3:

The idea was exactly this: we have two different versions of the same functionality. One is totally built in the AL language: in AL we call the Azure OpenAI text embedding model for retrieving the embeddings, store the embeddings in a table, and then, still in AL, perform the similarity search. In the second version we do that at the SQL level.

Speaker 3:

So in SQL we have done essentially two main parts. The first is creating a table storing the embeddings and, on this table, creating what is called a clustered columnstore index. The clustered columnstore index is one of the key points of this implementation, because it's a feature that you cannot create from AL; you need to do that in the platform. Creating this type of index on the embedding tables, the clustered columnstore, is useful for storing large fact tables with large dimensions, and it's fast at querying data, retrieving data and so on. So by creating this type of index on the embedding tables, we turned up the performance of vector similarity search a lot. The second part that we also implemented in SQL was the calculation of the vector similarity. So when someone types what they want to search, the SQL implementation calls a function created at the SQL level that calls Azure OpenAI, gets the embedding of the text that you typed, and does the similarity search in SQL.
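
To illustrate the database-side idea Stefano describes, here is a speculative sketch, emphatically not the team's actual implementation: embeddings stored one row per dimension, a clustered columnstore index on that table, and cosine similarity computed inside Azure SQL. The table, column and temp-table names are assumptions; the statements are held in Python strings so they can be reviewed or pasted into a SQL client.

```python
# Assumed schema: one row per (item, dimension) so the vector math can be done with
# plain aggregates; a clustered columnstore index compresses and speeds up the scans.
CREATE_EMBEDDINGS_TABLE = """
CREATE TABLE dbo.ItemEmbedding (
    ItemNo      NVARCHAR(20) NOT NULL,
    DimensionId INT          NOT NULL,   -- 1..1536
    [Value]     FLOAT        NOT NULL
);
CREATE CLUSTERED COLUMNSTORE INDEX CCI_ItemEmbedding ON dbo.ItemEmbedding;
"""

# #QueryVector is assumed to hold the embedded search text in the same
# (DimensionId, [Value]) shape; cosine similarity is dot product over norms.
SIMILARITY_QUERY = """
SELECT TOP (10)
    i.ItemNo,
    SUM(i.[Value] * q.[Value]) /
        (SQRT(SUM(i.[Value] * i.[Value])) * SQRT(SUM(q.[Value] * q.[Value]))) AS CosineSimilarity
FROM dbo.ItemEmbedding AS i
JOIN #QueryVector      AS q ON q.DimensionId = i.DimensionId
GROUP BY i.ItemNo
ORDER BY CosineSimilarity DESC;
"""

print(CREATE_EMBEDDINGS_TABLE)
print(SIMILARITY_QUERY)
```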

Speaker 5:

Yes, and this is the part where we connected AL to SQL. That was Stefano's API: he created this stored procedure that we can then call from AL. And from AL we got this user interface to type the search, and then it goes under the hood, calls Stefano's magic, and returns back the result.

Speaker 2:

This definitely is all magic to me.

Speaker 3:

Why have we done that?

Speaker 3:

Because we would like to demonstrate to the Microsoft team that if something is done at the platform level, so if in the future we have something in the platform for doing this type of similarity search in AL, in the future, I don't know, you can just call a method of an object, I don't know what the logic will be, and the similarity search will be created very quickly.

Speaker 3:

Something similar to what the SQL team announced yesterday at Microsoft Build: they showed some new functions that will be available quite soon at the SQL level to perform similarity search.

Speaker 3:

So in a single instruction you can perform a similarity search by passing a vector, avoiding all the extra stuff that we were actually forced to do. So if in the future Microsoft uses the same features that the SQL team has built and provides them to the AL language, something like that, to perform this type of search, first of all they can implement semantic search natively in the platform, and this will be a huge benefit across the entire ERP, I think, because if they are able to index the entire database, we can have, not a full-text, but a semantic search ability across the ERP. And then, having something done in the platform, not only at the top level like the AL language, but in the platform, we can have performance, and we can have easier things to do in AL, something like that. So this was the goal that we wanted to demonstrate.

Speaker 1:

So now I'm really satisfied.

Speaker 5:

Yeah, so now I'm really satisfied. It means that our idea was proven not only by the Business Central team in Denmark but by the Microsoft SQL team at the corporate level, right?

Speaker 2:

No, that's good. It shows that your thoughts were on the right track, definitely. With the hackathon, how many contributions were there? Do you know how many teams there were?

Speaker 3:

I forgot the number, honestly. I thought I saw a number.

Speaker 5:

I think about 100 teams.

Speaker 2:

Yes, and then with the number of teams, do you know which of the submissions were chosen out of those 100. I know they rated those somehow in the hackathon. Did they release a list of what?

Speaker 5:

What they would call the top? I never saw that. What I saw was only the winner. Also, during the hackathon there were, I don't remember the exact number, but more or less about 100 teams, and they also had two days of pitches so that teams could present their solutions. As each pitch session took only, I think, one hour or an hour and a half, only like 15 or 20 teams per session could show their solutions. So I saw about 30 or 35 solutions during these two sessions.

Speaker 2:

How long did it take you to create this? So the hackathon was from the 20th to the 23rd of February 2024. How long did it take your team to put together your submission?

Speaker 5:

Well, actually, it took us this time. We did have some small proofs of concept and research that we'd done before, but for the main job we tried to be very aligned with the rules of the hackathon. So, you know, our team worked for three days.

Speaker 2:

Did you sleep?

Speaker 5:

Yes, we were. I actually was on vacation at this time.

Speaker 2:

Well, you're on vacation, but did you just work on this, or did you sleep? Or did the AI build it for you?

Speaker 5:

I hope so, but it was really fun just to prove ourselves if it works. So it really gave us energy to do so.

Speaker 2:

Oh, you were sleeping. Oh, good. I can imagine now the excitement of doing that. You had mentioned that the contributors to the hackathon were able to provide a pitch for their submission. Did you create a pitch for your submission?

Speaker 5:

Yes, and that actually was also a fun thing, because on our third meeting with Jeremy and Stefano we discussed how we should record the pitch and obviously we couldn't record it at the same time, so we decided that everyone will record his own area, and then the guys just sent me the recordings and I just put it in one video file.

Speaker 2:

This was a very efficient team. I like it and that's the type of processes I like, where you can take pieces of information, everybody can do it on their schedule and then you can piece it together. It looks like one recording that was done all together. Is that available for anyone to watch, or is that something if I wanted to watch your pitch? That is accessible? Do you have it available online anywhere?

Speaker 5:

I have a plan to publish this after we have this conversation available.

Speaker 2:

Oh, excellent. See, I always like those. And another thing, as we're talking about this: you have all shared a lot of great information when it comes to AI. I could spend all afternoon talking with you all about this, because it's everywhere, and you can't go, I think, five minutes without hearing the word Copilot or AI in some fashion, whether audible, reading or even thinking. Where would you recommend someone go to learn more about AI and how to use AI, and maybe even more specifically, AI within AL or, in this case, the Power Platform? Do any of you have any

Speaker 5:

resources that you think someone could start out with? There are so many resources out there. I mean, I can tell you from my experience, because when I started this research there was no documentation at all, so the only source of truth for this was Twitter. But I would say that you need to understand what you want to build. If you have some idea about what you want to build, that's a great starting point, because if you have this, you can start building it, and while you're building it you can, you know, ask questions to ChatGPT, or Google it, or watch YouTube videos, and you will get the knowledge about your idea, how to make your idea possible. And along the way, I promise you, you will get so much insight about AI. I think that working on something is a really cool way of learning AI. And we have a session at BC TechDays as well.

Speaker 1:

Nice.

Speaker 2:

When is the session for BC Tech Days that you're going to have?

Speaker 5:

So I will have two workshops before TechDays about building your own copilot for Business Central using the AI toolkit, and then we will have a 90-minute session with Haurina from Microsoft, also about building this copilot in Business Central using the toolkit. I don't know exactly, I think the schedule is not prepared yet, but the session is planned.

Speaker 2:

That would be an interesting session, and I'm certain that many would like to attend it.

Speaker 4:

If you're not able to attend, BC TechDays is one of the conferences that is nice enough to typically provide a lot of their content and sessions on YouTube after the event. So if, for any reason, holidays, whatever have you, you can't go to one of the best dev conferences for technical people around in the summertime, then, you know, they're nice enough to share it, so you should be able to catch it afterwards.

Speaker 1:

That's good to know.

Speaker 2:

That's some great information. I will have to watch that. I thought I was going to call up Dmitry one day and he was just going to do it for me.

Speaker 3:

Have a one-on-one training session on how to use AI. If I can add something about AI: I think it's a very large world to explore. There is lots of documentation at the moment because it's quite a hot topic. I think that, strictly speaking about Business Central, and this is honestly my personal opinion, AI in Business Central is currently limited in some aspects, especially because in Business Central we have something like an AL wrapper on top of the standard Azure OpenAI APIs. AI is not only that, and I have personally experienced, in different AI projects that we have done in my organization that span outside of Business Central, that strictly using only the Azure OpenAI APIs and stuff like that is sometimes not enough when you have a large AI project outside the Business Central box. There are lots more functionalities, there are lots more tools; an AI platform is also scalability. So what I personally would like to have in the future in the platform is not only AL wrappers on top of standard APIs, but real-world AI objects to use from the AL language. For example.

Speaker 3:

I'm currently doing some large AI projects, not Business Central-related, and we often use a Microsoft tool called Semantic Kernel. Semantic Kernel, in my opinion, is one of the best AI tools currently in the Microsoft ecosystem, because it abstracts the AI layers.

Speaker 3:

It abstracts, for example, the vector database and things like that. It permits you to have plugins. This is something like what I would like to have in the AL language: not only a wrapper on top of the standard Azure OpenAI APIs, but the possibility to call an object, like in C#, where you do kernel-dot-something. I would like to have something like that in the AL language. So the platform needs to embed and give us some objects that we can use, and these objects should not only call the Azure OpenAI APIs. One of the limitations of the current AI implementation is that we have codeunits on top of the Azure OpenAI APIs, and if you have a large AI project, for sure you will have token limits, rate limit problems, things like that. So there is more to know and to use, in my personal opinion, in a complex AI project than that. But we do now have function calling also.

Speaker 3:

We have function calling. Yes, exactly, we will show some of these features next week in Dynamics Minds. Jeremy, I think, will be there too.

Speaker 1:

Yep absolutely.

Speaker 3:

Yeah, we plan to have an AI track at this conference where we will talk about standard Copilot features, and then we will also have some discussion about what more you can do with the other tools.

Speaker 5:

Sorry, I actually added a nice contribution to the function calling, and Microsoft already merged it to the main branch, so it will be available in one month in the platform, which actually takes the function calling to the next level.

Speaker 4:

Wow, wow. Just to share a resource with the folks that are out here, one of the other things to help with getting started: when it comes to Business Central, my first URL of choice is always centralq.ai, but my second choice is this page right here, which is aka.ms/bcall. I hope people are aware of that one. It is a master reference to a lot of the jumping-off points, and that includes, in this article list, the AI innovation and Copilot section. What that will take you to is all of the Copilot resources that Microsoft has right now for working with AI, not only the Copilot aspect, but also a guide to some of the development steps. So aka.ms/bcall will get you to this list and then you can jump off to other places.

Speaker 2:

It's a good reference, thank you.

Speaker 2:

That's a great resource, a great reference to have. And with that, I'm looking forward to seeing if I can catch a recording of BC TechDays, as well as other future contributions to AI that you all have, as well as the pitch video that will be released after this podcast is released. So, Chris, we'll have to speed this one up, because I'm anxiously awaiting the release of that. But with that, I'd like to thank all of you for your time today, I know your time is valuable, and for sharing the information about this hackathon. Jeremy, thank you for bringing it down to the doodle that I can understand. I now understand more.

Speaker 1:

And that's what.

Speaker 2:

Dmitry and Stefano had said, and I understand the vector now.

Speaker 5:

That's why we had Jeremy in the team.

Speaker 2:

And thank you, Jeremy, because you gave me the support so that I didn't feel so bad that I was a little lost at the beginning.

Speaker 1:

It took me down a rabbit hole, because right when you mentioned Semantic Kernel, I had it up because I was curious about it, and then Stefano brought it up. It's like, oh okay, I'm going to read into this now.

Speaker 2:

Today, yeah. I don't know. But with that, you had mentioned that you'll be at the conference, you have some workshops. Dmitry, how else can someone get in contact with you to learn more about AI, Central Q, the hackathon, or also contact you for any other questions they may

Speaker 5:

have? My Twitter or LinkedIn is Dmitry Katson. My blog is katson.com, and you can use Central Q to ask any of your questions. I'm behind it, I'm typing the answers.

Speaker 2:

Stefano, how about you? How would someone?

Speaker 3:

get in contact with you to learn more about the information that you have provided? I'm always available on social media, so mainly LinkedIn and Twitter, or better called X now. Also my website; you can reach me from my website, that is demiliani.com. Next week I will be together with Jeremy in Slovenia at the DynamicsMinds conference, where we'll talk about these topics. For sure, I hope to be together with these two guys again at other conferences, maybe to talk about these topics again quite soon. But yes, contacting me via social media is the best way.

Speaker 2:

If you're together again with these two great individuals at a conference, I think you need more than a one-hour session or a 90-minute session. I think they should schedule a half-day workshop.

Speaker 5:

No, we just need Jeremy to describe what we mean.

Speaker 2:

Okay. Jeremy can condense it all, which is great. Jeremy, how would one contact you to learn more information about yourself, as well as all the other great contributions you've done for this, as well as the community?

Speaker 4:

Well, I mean, LinkedIn and Twitter are both very popular places with partners. There was a great slide recently at one of the conferences talking about where people go to find info, and it's quite different between users and partners; LinkedIn seems to be a nice little Venn diagram in the middle of that. I'm currently announcing a variety of classes throughout the summer in that space, and the API book and Your First 20 Hours second editions are both coming up shortly. I'm trying to get those ready for TechDays mid-June, so that's going to be fun.

Speaker 2:

Oh great. I appreciated the API book that you released and I'm looking forward to the second edition.

Speaker 4:

You'll get a good laugh on social media, on LinkedIn, at the promotion of the API reference book too, because they did a wonderful thing: incoming documents is "attachments", and attachments is "/documentAttachments". So there's lots of fun to come, and hopefully I will appreciate the extra book sales.

Speaker 2:

That's great. So, everyone, check out the books, as well as contact and see what everyone that was here today is working on, and hopefully we'll see that video soon. Again, thank you all for your time. I appreciate it, and thank you for all that you do. And also, now I feel a lot more comfortable with this whole vector semantic search concept. I don't know if I could do it, but the concept, at least, I understand. Yeah, thank you.

Speaker 2:

Thank you again and have a good day. Ciao, ciao. Thank you, Chris, for your time for another episode of In the Dynamics Corner Chair, and thank you to our guests for participating.

Speaker 1:

Thank you, Brad, for your time. It is a wonderful episode of Dynamics Corner Chair. I would also like to thank our guests for joining us. Thank you to all of our listeners tuning in as well. You can find Brad at dvlprlife.com, that is D-V-L-P-R-L-I-F-E dot com, and you can interact with him via Twitter, D-V-L-P-R-L-I-F-E. You can find me at matalino.io, that is M-A-T-A-L-I-N-O dot I-O, and my Twitter handle is matalino16. You can see those links down below in the show notes. Again, thank you everyone. Thank you and take care.
