Dynamics Corner
About Dynamics Corner Podcast "Unraveling the World of Microsoft Dynamics 365 and Beyond" Welcome to the Dynamics Corner Podcast, where we explore the fascinating world of Microsoft Dynamics 365 Business Central and related technologies. Co-hosted by industry veterans Kris Ruyeras and Brad Prendergast, this engaging podcast keeps you updated on the latest trends, innovations, and best practices in the Microsoft Dynamics 365 ecosystem. We dive deep into various topics in each episode, including Microsoft Dynamics 365 Business Central, Power Platform, Azure, and more. Our conversations aim to provide valuable insights, practical tips, and expert advice to help businesses of all sizes, and their users, unlock their full potential through the power of technology. The podcast features in-depth discussions, interviews with thought leaders, real-world case studies, and helpful tips and tricks, providing a unique blend of perspectives and experiences. Join us on this exciting journey as we uncover the secrets to digital transformation, operational efficiency, and seamless system integration with Microsoft Dynamics 365 and beyond. Whether you're a business owner, IT professional, consultant, or just curious about the Microsoft Dynamics 365 world, the Dynamics Corner Podcast is the perfect platform to stay informed and inspired.
Dynamics Corner
Episode 337: In the Dynamics Corner Chair: Fabric meets Business Central in the Great Lakes
In this episode, Bert Verbeek speaks with Brad and Kris about Microsoft Fabric: what it is, how to use it, and how it integrates with Microsoft Dynamics 365 Business Central. The conversation also clarifies the "Great Lakes" of the title: Data Lake and OneLake. They discuss how Microsoft Fabric is a powerful tool for businesses to leverage their data and make informed decisions.
#MSDyn365BC #BusinessCentral #BC #DynamicsCorner
Follow Kris and Brad for more content:
https://matalino.io/bio
https://bprendergast.bio.link/
Welcome everyone to another episode of Dynamics Corner, the podcast where we dive deep into all things Microsoft Dynamics. Whether you're a seasoned expert or just starting your journey into the world of Dynamics 365, this is your place to gain insights, learn new tricks, and discover what it takes to work with Fabric. I'm your co-host, Chris.
Speaker 2:And this is Brad. This episode was recorded on August 26th, 2024. Chris, Chris, Chris. Fabric. Every time I hear Fabric, I think of clothing.
Speaker 1:What I'm wearing, yes, yes, or sewing, how it connects together, how it weaves together.
Speaker 2:See, this is where some of these names come from, I believe. We also had the opportunity to talk about jumping into the lake. With us today, we had the opportunity to speak with Bert. Hello, good afternoon, how are you doing?
Speaker 3:Hi, good afternoon. I'm fine, thanks. Thanks for having me. How are you, Brad and Chris?
Speaker 2:Ah, everything's going well over here. Chris and I were just talking about how we need a pre-podcast pump-up song or something, because we sit and talk before the podcast, before whomever we're speaking with joins. We need something to rally us, to get us a little excited, especially with the time difference, because it's really early here.
Speaker 3:Something to wake me up here; maybe the song Eye of the Tiger.
Speaker 2:We'll have to try that, Chris, later on. We're always excited, but I want to have an intro or something for us. I can picture a football game or some other sporting event where people are in the locker room or the tunnel waiting to run out, pumping the music, let's go, let's go, or some sort of match. It would be good.
Speaker 3:That always does something, right? If you play the right music. I'm also doing that with sports, indoor cycling: just good music with quite a good beat.
Speaker 2:Yeah, it's good. It even works with workouts. I noticed I listen to techno music, upbeat music, when I work out, because it gives me that energy and drive to get through. Sometimes it's a little difficult.
Speaker 1:Do any of you do that when you're focusing on work and you play music in the background, or at least through your headset, however you listen? Do either of you do that?
Speaker 2:Yeah, I do that often. I listen to music to kind of keep me energized. To be honest with you, I listen to techno music, then sometimes country music for a bit, but I also found that classical music or a symphony is good for focusing.
Speaker 3:Yeah, it's good for focusing. Indeed, that's very correct, Brad. When I need to focus and concentrate on one thing, I always play some classical music with no vocals.
Speaker 2:Yes, it's amazing how it works. It helps you really get into the zone and focus. Bert, thank you for taking the time to speak with us this morning, this afternoon, this evening, whatever it is wherever anyone is. We've been looking forward to speaking with you. Before we jump into it, could you tell everyone listening who doesn't know you a little bit about yourself?
Speaker 3:Yeah, thanks for having me, Brad and Chris. I'm Bert Verbeek. I am 41 years old, married, have three children, and live in a small city in the Netherlands, the same city as AJ Kaufman, I recently discovered. At the beginning of this year we were at a conference in Denmark and we said, maybe we can drive together. And, oh, where are you from? Barneveld? Oh, exactly the same. So we didn't know that for a couple of years we had been living in the same city and never met each other there. I'm working at 4PS as a technical solution architect. I've been there for 12 years, but 17 or 18 years working with Dynamics NAV and Business Central, mostly focusing on the technical side and also the functional side of Business Central.
Speaker 2:That's great. It's interesting, it's a really small world. I come across individuals everywhere who have some sort of connection, in cases such as yours, where you and AJ live in the same area. You said the Netherlands; did you watch the race? Did you go in person?
Speaker 3:No, I didn't go to the race, because the ticket prices are quite expensive. I watched only the start, and that's maybe because last season it was perfect, because Max won and stuff like that, and now, yeah, it's getting a little bit annoying. Maybe that's the orange culture here.
Speaker 2:If you know Max is not winning, you don't watch. Ah, I am so happy when Max doesn't win, and I'm happy to see that others are winning. It makes for a better race. Not to talk all about F1, but there was a stretch of time where, even on the television, they were focusing on the middle of the pack because it was a better race, because Max was always so far ahead. At least yesterday we could talk about it, because by the time this is released, if someone doesn't know the results, they probably won't be watching it. But just to see Norris finish so far ahead of Max made me happy.
Speaker 3:Yeah, indeed, but that was very crazy. It was, I'm not sure, 16 or 17 seconds. It's just crazy how fast Norris has been in the second half of the season.
Speaker 2:Oh, I like it.
Speaker 3:I can see, indeed, that for other countries there's some better competition now, and that's truly better for Formula 1. But yeah, I'm a Dutch guy, so I was hating it.
Speaker 2:No, no, I understand, you have to have loyalty. Just as a side note, I did go to F1 Miami, where Norris won his first race and started this whole trend of Max losing, which makes me so happy.
Speaker 2:Yeah, indeed. Like I said, it makes my day. But to bring it back, one of the things we were talking about that strikes some interest for us, and that I believe you also have some interest in, is this whole Fabric type of stuff. Chris, I'm not talking fabric like to make clothes; I'm talking about a different type of Fabric, even more so because I've seen some articles and some information that Dynamics 365 Business Central can now work with Fabric. I don't know what it's making, but it can work with Fabric. So, Bert, if you could help shed some light on: what is Fabric, what is Microsoft Fabric, and then how does it fit with Business Central, and how does it fit in the Dynamics 365 ecosystem? There's a lot there.
Speaker 3:There's quite a lot, and indeed I can understand that people don't know Microsoft Fabric, because it is quite a new product at this time. I think in May last year Microsoft launched Fabric, and in November last year it went GA. What you mostly see when you build a data warehouse, because Fabric is all about data warehousing and consuming and analyzing your data, is that the integrations are very complex. When you move, or start to move, data from, for example, Business Central or Sales, or you have another solution and you want to move that data to a data warehouse or a database, it is quite cumbersome and difficult, especially when you are upgrading your solutions and one set of data is stored in one database for analytics and another in another database. You want to combine it and you have to move all the data, and there you can see that people are losing interest.
Speaker 3:Even right now, you've got quite good examples in Azure with Data Factory and Synapse Link and all that kind of stuff, and that's what Microsoft Fabric wants to cover: just a very easy way to implement it and to import data inside Microsoft Fabric. That's, in a few sentences, what Microsoft Fabric is: easily import your data, analyze it, and then show it to the world.
Speaker 2:In that case, okay, so is Microsoft Fabric in essence a different type of data warehouse, or a data warehouse in Azure that is intended to provide centralization of data, or easier centralization of data? I hear of this Fabric, and I enjoy the terms because I associate them with many things. Fabric, I always think of sewing and clothing, and then Microsoft Fabric also has another thing: it's called lakes, and OneLake and data lakes, and I think of lakes as swimming lakes. So maybe you could shed some light on the terminology of a data lake, or a lake, and how it relates to Fabric.
Speaker 3:Well, what I always say is that the similarity with a lake is how you can swim in it. In a lake, there are tons and millions of pieces of data. You can swim in it, it's quite large, and you don't know how large it is. What Microsoft wants to cover, and that's also the marketing side of Microsoft, is what they call OneLake, because before Fabric you would see quite a few separate data lakes.
Speaker 3:Maybe you've got a data lake for your finance data, for your marketing data, or your HR data, and with OneLake that is all just stored in one data lake. Underneath OneLake there is Azure Data Lake, but it's called OneLake. So you import the data inside OneLake and everybody consumes that data there, so you don't have to export it anymore to another data lake; you can easily combine it, just from one silo to another silo, and that's the perfect thing about OneLake. As I said, you don't have to have separate data lakes anymore, you can just do it in OneLake, and it's mostly similar to what you have in your OneDrive storage. OneDrive is where you gather everything; you put it in there and everybody can use it. You can share it, you can use it on your own, and that's just what OneLake is for data warehousing.
Speaker 1:In that case, I guess I'm trying to get a bigger grasp, because back in the early days when you were on-prem, you would just stand up a separate SQL database and then you're just like, oh, let's just throw everything in there, pull everything into one SQL database, and then you create an SSRS report or connect some other tools. So now that we're online, is Fabric sort of a replacement for that concept back then?
Speaker 3:Sort of, sort of, because what you see in the SQL database is that it is all structured, it's all structured data, and inside your OneLake you can have unstructured data and structured data; in that case you have got a lakehouse or a warehouse. Unstructured data is just data that you put in there and you really don't know what it is, what type of data is in there. It is all unstructured, and then you are going to convert it from an unstructured data set to a structured data set, and there a warehouse is more, yeah, the same.
Speaker 3:Chris, what you said about an on-prem SQL Server where you then connect all your tools to it, that's mostly what a warehouse is here. And what you can easily do with a warehouse, and also with a lakehouse, is that you can see it like a SQL database on the front end, so you can use your SQL commands and all that stuff there, and you can also connect other tools that can read SQL. But in the backend it is not stored in plain tables; it's in a Delta Parquet format.
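To make the "SQL database on the front end" idea concrete, here is a minimal sketch of querying a Fabric lakehouse or warehouse through its SQL analytics endpoint from Python. The server address, database, and table names are hypothetical placeholders, and it assumes the pyodbc package, a recent ODBC Driver for SQL Server, and a Microsoft Entra sign-in are available.

```python
# Minimal sketch: query a Fabric lakehouse/warehouse via its SQL analytics endpoint.
# Server, database, and table names below are hypothetical placeholders.
import pyodbc

conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=your-workspace.datawarehouse.fabric.microsoft.com;"  # copy from the endpoint's connection settings
    "Database=SalesLakehouse;"
    "Authentication=ActiveDirectoryInteractive;"                 # sign in with a Microsoft Entra account
    "Encrypt=yes;"
)

with pyodbc.connect(conn_str) as conn:
    cursor = conn.cursor()
    # Ordinary T-SQL over the Delta tables the lakehouse exposes
    cursor.execute("SELECT TOP 10 [No_], [Name] FROM dbo.Customer ORDER BY [Name]")
    for no, name in cursor.fetchall():
        print(no, name)
```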
Speaker 2:So, with it being unstructured, I hear a lot about structured and unstructured data, so maybe we'll go all over the place. With structured data, it's normalized, in the sense of how we would think of a customer table, where it's defined that you have these fields, elements, and records in here to come up with a customer. A data lake would be just data that anybody throws in there, as you're saying, because it's unstructured. But how would they get it in and get it out if it's not structured? Is it unstructured in the sense that the application isn't holding the structure, but the user of the data is defining the structure of what they're putting in? Because if I had customer data in one system, could I put it in the data lake, or the OneLake, and then retrieve that information out, or combine it with customers from another system that may have a different structure? I'm trying to visualize structured and unstructured data and how we can associate or view it.
Speaker 3:Yeah, sometimes it is quite difficult, because when you take, for example, Business Central, you've got a customer table, you import that into your lakehouse, and then you say, okay, it is unstructured, even though in Business Central it is structured, because you've got rows and columns and stuff like that. But when you want to report something, you want a better structure on it, so you've got some kind of dimensions on it, what your open amounts are, all those kinds of dimensions you can create there, and that's how you combine it with other sources that you put in your warehouse, where it is structured and where a user can immediately report on it. So it's more like structured data that the user can immediately build a Power BI report on, without any merging or removing columns or creating dimensions.
Speaker 2:This is interesting. How does that work with performance? See, I know SQL rather well, so I'm trying to, for anyone else that may understand or visualize SQL: in SQL you have the rows and the columns, as we discussed, within the tables, and you can have multiple tables within a database. In large data sets there are often indexes for quick retrieval of the data, with the columns that they include and the columns that they're sorted by, clustered and non-clustered indexes. If you have unstructured data, how does it work with performance when you have a large data set? Or is it just magic?
Speaker 3:Yeah, it's always magic. No, no, it is not stored in a table, as you said, with the columns and the rows and stuff like that. It is stored inside OneLake in a Delta Parquet file, and a Delta Parquet file is just optimized for very fast reading, for that performance. A Parquet file is also not based on rows, as I said, but on columns, so you can very easily search, combine columns together and read your data, because you can combine multiple Parquet files and merge them into a structured data table, sorry, a structured data set, that's also based on Parquet files. So it's all based on that particular Delta Parquet format, and that's why it reads that much faster.
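As an illustration of why that columnar format reads quickly, here is a minimal sketch for a Fabric Spark notebook; the table and column names are hypothetical, and the point is simply that a Delta Parquet layout lets the engine read only the columns a query touches.

```python
# Minimal sketch for a Fabric Spark notebook: lakehouse tables are stored as Delta
# Parquet, so selecting a few columns (column pruning) stays cheap even on big tables.
# Table and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()   # Fabric notebooks provide this session already

customers = spark.read.table("Customer")     # a Delta table in the attached lakehouse

# Only these two columns are actually read from the underlying Parquet column chunks.
top_balances = (
    customers.select("Name", "BalanceLCY")
             .where(F.col("BalanceLCY") > 0)
             .orderBy(F.col("BalanceLCY").desc())
             .limit(10)
)
top_balances.show()
```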
Speaker 2:Yeah, so with that it sounds like, as Chris had talked about, where you have information in a SQL server in the early days, just trying to equate it, where you copy the data to a data warehouse: Fabric is meant to be a destination for a lot of data, not necessarily transactional. Because you had mentioned that the files are meant for fast reading, you wouldn't use it as a lake to store your data for your transactions, your day-to-day use; it would be where you would offload your data so that it could be read and queried upon. So that's, in essence, how it's used.
Speaker 1:Yeah, perfect, perfect. Now, with Fabric, I'm again trying to understand this fully, right? When you're implementing Business Central online, people always have that question: how do I report against it, how do I build a data warehouse? So should that be a natural path for organizations, to always consider Fabric to do some reporting, or maybe utilizing Copilot, now that you have this larger Fabric OneLake database? Could you call it a database? Wait, can you still?
Speaker 1:Yeah, I think you could call it that. Okay. And utilizing all the Copilot tools, should that be a natural path for organizations when they get to Business Central online?
Speaker 3:No, I think, and that's what you see also in Business Central, you've got some very good reporting with the analysis views, and currently also with the Excel reporting, and I'm really a fan of that one. I think if you're quite a small company, you can use the Excel reporting, and there are also now some Power BI reports that consume the APIs; that's enough for just a small customer or a small company, I think. If you want more complex data, combining data from Business Central with an HR system, or even if you're using Power Platform with Dataverse, and you want to combine that data, I think then it's really interesting to go to Fabric. So if you just have reporting, and reporting is more, I think, about showing a list or combining two or three tables together, and that's your overview, then I think Business Central right now has the tools in it for that. But if you want to go one step further, then I definitely recommend Microsoft Fabric.
Speaker 2:Yeah. With Fabric and Business Central, it's what we talked about with the tools you had mentioned to make it easier to import data into the lake. I love calling it a lake; I always want to think of swimming and fishing, and that's how I look at it: there's a lot of data in the big lake, the fish swimming around are the data, and we have to drop our fishing pole in to pull out the data. What is built in, or what is set up, with Business Central that allows for the easy importing of data, or exporting, however it goes, into the data lake, or into Fabric, into a data lake or OneLake in Fabric?
Speaker 3:Indeed, indeed, it's sometimes confusing, all those terms. But currently the situation is that you can only consume data from Business Central into Fabric with the APIs; that's the official statement from Microsoft. And if you have quite a big database, with a lot of tables and a high volume of data, then it can be quite slow, I think, because it will affect your service tiers. Mostly, when I look at our customers, they say, okay, I want a two-hour import from Business Central into my Fabric, into my OneLake. And that's quite an interesting story, because we've got one customer that does it every two hours with the APIs, and eventually they said, okay, we have quite a performance drop when it is running. We didn't know what was running, but in the telemetry we saw: hey, you are using those APIs quite heavily, so just turn them off, and the performance went very well.
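For readers who want to picture that API route, here is a minimal sketch, with hypothetical tenant, environment, company, and token values, of pulling one Business Central table through the standard API and landing it in a lakehouse; for larger volumes, the delta-based bc2adls approach described next, or mirroring, is the better fit.

```python
# Minimal sketch: pull customers from the standard Business Central API and land them
# as a Delta table in a Fabric lakehouse. Identifiers are placeholders, and a real job
# would add paging (@odata.nextLink), retries, an explicit schema, and delta handling.
import requests
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()   # provided automatically in a Fabric notebook

TENANT = "<tenant-id>"
ENVIRONMENT = "Production"
COMPANY_ID = "<company-id>"
TOKEN = "<oauth-access-token>"               # obtained via a Microsoft Entra app registration

url = (
    f"https://api.businesscentral.dynamics.com/v2.0/{TENANT}/{ENVIRONMENT}"
    f"/api/v2.0/companies({COMPANY_ID})/customers"
)
resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"}, timeout=60)
resp.raise_for_status()

# Drop OData metadata keys, then write the rows to the lakehouse for reporting.
rows = [{k: v for k, v in r.items() if not k.startswith("@")} for r in resp.json()["value"]]
df = spark.createDataFrame(rows)
df.write.mode("overwrite").format("delta").saveAsTable("bc_customers")
```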
Speaker 3:That's also why, two years ago, two guys from Microsoft implemented bc2adls, that is, Business Central to Azure Data Lake Storage. So it exports data from Business Central to Azure Data Lake, and in deltas. In that case it doesn't do a full export, only the deltas, and it was extremely fast. It just exported CSVs, in this case to Azure Data Lake, which combines them, and you can use it in Power BI and tools like that.
Speaker 3:With that tool, I thought, okay, now that Microsoft Fabric has arrived, why don't we use the OneLake APIs to store the Business Central data inside Microsoft Fabric? So now bc2adls can also move data very fast from Business Central to Fabric, in this case inside your OneLake, so you can consume it there. It's, yeah, not an officially supported tool, and it's a temporary solution, but I'm betting on another thing, and if that becomes official, then I will be crazy about it: mirroring a SQL database. In that case, you very easily say in Microsoft Fabric, okay, I want this Azure SQL Server with this database, with those tables, and it is directly mirrored inside my Fabric workspace. So within a couple of seconds, maybe one or two seconds, if you change your Business Central data, it is there inside your Microsoft Fabric. That would be fantastic.
Speaker 2:That does sound fantastic, and it affords the opportunity to have some enhanced reporting without impacting Business Central, while you still have more or less real-time reporting. As you had mentioned, I'm hooked on the analysis views. The analysis views, with being able to pivot and export and even save those views, are, as we say sometimes, a game changer. It gives users the ability, once you have a list established, and even being able to create pages off of queries, to take that information a bit broader. It cuts down on the need for a lot of these one-dimensional, as I call them, reports, where it's always: look at this information in this view. That's under the application user's control; again, it's important to understand the data, but they can control the information that they're looking at. As you were talking about this, with the data going from Business Central into Fabric, it made me think of Dataverse. Where does Dataverse, or is that an outdated term now, fit in all of this?
Speaker 3:No, no, no, Dataverse is not an old term. It is, yeah, part of the Power Platform solution. So if you're using Power Platform, Dataverse is the database of it, so that's the storage, and there are also some connectors from Business Central to Dataverse. But Dataverse, as you said, is more for the Power Platform, and also, if you have Finance and Operations, it stores the data inside Dataverse, and the same with Dynamics Sales and stuff like that.
Speaker 3:But Dataverse is not a warehouse solution, because it is quite expensive for that. But you can also create a kind of shortcut to move your data from Dataverse inside Fabric. So in that case, what you said, Brad, about your analysis views, that is just a one-dimensional view, and that's really fantastic and really a game changer, I think, especially for the end users. For example, if you have a Power App and you've got your Business Central, and both are represented inside Fabric, you can really do quite a lot of analysis on that. So it is really fantastic just to have your data from the Power Platform combined with Business Central, together with your analysis views.
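Here is a minimal sketch of the kind of combination described here, assuming Business Central data and Dataverse data have already been surfaced in the lakehouse (for example via bc2adls and a Dataverse shortcut); the table and column names are hypothetical.

```python
# Minimal sketch: once Business Central and Dataverse data both land in the lakehouse,
# combining them is an ordinary join. Table and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()     # provided automatically in a Fabric notebook

bc_customers = spark.read.table("bc_customers")       # exported from Business Central
crm_accounts = spark.read.table("dataverse_account")  # surfaced via a Dataverse shortcut

combined = bc_customers.join(
    crm_accounts,
    bc_customers["displayName"] == crm_accounts["name"],
    "left",
).select(
    bc_customers["number"],
    bc_customers["displayName"],
    bc_customers["balanceDue"],
    crm_accounts["industrycode"],
)

# Persist the combined view so Power BI or further analysis can use it directly.
combined.write.mode("overwrite").format("delta").saveAsTable("customer_360")
```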
Speaker 2:So is Dataverse another unstructured database that's meant to be a transactional database, or is it? I don't mean to deviate from Fabric, but now you have my mind going on this, with the Eye of the Tiger song in my head.
Speaker 3:No, no, Dataverse is more, it is a structured database. So it's more, Chris, what you said, the old SQL database, but then in the cloud. There you can create your columns and your tables, with a specific column type and with the schema on it, and store your transactions in there.
Speaker 2:So back to Fabric with Business Central and hopefully, as you had mentioned, we'll get to the point where it could be mirrored in real time.
Speaker 2:You had mentioned that Dataverse isn't the solution for warehousing because it's expensive. I am one who doesn't follow pricing, for a reason: there's no way I could ever understand Microsoft licensing, nor will I ever pretend to; I gave up on that. I focus on the technical and functional aspects of it. But how costly is Fabric? And then also, with Business Central you can have data retention policies to minimize the cost of the database within Business Central. Is it something where, if you move the data over to Fabric, you can report on both? And if you move the data over to Fabric and then also adjust your retention policies within Business Central to reduce the amount of data that you have within your database, what would be the cost difference for a solution like that? And could you report off of both of them simultaneously? I guess you could, because if you had real-time mirroring into a data lake, or OneLake in Fabric, you could look at the data in one view.
Speaker 3:Yeah, indeed, and I'm also not a licensing guy. I think, indeed, if you do that, especially with Microsoft, you'll have a full-day job, maybe more than that, because there are a lot of exceptions, and it's really crazy.
Speaker 1:Just a general comment on the pricing: I thought Fabric is like, it's Azure, right? You just pay by the hour of use, with a certain storage cost, or something like that as well.
Speaker 3:Yeah, you pay, indeed, for the storage and the CUs. So in that case, Chris, that is indeed right. You have an SKU that represents a number of capacity units, and if you multiply it by 24 hours and a couple of metrics, then you get, for example, 170,000 seconds of consumption. So when you are merging or combining reports, sorry, data, it is consuming seconds, and that's where the price is. But what I always say, because, as you agree, Brad, the licensing guide from Microsoft is really great, is that there is a free app that you can install. It's called the Capacity Metrics app; it is from Microsoft, and there you can analyze how many seconds or capacity units something is using.
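To make the capacity-unit arithmetic a little more tangible, here is a back-of-the-envelope sketch; the assumption that an F2-sized capacity provides 2 capacity units, and the per-run cost, are illustrative only, and the real numbers should come from the Capacity Metrics app mentioned above.

```python
# Back-of-the-envelope sketch of Fabric capacity budgeting.
# Assumption for illustration: an F2-sized capacity provides 2 capacity units (CUs).
SECONDS_PER_DAY = 24 * 60 * 60              # 86,400
capacity_units = 2                          # hypothetical F2-sized capacity

cu_seconds_per_day = capacity_units * SECONDS_PER_DAY
print(cu_seconds_per_day)                   # 172,800 -- roughly the "170,000 seconds" figure above

# If a scheduled refresh burned about 5,000 CU-seconds per run (read the real value
# from the Capacity Metrics app), how many runs would fit in a day?
cost_per_run = 5_000
print(cu_seconds_per_day // cost_per_run)   # 34 runs per day, before smoothing and throttling
```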
Speaker 3:So in that case I would say: just do a trial workspace in Fabric and then analyze for one or two months how many seconds you need, and then you can buy it; then you really know. And a special thing, I think, is that when you are mirroring a SQL database, it is mirrored inside Microsoft Fabric and that storage is free. That's the beauty of it, I think, because, and that's only because I'm a Dutchman and free is always good, Microsoft said: okay, you already pay for your SQL storage and capacity, so you don't have to pay for it again on your Fabric side.
Speaker 1:For the mirroring component.
Speaker 3:Yeah, indeed, for the mirroring component.
Speaker 1:Yeah, which is actually really cool.
Speaker 2:It's still all confusing to me.
Speaker 1:It can be, it is.
Speaker 2:No, I understand. I mean, the way it's all charged in Azure is, in essence, by use; it's pay-by-use. But, in my opinion, not to go down this road, I wish there was an easier way to estimate use. It's tough. Again, it's how much gas will I use in a vehicle, or how much electricity if you have an electric vehicle.
Speaker 1:I think people need to consider, like you mentioned, Bert, that where Fabric comes into play is if you have multiple data sources.
Speaker 1:Again, this is the same concept as what it was in SQL, where you just have different data that's coming in and you want to put it in a centralized data warehouse, and that's when business intelligence comes in, by looking at other data that's coming from anywhere in your organization. I think this is when Copilot comes into play, this is when your Power BI comes into play, where they can consume data off of it and you're able to have much more intelligent reporting or intelligent analysis of your business. This is where you get answers to questions that you didn't know you should be asking. Versus: yes, there is reporting in Business Central, yes, you can connect Power BI, but it's usually just a very set amount of data. So it is something to consider. Yes, it can get costly, but I think if you plan it out correctly, it brings a lot of value.
Speaker 3:Yeah, indeed, and I think that's spot on, Chris. It can be costly, but you can also use the data for decision making, and if you don't have that, I think that's also quite costly for your company. If you're making the wrong decisions, that can bankrupt your company, and that is quite a huge cost. But if you spend some money on your data, doing the right analysis, okay, what are my trends and stuff like that, especially when you have a lot of data, I think it can make you quite a lot of money.
Speaker 2:Those intangible costs are sometimes difficult for someone to appreciate. As you had mentioned, you're not measuring the cost of not doing something, because you're not doing it and there's no way to state the efficiency, but you can measure the cost of doing something, because it's going to cost you a price. With Fabric and unstructured data, how does it work? A lot of individuals now talk about AI, Copilot, prompting, getting information. Where does Fabric fit in that space, or does it fit in that space?
Speaker 3:Yeah, it really fits in that space, because you can also analyze the trends in it, and that's where Copilot, but more so data science, comes into place. There's also a part of Fabric, and I haven't quite gotten around to investigating it fully, but with data science, that's the part where you can use AI to see the trends in quite a lot of data that is maybe unstructured. Copilot can also help you there, because you can also program inside Fabric to create merges of tables and stuff like that; that's quite easily doable with Copilot. So Copilot can help you quite easily, and also with AI, the source of all your data is just in your OneLake, so you don't have to go around to multiple data sources; it's just there and you can analyze it.
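As a small illustration of that kind of analysis, here is a minimal sketch of a monthly trend aggregation over a lakehouse table, the sort of notebook code Copilot can help draft; the table and column names are hypothetical.

```python
# Minimal sketch: a simple monthly revenue trend over a lakehouse table, the kind of
# aggregation Copilot can help draft in a Fabric notebook. Names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()   # provided automatically in a Fabric notebook

sales_lines = spark.read.table("bc_sales_invoice_lines")

monthly_trend = (
    sales_lines
    .withColumn("month", F.date_trunc("month", F.col("postingDate").cast("timestamp")))
    .groupBy("month")
    .agg(F.sum("amount").alias("total_amount"))
    .orderBy("month")
)
monthly_trend.show()
```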
Speaker 2:So, with the Copilot, see, I have so many questions that my mind is going all over the place. I need some, you know, Fabric and Copilot to structure the data. I wish I could put my brain into a lake, you know, with all the information we consume.
Speaker 2:To be able to bring the information out, I did just read a book called My Second Brain, which pretty much talked about that: how you can offload data into a place and quickly retrieve it, and then use Copilot to summarize what you're thinking about.
Speaker 2:There's a different strategy, the book was written before Copilot, but it was in essence about how to reduce stress and free up space in your brain by using a second brain, just more or less having a cataloging system that you know how to use so you can find the data fast, almost like your own Copilot, I guess you could say. But I guess, now that you say that, Chris, you could have Copilot go against it.
Speaker 1:I mean, technically, right? I mean, if you drop everything about you in there, then you're quite set, with all the writing and stuff like that.
Speaker 3:Yeah.
Speaker 2:Well, if you do it incrementally, little pieces over time. I mean, you can't try to fill up the library all at once, but if you start cataloging stuff from point A going forward, it could be good.
Speaker 1:Sorry, really quick on that, Fabric and Copilot. This is where I don't quite understand, and maybe I don't know the answer, and maybe anyone listening to us can provide a little bit more, or at least, maybe because I don't use it enough: where does Copilot come into place with your data sets, right? We've been hearing that Copilot is really meant to be conversational; it's not meant for analyzing numbers, you know what I'm saying? It's not meant to do that. It's more of a conversation, it's natural language. So I'm curious how people use it out there when they're using Fabric. Fabric can be used for data warehousing, people use it for data science, Power BI, and so it's all about these numbers and crunching numbers, but then, on the other hand, Copilot's not really designed for that yet. So how do people end up using Copilot when it's just a bunch of numbers in a data warehouse? This is where I'm trying to figure it out.
Speaker 3:Yeah, maybe that's a little bit confusing, but there is also Copilot inside Fabric, and that's more like what we're also using in Business Central and in VS Code: how can I write good Spark or Python code just to create something?
Speaker 1:Oh, I see what you're saying.
Speaker 3:And then you've got the data science part, and in that case you have to create your own model, or you can use another model, but that's not text-based, indeed.
Speaker 1:Okay, that makes sense. So you're using Copilot to build tools, or to figure out how to extract the data. That makes sense now. And then with data science?
Speaker 3:Yeah, you can use models that are already out there, or you can create your own model and enrich it with your data and train it.
Speaker 1:Got it. So they use Copilot to say, hey, I need you to build a Python library for this, or even R, if people still use that. Okay, that makes a whole lot of sense. You just answered my question: where does Copilot come into play? It's just used for how to build your tool. Okay. Yeah, indeed, that makes a whole lot of sense.
Speaker 2:So then, the other point of Copilot, or AI, and I think we spoke about this briefly: now we're talking about Fabric and data and data science and lakes. Prompting. What's your take, or what are your thoughts, on how someone can efficiently prompt AI? I mean, AI is just such a broad term; AI encompasses many things. Maybe, if you use the Copilot for Business Central, or Copilot in the sense of even within your M365 tenant, what is the strategy for prompting to get the proper information out? My mind, to be honest with you, after a few episodes ago, when we had the great conversation about AI and I finally understood vectors and how it all works, you know, now I'm thinking, okay, now that all that information's in there, how do we get it out? I can see, relationally, how it all sits.
Speaker 3:Yeah, I think with prompting, if you create very good prompts, then you can really get good results. But prompting, that is very important, and I'm always struggling with it too, because I thought a prompt should mostly be very simple and small. But currently what I'm seeing with prompting is that prompts are quite huge. So what they always do is put an example in it, or list bullet points: okay, this is what you cannot do, maybe 10 bullet points, and then say, okay, please don't do this, this or that, or use this standard, and stuff like that. And what I do is always a lot of practicing with it, and that's quite hard.
Speaker 2:So you write a detailed prompt telling Copilot what not to return or what not to use, so you have a large prompt. I've always done it like, you know: draw me a picture of a guy planting a garden, in cartoon style, with a black shirt, is what I would typically write, or something. I mean, I use Copilot quite a bit now, even more so for getting information, because sometimes I get a lot of email threads, and I can use Copilot to summarize the thread, and it works great. But now, back in the context of how you mentioned it, do you have an example of how you would use it with a large prompt like that?
Speaker 3:Well, what I did is, probably you know the Business Central Performance Toolkit? Yes. And currently we also have the page scripting. In that case, yes.
Speaker 3:So what I did, and it is not perfect, I have to get the prompting much better, is record with the page scripting tool just to create a scenario. For example, I'm creating a customer and then creating a sales order for that customer. Then you can export it in a readable format, and there I put Copilot on it: okay, create a BCPT scenario out of it. And what I did there is create quite a large prompt: if you see this tag, just do this, or use the test framework for it, and this is an example, an example from a BCPT scenario that I already created, with a lot of AL statements in it, and I put that in the prompt.
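For readers curious what such a prompt can look like, here is a hedged sketch of the structure described here: a role, explicit bullet-point rules including things not to do, a reference example, and the recording to convert. The wording is purely illustrative and not the prompt actually used.

```python
# Sketch of a large, structured prompt: role, explicit rules (including "don't do"
# items), a reference example, and the input to convert. Wording is illustrative.
PROMPT = """
You are an AL developer. Convert the attached page scripting recording into a
Business Central Performance Toolkit (BCPT) test codeunit.

Rules:
- Use the BCPT test framework and emit a single AL codeunit.
- Keep every recorded step in the same order as the recording.
- Do not invent fields or pages that are not present in the recording.
- Do not hard-code record IDs; create the data the scenario needs.

Example of the expected output style:
<paste an existing BCPT codeunit here as a reference>

Recording to convert:
<paste the exported page scripting recording here>
"""
print(PROMPT)
```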
Speaker 2:In that case, what I would love to see at some point is you going from the page scripting tool to using Copilot to create a BCPT scenario; that can save so much time, indeed, over the current version of the page scripting. I mean, I'm a fan of the page scripting tool; it helps with a lot of testing. I'm big on automated testing and I write a lot of tests, and having the page scripting tool helps. Hopefully one day, I'll have to put it on my wish list, we can have some of that automated to create a library of tests there as well, along with, like we have with the automated tests, some sort of runner. But you need to write about that. If you haven't written about that yet, with some examples or even a little video, that would be great.
Speaker 3:Yeah, I presented it before the summer at a conference, and indeed, that's true, I need to write something about it, but I need to do it perfectly. But indeed, it takes quite a lot of time to create a BCPT scenario by hand. As I said, I also love the page scripting, and if it's automated, especially for customers with some customizations, it is really perfect. And I think those examples are quite good for Copilot in that case, just to have some clear statements. The BCPT scenarios are quite the same, with the same syntax and stuff like that, and the structure is the same, so I think those scenarios work quite well. Sometimes for me it's also quite hard to create scenarios inside Business Central using Copilot, because, yeah, it's always text-based.
Speaker 1:Yeah, I mean, I think it's really more important now to understand the prompt structure. There are plenty of online classes that you could take on what they call prompt engineering. As soon as you get the basic concept, you can do a lot of stuff with ChatGPT or Copilot, where you structure it like: hey, here's the scenario, and when you respond, you have to respond in a certain way. And so your prompt can get pretty big, but you get a much better result, rather than it being too creative, like, oh, I'm going to interpret based on what you just said. I can't remember who, but someone had mentioned that you should take some prompt engineering classes, just so you can have a good understanding of the structure.
Speaker 3:Yeah, indeed, and I think probably the next big job this year or next year will be prompt engineering. If you're doing that well, you can make a lot of money with it.
Speaker 2:Yeah, you can do so much with it. It'll be interesting to see where this all goes. It's the hype right now, but I wonder if it'll end up becoming commoditized over time as the dust settles. I guess you could say Copilot will be more of a tool, I think, than a product. That's just always been my thought. There is quite a bit to it.
Speaker 3:Totally agree.
Speaker 2:Well, Bert, thank you for taking the time to speak with us this evening. We do appreciate it, and you shared a lot of information and enlightened all of us a little bit more about what Microsoft Fabric is, how it can be used with Business Central, and the hopes for the future, maybe, if we can get some of that mirroring, as well as the creative tools for AI prompting from the page scripting over to BCPT to create a scenario. I'll have to experiment with that, or maybe I'll catch up with you afterwards and you can share one of your prompts so I can see how you did it.
Speaker 1:I have a quick question for you. Because you work with clients, do you find clients still confused by Fabric?
Speaker 3:Yeah, they are still confused by Fabric, by what it is, indeed.
Speaker 1:Okay, yeah, because they're like, you know, okay, I got Business Central, and now you're asking me to go get Fabric, right? So I'm just curious, from your perspective, how that looks for you, because it is a challenge to educate: what will Fabric do for you? Usually I try to relate it, like, hey, you remember data warehousing? Especially if you have an organization that has a lot of data coming from different places, it makes more sense to do that, and usually they understand it at that point, what it used to be and then what it's going to be in the future. So, just out of curiosity, it sounds like it's still a challenge, customers not knowing what Fabric is. And then you see OneLake; OneLake is another term being used.
Speaker 3:There are a lot of terms, and that's always confusing, especially when they're changing the names; then it gets even more confusing. And, yeah, the bigger customers, okay, they understand it, because they follow it and they have analytics people on board, but the smaller and medium customers, yeah, they really don't know what it is.
Speaker 2:It is difficult to keep up. I mean, the rate of change with technology, to me, is progressing faster, and, as you had mentioned, throw in the name changes, and it's really difficult for someone to understand or appreciate the value of it, with everything in essence being relatively new, even if it's been there for a couple of years, and also not knowing if it will be here tomorrow or not. If you look at a lot of these technologies that are tried, some of them last, some of them fade away, some of them morph into something completely different over time. But it is a good point, Chris, the challenge of: now I have Business Central and now I also need Fabric. And, Bert, your point that someone who's a small to medium-sized business may not even need it; they may have enough with their Business Central to not need it, to not even know about it, where a larger implementation will have the talent available to understand how it could benefit the business.
Speaker 1:I think that's a good point.
Speaker 2:It's a good way to look at it. So, if anyone wanted to reach out to you, Bert, to talk to you and maybe ask you more questions, a little bit more about Fabric or AL prompting, what's the best way to get in contact with you?
Speaker 3:You can find me on LinkedIn, Bert Verbeek; I'm also on Twitter; and I've got my own website, bertverbeek.nl, and there you can also find my contact details.
Speaker 2:Excellent, excellent.
Speaker 1:Are you going to any conferences anytime soon?
Speaker 3:I hope Directions EMEA; I'll be there in Vienna. I think that's probably it for this year.
Speaker 1:Okay.
Speaker 3:And probably you are going to Days of Knowledge Atlanta?
Speaker 1:I won't be able to make it, unfortunately; it overlaps. But I've heard, I mean, I think everyone should be checking it out, especially from a technical aspect, you know, answering questions like, hey, when do you use Fabric, right? I know Brad will be there, so one of us is going to represent.
Speaker 2:Yes, absolutely, I'll be at Days of Knowledge in Atlanta. We're coming up on conference time. I know Chris had asked you about the conferences. It is tough to choose which conference to go to; there are a lot of conferences available, so it's just a matter of finding the right conference that fits your target. With Days of Knowledge, unfortunately, Chris and I won't both be there, but I'll be there, and Chris is also going to...
Speaker 1:I will be at the Power Platform Conference, where I think they'll also be speaking about Fabric and how they use it, so I'll be there. I'm excited about that.
Speaker 2:And I won't be.
Speaker 3:So you'll be representing at that one, and I'll be doing the other one. We'll split those conferences and then afterwards share the knowledge from them.
Speaker 1:Absolutely. Yeah, we'll have to do a recap for Days of Knowledge, but also for the Power Platform Conference. We should do both.
Speaker 3:I think it's really interesting, both of them, because Days of Knowledge is also here in Europe, and I've been presenting there, and I think it's a really good concept: very small, and you can speak to everyone.
Speaker 1:And I've never been to the Power Platform Conference, so this is my first one; we'll see.
Speaker 3:That recap would be very interesting, Chris.
Speaker 2:Yes, Chris, we'll have to. I won't say it, but maybe we could try to do something live. I don't know, we'll have to try to work that out beforehand, and it gets difficult with the internet connectivity at some of these conferences to do something live. But yeah, maybe we can experiment. Well, again, thank you, Bert, we do appreciate your time speaking with us this evening. It was extremely valuable, and any time that anyone spends with us we appreciate, because you don't get that time back. Thank you again. We look forward to talking with you, sir.
Speaker 3:Thanks for having me, Brad and Chris. Thanks a lot. Bye.
Speaker 2:Thank you, Chris, for your time for another episode of In the Dynamics Corner Chair, and thank you to our guest for participating.
Speaker 1:Thank you, Brad, for your time. It is a wonderful episode of Dynamics Corner Chair. I would also like to thank our guest for joining us, and thank you to all of our listeners tuning in as well. You can find Brad at dvlprlife.com, that is, D-V-L-P-R-L-I-F-E dot com, and you can interact with him via Twitter, D-V-L-P-R-L-I-F-E. You can also find me at matalino.io, that is, M-A-T-A-L-I-N-O dot I-O, and my Twitter handle is matalino16. You can see those links down below in the show notes. Again, thank you everyone. Thank you and take care.