
Entity Framework Best Practices - Should EFCore Be Your Data Access of Choice?

Apr 06, 2024
If you listen to Microsoft, it seems like Entity Framework is the only logical choice for data access. Almost all of their data access examples use Entity Framework. Yet if you've seen any of my videos, you may have noticed that I don't use Entity Framework; in fact, I actively discourage people from using it. I get a lot of criticism about that decision, and I also get questions about it: do the performance improvements in EF Core make using Entity Framework finally worthwhile? In this video, I'm going to go over a very basic Entity Framework setup and talk about some best practices to implement and pitfalls to avoid when setting up your Entity Framework.
By the end, you should have a better idea of what it really takes to create and maintain an Entity Framework data access layer in your application, and here's a quick hint: it's more than it seems when you first get started. I'm sure you'll have an opinion on some of the things I'll share in this video, because I'll be sharing my opinion. I love hearing your opinions, so let me know in the comments what your thoughts are. But first, let's agree on some ground rules. Be respectful of each other: angry rants and insults will get your comments deleted, and you may even be banned from the comments section of this channel.
If someone's opinion on how to develop better C# applications makes you angry, I recommend that you really reexamine your priorities in life. Second, if you have a disagreement, clearly state why you disagree. We can all improve our understanding of software development by calmly airing disagreements. Be willing to disagree or even to be wrong; both are okay. I know I've grown as a developer because of things I've done wrong that someone called me out on. Now, if this is the first video of mine you've seen, my name is Tim Corey. My goal is to make learning C# easier, and one of the ways I do that is by teaching context.
There are a lot of tutorials that teach you what to do. I go a few steps further to show you when you should do it, why you should or shouldn't do it, what pitfalls to avoid, and what the best practices are. This video is a great example of how I do that, basically preparing you for the real world. If that is the type of training you are interested in, subscribe to my channel and press the bell icon to be notified when I release new videos, which come out every Monday and Thursday. Finally, in the description there are a lot of links: the source code for this video, a link to sign up for my mailing list, links to join Patreon and get access to my courses, and all the links I mention in today's video.
If you're looking to start a career in C# or improve your C# career, my content can propel you forward faster. Check out everything I have to offer at IAmTimCorey.com, or sign up for my mailing list to hear from me directly. Okay, let's go to Visual Studio and configure our environment. We're going to start with a web project, so we'll create a new project: ASP.NET Core Web Application. We'll call it EFDemoWeb, and the solution name is EFDemoApp.
Click Create, then select Web Application, which is the Razor Pages template, with .NET Core and ASP.NET Core 3.1, no authentication, yes to HTTPS, no Docker. This is just to have some kind of UI to work with our Entity Framework code, so I click Create, and once it loads we can close the welcome page. This is a very basic project you've probably seen plenty of times before; essentially it's just a web app. If we run it, we should get a home page with just a privacy policy link — and there we go, it runs: EFDemoWeb works.
The next thing we're going to do is add a class library, so right-click on the solution, say Add > New Project, and select the .NET Standard class library. It's in my recently used templates, but you can search for "class library" here; just make sure the one you choose is .NET Standard. That's important: not .NET Core, not .NET Framework. A .NET Core library would actually work fine here, but I choose .NET Standard because that is the default library type you should use right now. The reason is that .NET Standard works with .NET Framework projects, with .NET Core projects, and with Xamarin projects — pretty much any project in the Microsoft ecosystem — so it's a very versatile class library, whereas .NET Core is a little more limited and won't work with .NET Framework. So .NET Standard, for now, is the way to go, and we'll call it EFDataAccessLibrary.
Press Create, and we delete Class1. Okay, we have our class library. Should we add a reference to it from the web project? Well, why add a reference before we actually put some classes in it? So let's add a folder to our class library and call it Models, and in the Models folder let's add a class. I was going to start with Person, but let's start with Address first. Should it be AddressModel or just Address? Plain Address is a slightly different naming convention than I normally use, but it lines up more closely with how Entity Framework is typically used, and I try to adjust my conventions to fit the tool I'm working with.
Okay, so let's create a quick model. Type prop and tab twice to get the property snippet. I start with an int Id, and then we'll have a string for StreetAddress, a string for City, a string for State, and a string for ZipCode. Now, if you're in the US, you may be wondering about that last one: a zip code is five numbers, so it seems like it should be a number, not a string. But a zip code can also look like 12345-6789, and that's definitely a string; Canadian postal codes have letters in them, and Japan and other countries have their own formats. Even in the US it isn't really a number, because we have valid zip codes with leading zeros, and a number type would drop those zeros. We also don't multiply, add, or subtract zip codes, so there's no reason to store them as numbers; we store them as strings. Okay, a little side note. So that's an Address model that will hold an address.
Let's add another class and call it Email. It has a public int Id and a string called EmailAddress — that's all. Then we have one more model, and this is the Person model: a public class Person with an int Id, a FirstName, a LastName, an Age, and then a list of Address models called Addresses.
I'm going to instantiate that list right away so we can start adding to it immediately, and then a list of Email models called EmailAddresses that is also instantiated. Okay, so there are three models that contain related information. You have a person, and a person has a first name, a last name, and an age (you wouldn't normally store age as an integer in your database; that's just for example purposes), and then a list of addresses. Why would a person have a list of addresses? Maybe you have a summer house and a winter house, or you live with your father for half the year and your mother for the other half, or you have a work address and a home address — there are a lot of reasons why you might have multiple addresses. The same goes for email addresses: I probably have a dozen that I use actively and more that see occasional use, so a single email address property wouldn't be enough, but a list of email addresses works fine.
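Here's a minimal sketch of what those three model classes might look like, based on the description above (class and property names follow the walkthrough; treat it as a starting point rather than the exact source):

```csharp
using System.Collections.Generic;

namespace EFDataAccessLibrary.Models
{
    public class Address
    {
        public int Id { get; set; }
        public string StreetAddress { get; set; }
        public string City { get; set; }
        public string State { get; set; }
        public string ZipCode { get; set; }
    }

    public class Email
    {
        public int Id { get; set; }
        public string EmailAddress { get; set; }
    }

    public class Person
    {
        public int Id { get; set; }
        public string FirstName { get; set; }
        public string LastName { get; set; }
        public int Age { get; set; }

        // Lists are instantiated so callers can add to them immediately.
        public List<Address> Addresses { get; set; } = new List<Address>();
        public List<Email> EmailAddresses { get; set; } = new List<Email>();
    }
}
```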
Pay attention to that kind of thing when you set up your models: make sure you don't model just one address. I've seen it done before — there's one Address property, then someone says oh, this person has two addresses, so we add another property called Address2, then Address3, and before you know it you're up to Address6, and that gets silly. Don't do that; that's what the lists are for. So now let's start configuring Entity Framework. Right-click Dependencies on our class library, choose Manage NuGet Packages, and install Microsoft.EntityFrameworkCore.SqlServer — we'll configure it to use Entity Framework with SQL Server. There are other options if you want to use SQLite or MySQL or the other providers EF Core supports, but let's go ahead and install the SQL Server provider for Entity Framework Core.
Okay, now that we're doing this, I want to point out that the tooling here is really impressive, and that's one of the things people often get confused about when they hear me talk about Entity Framework. They think I'm not a fan of the tools, and that couldn't be further from the truth. The tools are great: they let you quickly generate powerful data access code with very little work, and they provide quality features like database source control, rollback, and a lot more. So when we talk about the tools here, I'm genuinely impressed; the tools are not what I have a problem with. As we go forward, try to separate the tools themselves from who uses them and how — that's where my problem lies: not what the tools do, but how they are most often used and how much you have to take into account to use them in the right way, or as efficiently as possible, let's put it that way. Okay, so let's right-click our class library and create a new folder called DataAccess, and in here I'll create a class.
We'll call this class PeopleContext — Context is a pretty common suffix for Entity Framework — so public class PeopleContext, and it will inherit from DbContext; Ctrl+period to add the using statement. Now we have our PeopleContext inheriting from DbContext. The first thing I'll do is create a constructor for PeopleContext: I pass in DbContextOptions, call the parameter options, and then call base(options). What this does is give us a constructor, if you will, that also calls the base constructor with whatever options were passed in. We're not going to put any code in the constructor body, but we need to have it.
Now, this is where we essentially configure our tables. We add a public DbSet of type Person — Ctrl+period to add the using for our Models namespace — and we'll call it People. It's a property, so use the prop snippet and make it a DbSet. Then a DbSet of Address called Addresses, and a DbSet of Email called EmailAddresses. So there are, essentially, our three tables; you can think of these as our data sets.
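Put together, the context class looks roughly like this (a sketch assuming the model classes and namespaces used above):

```csharp
using Microsoft.EntityFrameworkCore;
using EFDataAccessLibrary.Models;

namespace EFDataAccessLibrary.DataAccess
{
    public class PeopleContext : DbContext
    {
        // The options (provider, connection string, etc.) come from
        // dependency injection and are handed to the base DbContext.
        public PeopleContext(DbContextOptions<PeopleContext> options)
            : base(options) { }

        // Each DbSet becomes a table when a migration is created.
        public DbSet<Person> People { get; set; }
        public DbSet<Address> Addresses { get; set; }
        public DbSet<Email> EmailAddresses { get; set; }
    }
}
```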
It doesn't take much to set this up, and that's almost all we have to do on the code side to have Entity Framework configured nicely. The last piece of code is in our front end, and here is where I have my first moment of apprehension, my first "I'm not sure I like that" moment: we have to finish the Entity Framework configuration in a separate project — the UI. There have been some changes recently; I think they have improved this, and you may be able to get away with keeping it out of the front end. They have been working on it, but I haven't seen where it stands; if you know how to do it without touching the front end, I'd love to hear it. The way to do this is to add a reference to your library in the front end, and then configure it in Startup. In ConfigureServices we add our database context: services.AddDbContext, then PeopleContext — Ctrl+period to add that using statement — then an options arrow function, and inside it we configure the options: options.UseSqlServer.
Notice the using Microsoft.EntityFrameworkCore at the top: the front end has to know about Entity Framework Core. Again, I'm not a big fan of that — the front end now knows that we're using Entity Framework Core and how we're using it. For the connection string we call Configuration.GetConnectionString, and I'm just going to use one named Default. So what is this doing? It adds our database context to our dependency injection system and configures it to use SQL Server, with the connection string pulled from our configuration, which is the appsettings.json file. Now you need to add that section — ConnectionStrings, plural — and in there we can add Default with our connection string.
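In Startup.cs, that registration looks something like this (a sketch assuming ASP.NET Core 3.1 Razor Pages and a connection string named "Default"):

```csharp
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.DependencyInjection;
using EFDataAccessLibrary.DataAccess;

// Inside the Startup class of the EFDemoWeb project.
public void ConfigureServices(IServiceCollection services)
{
    services.AddRazorPages();

    // Register the EF Core context with dependency injection and point it
    // at SQL Server, using the "Default" connection string from appsettings.json.
    services.AddDbContext<PeopleContext>(options =>
    {
        options.UseSqlServer(Configuration.GetConnectionString("Default"));
    });
}
```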
Now, I'm going to use LocalDB. If you open the SQL Server Object Explorer, you can probably find a LocalDB instance, and the nice thing is that if you already have a database there, you can borrow its connection string: select an existing database — I have a throwaway one called DeleteMe — choose Properties, and one of the things you'll see is the connection string. You can double-click it, copy it, and paste it into appsettings.json, and it gives you everything you need — in fact, more than you really need. You don't need all of those key/value pairs; all you really need are three pieces of information: the Data Source, the Initial Catalog, and Integrated Security=True. So let's talk about those three things.
The Data Source is the location of your LocalDB instance. You can type it by hand — you don't have to copy and paste it like I showed you — but it will be different depending on your environment, and this is mainly what trips people up: they copy my connection string and then wonder why it doesn't work. The connection string is based on your LocalDB installation, and if you don't have LocalDB installed, it won't work at all. If you go to Tools > Get Tools and Features —
I won't do it here because it takes a while to load — that opens the Visual Studio Installer; look for SQL Server Data Tools as one of the options, check the box, and that will install LocalDB for you. From there, go to the View menu and select SQL Server Object Explorer to open the window you see here, and then you can connect to a server: see this little Add Server button here, click it, and it will search for instances locally.
If I select this (localdb)\MSSQLLocalDB instance, I can hit Connect and it connects automatically, or you can find other servers on the network. Once you connect, you'll see the name right here: (localdb)\MSSQLLocalDB. That's my server name — everything from the opening parenthesis through LocalDB — and that's what goes in the Data Source. Next we have the Initial Catalog: this is the database name. Now, with Entity Framework Code First — which is what we're doing: write the code first and generate the database from it — you start without a database.
You don't have a database yet, so we don't want to point at an existing database like DeleteMe; we want to create our own. Let's create an EFDemoDb database — first let's check that there isn't already one by that name; there's an EFContactDb and an EFData and that's it, so there is no EFDemoDb, which is good because we're going to create it. So the Initial Catalog is our new database, EFDemoDb. Finally, Integrated Security=True means that whoever is logged into the computer is logged into the database automatically with their Windows credentials. The database never sees your password; you're using your authentication token, essentially — that's a somewhat simplified version. It's saying: hey, you're already logged into Windows, you told Windows who you are, Windows vouches for you, and we'll take that as proof of who you are, so that identity becomes the login to the database. That's great for a demo environment, but unless you're on a local network with Active Directory, you probably can't use it in production; in production you'll most likely use a username and password for your SQL Server.
I'll come back to that in a moment, but it's a pretty common thing — in fact, you should expect it almost always: when connecting to a database you usually connect with a username and password, not integrated security. We'll talk about the downsides Entity Framework has specifically around using a password, but just note that most database connections use a username and password unless they can use integrated security on a local network. Okay, with that we have a connection string and our database context registered in our services, and from a code perspective we're pretty much ready to start using Entity Framework — but we're not done yet.
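Putting those three pieces together, the appsettings.json entry ends up looking something like this (your Data Source will be whatever your own LocalDB instance reports, so treat this as a sketch):

```json
{
  "ConnectionStrings": {
    "Default": "Data Source=(localdb)\\MSSQLLocalDB;Initial Catalog=EFDemoDb;Integrated Security=True"
  }
}
```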
Now we need to create our actual migration scripts and then create our database. What are migration scripts? First, a quick modification: let's remove Age from the Person model and pretend we forgot to add it — we'll come back to that in a minute. Now let's create our migration script for the data access library. Open the Package Manager Console; if you don't see it, go to the View menu, then Other Windows, and it's in there. It has access to the Entity Framework tooling, so you can run commands like Add-Migration followed by a name — this first migration is the initial creation of the database. I hit Enter and... oops: "add-migration is not recognized as the name of a cmdlet, function, script file, or operable program." Why? Because we haven't added the tools to the project.
So in the web project — here's one of those places where I have to jump around a little — right-click Dependencies, Manage NuGet Packages, and search for Microsoft.EntityFrameworkCore.Tools. Those are the tools for the Package Manager Console in Visual Studio, so we'll install them, and that will let us add the migration from the console. Once that's done, go back to the Package Manager Console and run the same command again. It builds the project first and makes sure it compiles — if your project doesn't compile, this won't work. Now it fails differently: it says your startup project, EFDemoWeb, doesn't match your migrations assembly, which is EFDataAccessLibrary. This is the next area of friction for me: by default, Entity Framework really wants to live in your UI project. It loves to be built right into the UI; in fact, many tools around web projects assume Entity Framework is right there. If you've ever worked with the idea of separation of concerns or loosely coupled applications, or read any of my content, you
know that that's not great, because then your project is your UI — it's your entire project, it's everything. If you decide to change your UI, you pretty much have to rewrite your whole project, and that's not good; sure, you can copy and paste some code, but replacing your UI becomes a major overhaul. Whereas with separation of concerns, your UI doesn't care where its data comes from — it just says "give me data," and the data access layer hands it over — so replacing the UI means swapping only that top layer, not your business logic and not your data access code.
That's where I'm not a big fan of how this was originally designed and built, though there have been improvements in being able to separate the two. In this case I put the Entity Framework context in a class library, so as soon as I change the Default project dropdown in the Package Manager Console to EFDataAccessLibrary, this should work — and there we go, it's done.
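The Package Manager Console workflow described above boils down to something like this (the migration name is just the one used in this walkthrough; Update-Database comes a little later, once the generated migration has been reviewed):

```powershell
# Run from the Package Manager Console, with EFDemoWeb as the startup project
# and "Default project" set to EFDataAccessLibrary.
Add-Migration InitialDatabaseCreation

# Applied later, after reviewing the generated migration:
Update-Database
```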
But it's these little things that actively encourage you to put everything in the UI project, because keeping it out of the UI is a little harder, and that is one of my concerns. And here is my biggest concern with Entity Framework: you have to know a lot to be an Entity Framework developer. In fact, I would say you need to be an advanced user before you start using Entity Framework. The problem is that Entity Framework is so easy to get started with that it gets treated as a beginner-level data access tool, and that is dangerous, and that is where I have the problem. Remember, earlier I said the tools are great — the tooling really is great. The problem is that your design needs to be in the hands of an advanced developer, not a beginner. That's not gatekeeping; I don't like gatekeeping.
I never want to say you can't use something because you're not in some special club. What I'm saying is that it's very, very dangerous to use Entity Framework until you understand some of its advanced parts, so let's dive into one of those advanced parts: the migration script. A migration script is how Entity Framework builds its database. It starts from no database and migrates to a database, and then, as you make more changes, it migrates from the current state of your database to a new point in time. That's what these migration scripts are.
Notice that we now have a Migrations folder we never created, and in it a new file that starts with 20200111033618_, followed by the name I gave the migration. So this is a migration: automatically generated code based on our database context — our PeopleContext — and the DbSets inside it. What's in it? Two methods — and this is great; again, the tools are great and I love the capabilities they give you — one called Up and one called Down. Take a guess
what those do. The Up migration takes you from where you are to where this migration wants you to be; the Down migration is a rollback, so if you've applied the Up and go "oops, that doesn't work," you can go back using the Down, and it undoes what you did. That sounds amazing, and the reality is pretty amazing; however, the Down command is not all roses, because in our example the Down command for this migration script is: drop this table, drop this table, and drop this table — basically dropping every table in our fledgling database. That's fine if you just created them, but if you already had data in them, that data no longer exists. Rollbacks are not free; there may be times when you lose data because you roll back, so know that it is not a magic solution — be very careful with rollbacks; they are not to be done lightly. Okay, so that's the Down command: drop these three tables — that's the rollback. Now let's see what it does to create those three tables. We create the People table, and it lists columns — and all this code was written for me, which is great: I don't have to write all this syntax or even know it all. Sometimes people say you don't even have to look in here, because who cares — but here's the deal: this goes into source control. Do you know why it goes into source control? Because you have to maintain it.
Yes, someone else wrote this code for you, but it is your responsibility to understand it, because this is your data, not Microsoft's; this is your database, and if something goes wrong, you're on the hook, not Microsoft. So let's read the People table. We have an Id: type int, nullable: false — good — with an annotation that says SqlServer:Identity, "1, 1". What does that mean? It makes this an integer column (we declared it as an int), and it makes it an identity column because we named it Id, and that identity column starts at one and counts up by one automatically. That's perfect; it's exactly what I would do, so the fact that it did that just because I had an int named Id is great — I love it. The next columns, FirstName and LastName, are strings, and they are nullable. That's not ideal, because I don't want first name or last name to ever be null.
We'll fix that in a minute; just keep in mind that these are the default constraints. Next we have a primary key, and it's based on Id — so the Id column became an identity column and the primary key. Cool; exactly what I would do. The Addresses table is the same story: the same Id setup, StreetAddress is a nullable string column, and so are City, State, and ZipCode — all string columns, all nullable again.
I would make those non-nullable, but otherwise it looks good. Then there's a PersonId column — wait, we don't have a PersonId property on the Address model, so where does that come from? Well, the Person model has a list of addresses, so EF says: you have a list there, so we'll put the Id from the People table into the Addresses table so the rows link up. But notice it's a nullable int, which means an address is not required to have a PersonId — and that's a problem, because we could end up with an address that doesn't
link to anything. So that's another issue to fix. Below that, for the same Addresses table, we have a primary key — great — and a foreign key. The foreign key says: link the PersonId column in the Addresses table to the Id column in the People table. That's almost exactly the naming convention I use, which again is awesome — part of that comes from using a standard naming convention, so your convention matches everyone else's. The foreign key also says onDelete: Restrict, which means you can't delete a person who still has a linked address. Again, great. EmailAddresses is the same basic story: an Id that's the identity, an EmailAddress that's a nullable string — which again makes no sense, because if the email address is null, why have the row at all? It's the only real data in the table; the rest is the record Id and the Id of the person it links to, so there's no reason to allow a null email address, but yes, it's marked nullable. Then the primary key and a foreign key to the People table again. We also get indexes on the PersonId column in both the Addresses and EmailAddresses tables, which make it faster to query data based on that Id — also quite good. That's everything created in our first migration script.
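To make that concrete, here's a heavily trimmed sketch of roughly what the generated migration looks like, showing just the People and Addresses tables (the real file also creates EmailAddresses, and the exact defaults depend on your EF Core version):

```csharp
using Microsoft.EntityFrameworkCore.Migrations;

public partial class InitialDatabaseCreation : Migration
{
    protected override void Up(MigrationBuilder migrationBuilder)
    {
        migrationBuilder.CreateTable(
            name: "People",
            columns: table => new
            {
                // int Id becomes an identity column starting at 1, counting by 1.
                Id = table.Column<int>(nullable: false)
                    .Annotation("SqlServer:Identity", "1, 1"),
                // Strings default to nullable nvarchar(max).
                FirstName = table.Column<string>(nullable: true),
                LastName = table.Column<string>(nullable: true)
            },
            constraints: table => table.PrimaryKey("PK_People", x => x.Id));

        migrationBuilder.CreateTable(
            name: "Addresses",
            columns: table => new
            {
                Id = table.Column<int>(nullable: false)
                    .Annotation("SqlServer:Identity", "1, 1"),
                StreetAddress = table.Column<string>(nullable: true),
                City = table.Column<string>(nullable: true),
                State = table.Column<string>(nullable: true),
                ZipCode = table.Column<string>(nullable: true),
                // Generated from Person.Addresses; nullable, so an orphan address is possible.
                PersonId = table.Column<int>(nullable: true)
            },
            constraints: table =>
            {
                table.PrimaryKey("PK_Addresses", x => x.Id);
                table.ForeignKey(
                    name: "FK_Addresses_People_PersonId",
                    column: x => x.PersonId,
                    principalTable: "People",
                    principalColumn: "Id",
                    onDelete: ReferentialAction.Restrict);
            });

        // Index to speed up queries that filter or join on PersonId.
        migrationBuilder.CreateIndex(
            name: "IX_Addresses_PersonId",
            table: "Addresses",
            column: "PersonId");
    }

    protected override void Down(MigrationBuilder migrationBuilder)
    {
        // Rolling back drops the tables -- and any data in them.
        migrationBuilder.DropTable(name: "Addresses");
        migrationBuilder.DropTable(name: "People");
    }
}
```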
Now, before we continue, I'll go back to the Person model and say: oh, you know what, I forgot — I wanted to add Age. Let's put Age back in. What do I do now that Age isn't in the table? Well, I open the Package Manager Console and say Add-Migration again, but I give it a new name — AddedAgeColumn — and it creates a new migration. So now we have a second one, and its Up says: add a column called Age to the People table, not null, with a default value of zero; its Down removes the Age column from the People table. So now I have two migrations, and we'll apply them in just a minute.
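That second migration is short; it looks roughly like this (again, a sketch of what gets scaffolded):

```csharp
using Microsoft.EntityFrameworkCore.Migrations;

public partial class AddedAgeColumn : Migration
{
    protected override void Up(MigrationBuilder migrationBuilder)
    {
        // Existing rows get the default value of 0 for the new column.
        migrationBuilder.AddColumn<int>(
            name: "Age",
            table: "People",
            nullable: false,
            defaultValue: 0);
    }

    protected override void Down(MigrationBuilder migrationBuilder)
    {
        migrationBuilder.DropColumn(name: "Age", table: "People");
    }
}
```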
I just want to show you that you can have multiple of these, and you will, as you make changes: every time you change one of your models — which means one of your tables — you create a new migration in the Package Manager Console for whatever you changed. Now there's one more file we haven't talked about, and that's the PeopleContextModelSnapshot. At the top it says it has been auto-generated, so this is auto-generated C# code, and it's a snapshot of the model. There are some settings in here, and again,
you don't necessarily have to memorize everything that's going on, but it is very important to understand how to read it. Let's start with the Address entity. It says the Id property is ValueGeneratedOnAdd — that's the identity value in SQL that starts at one and counts up; that's what ValueGeneratedOnAdd represents. It has a column type of int — cool — and the annotation saying it's an identity column. Fine; so far it's understandable.
Next we have the City property, which is a string, and it has a column type. Here's where we need to look at the actual SQL column type for City: it's nvarchar(max). If you're not familiar with SQL Server and its column types, let's talk about what nvarchar(max) actually means. First, nvarchar: it's a variable-length Unicode character field. Unicode is a much broader character set than what we normally use in the US — here our character set is basically uppercase A to Z, lowercase a to z, digits, and some special characters — but the rest of the world uses a far wider range of characters. For example, the Arabic languages
don't use the ABC alphabet, and neither do many Asian languages, and there are plenty of other cases where characters go beyond ASCII. ASCII is a very small character set, and that's not enough, so instead we use Unicode, which can represent over a million different characters, including emojis if you want to store emojis. Unicode is much better for storing data that may involve international character sets, or anything non-ASCII. That's a good thing, but it takes up twice as much storage as an ASCII (varchar) column, because in SQL Server Unicode takes two bytes per character while ASCII takes one. So keep in mind that nvarchar uses two bytes per character — which is usually fine, because storage is rarely the limiting factor anymore.
Typically you put a number from 1 to 4,000 in there — nvarchar(n) — and that indicates, roughly, how many characters you can store. It's not exactly one to one, but think of it as characters, so 1 to 4,000 characters fit in an nvarchar(n) column. What if you want more than 4,000 characters? Then you use nvarchar(max), which can hold up to two gigabytes of text — something like a hundred and forty copies of War and Peace — very, very large.
However, SQL Server has a limit on how many bytes a given row can hold: 8,060 bytes. Once you hit 8,060 bytes in a row you have a problem, so what SQL Server does for nvarchar(max) and varchar(max) is this: if those values are large, it stores them off-row on disk instead of in the row itself and puts a pointer to where they are. That makes it easy for SQL Server to stay within the 8,060-bytes-per-row limit while still allowing massive amounts of data per row; however, it causes some problems.
I know I'm going down a bit of a SQL rabbit hole here, but it's important to understand this when working with Entity Framework. If you store a very large value in this field, it gets stored in a different location, which means queries against this row behave differently — in fact, it's much harder to do things like search within an nvarchar(max) value, because the data lives off on disk, so some operations aren't possible or become much more expensive. Now, for a city you're probably never going to store more than 4,000 characters, and in that case SQL Server treats the value according to how many characters you actually entered — put 20 characters in for your city and the storage is roughly equivalent to nvarchar(20) for that row and column. So you say: well, great then, Tim — what's the problem?
Well, there are a couple of problems. The first comes down to indexing. Remember, we saw that EF creates indexes on the Addresses and EmailAddresses tables based on PersonId, and the reason is to make queries faster — or potentially faster; an index only helps if you actually use the column in a query, in a WHERE clause or a join or something of that nature. So what if we query a lot based on City, so that City is in the WHERE clause?
Let's create an index on it, then. Not so fast: because City is nvarchar(max), you can't. A non-clustered index key has a limit of 1,700 bytes, and nvarchar(max) can go far beyond 1,700 bytes, so you can't put an index on this column. That means you may not be able to optimize searches on your columns, because the columns are declared too large to be indexed — even though the data inside them isn't too large. That's problem number one. Problem number two is one where I definitely want to give credit where credit is due.
I was doing some research on this and asked the question: is there a performance issue with nvarchar(max) if we only put, say, 40 characters in it — is it really no different from nvarchar(40)? It turns out that's not correct; they are not treated the same. A writer named Eddie did some work on this: he set up a bunch of columns with a bunch of different sizes — nvarchar from 64 up to 4000 and then max, and the same for varchar up to max — ran some queries, and found something interesting. The storage space used is based on how much data you actually store; however, when SQL Server runs a query, it has to decide how much memory to grant before it receives the data, and for variable-length columns it assumes nvarchar and varchar columns are half full.
Imagine how much memory gets granted for a column that is nvarchar(max) — especially when, coming back to our snapshot, State is nvarchar(max), and so are StreetAddress, ZipCode, and over here EmailAddress: essentially every string property we create is nvarchar(max) by default. How much memory are we going to use? He ran the queries — I'll put a link in the description; go check it out, try it yourself, and see if you get different results — several times, even rebuilding with row-level compression, and the results were the same. For the nvarchar(max) column, the query's memory grant is 272 kilobytes, while up here we have the nvarchar(64) column with the same data — there is no difference in the data in each of these columns — but just because the column type was nvarchar(64),
the reported memory usage is essentially nothing — under 5 kilobytes. Down here, for the exact same query against a different column type, it's 272 kilobytes, of which only 2% is actually used. So we allocated 272 kilobytes of memory on our SQL Server for this query and used 2% of it. That's a real performance issue, because remember, this isn't something that runs once; it runs for everyone who uses your application. Roughly 5 kilobytes or less versus 272 kilobytes is more than 50 times the memory usage for the exact same query on one field — imagine what happens when you have, oh, I don't know, four fields that are all nvarchar(max). What could the performance implications of that be?
So the next thing I want to point out is that, by default, Entity Framework does not create the ideal table layout for you; the tables it creates are expensive. Why? It's not because the Entity Framework team is foolish — quite the opposite:
the people who wrote it are very smart. Think about the string StreetAddress: how much data can you put in that property? As much as you want — there's no limit; we could load a whole JSON document into a string variable. So if EF has to store that in a SQL database, how can it know what size to assume? There's no way to know how much that property will contain, so it has to cover every possibility, and that means nvarchar(max). That's not ideal, so we're going to have to change it. This is one of those out-of-the-box defaults that will burn you and make things less efficient, and one of the things I've taught before that people push back on is the idea that Entity Framework is not efficient when it comes to data access.
I get a lot of pushback on that, and one of the biggest arguments is: well, with EF Core that has really changed; it's really efficient now. My answer is: not automatically. If you write really well-optimized code, you absolutely can make this a genuinely performant system — but that requires an advanced-level user who knows how to diagnose and improve Entity Framework queries, which tells me Entity Framework is not for beginner or intermediate developers; it's for advanced developers with a firm grasp of its ins and outs. Pretty much every demo you'll see online looks like this: create the model, create the migration script, and boom, your application is running and you have data access — but again, nvarchar(max) everywhere is not a good thing. Okay, let's keep reviewing. The PersonId is a nullable int — I don't like that. We have the key on the Id field and indexes on those two fields — that's good. Over here it's the same story: nvarchar(max), our identity column, our nullable integer, and the indexes and key being created. EmailAddresses down here is pretty much the same, and then at the bottom we have the relationship: One on the Person side, Many on the Addresses side — a one-to-many relationship, which is exactly right: a person can have multiple addresses, but each address links to only one person. Same with email addresses. Okay, so we've found some issues in our code, but at least now you have a better understanding of what the Migrations folder contains and how it works.
Okay, moving forward, we're going to update these models and make them better, but first let's go ahead and create the database. We've already created the migrations, so in the Package Manager Console, this time type Update-Database and press Enter. What this does is create our database — which database? The one we specified in our Startup class. The build succeeded, and now it says Done, so we've created our database. It looked at our UI project and said: your database context says UseSqlServer, cool, and the connection string is Default. Open appsettings.json and find Default — here we go — so it created the database called EFDemoDb on the (localdb)\MSSQLLocalDB server using integrated security, meaning whoever is logged into the computer. Let's refresh our databases and look for EFDemoDb: notice we have four tables — People, EmailAddresses, and Addresses, which look familiar, plus one more. There's this __EFMigrationsHistory table, which starts with underscores, and it gives us a history of what has happened in this database: it says, I ran this migration and I ran this migration. Two migrations were run, which corresponds to the two migrations
I have over here, and based on that it knows where it stands — so if I ran Update-Database again, nothing would happen; nothing would change in my database, because it has already run those two migrations and it knows it. If we look at the People table after the Age migration — again, really smart tooling — it has Id, FirstName, LastName, and Age, and still no records. Okay, so now we have our tables, but they're not ideal; the design isn't very good. In fact, if we open the Addresses table in the View Designer, we see nvarchar(max) for every column — really depressing. Let's change that and start modifying the models so this works better.
First, the street address must not be null, so let's add the [Required] attribute — Ctrl+period to add the using statement for System.ComponentModel.DataAnnotations. Now this column will be marked as not nullable; that's a good start. Next we need to keep it from being nvarchar(max), so let's add [MaxLength] and say the maximum length will be, let's say, 200. That handles StreetAddress; let's copy that and go to City, which is also required but should have a max length of 100. State should be more like, I don't know, 50 — the states in my sample data are full state names, not abbreviations, so 50 is probably fine. And then let's do the same for ZipCode.
ZipCode's max length is 10, because we use US notation: 12345-0000, the ZIP+4 syntax — five characters plus four plus the dash makes 10 characters at most. Also, a zip code isn't Unicode; there's no reason to store two bytes per character, because (for this US-based sample) it will only ever be the digits 0 through 9, a space, or a dash — all ASCII characters. So let's change it to varchar. How do you do that? Add a [Column] attribute — Ctrl+period for the using System.ComponentModel.DataAnnotations.Schema — and say TypeName equals varchar with our length of 10. Yes, now you're putting SQL specifics into your C# code. Is that ideal? No, but it's what you have to do if you want your database to be more efficient — and you do, because the leaner your database is, the faster it runs and the faster it serves your users.
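With those annotations in place, the Address model ends up looking roughly like this (a sketch; the lengths are just the values chosen in this walkthrough):

```csharp
using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;

namespace EFDataAccessLibrary.Models
{
    public class Address
    {
        public int Id { get; set; }

        [Required]
        [MaxLength(200)]
        public string StreetAddress { get; set; }

        [Required]
        [MaxLength(100)]
        public string City { get; set; }

        [Required]
        [MaxLength(50)]
        public string State { get; set; }

        // ASCII-only data, so store it as varchar instead of nvarchar.
        [Required]
        [Column(TypeName = "varchar(10)")]
        public string ZipCode { get; set; }
    }
}
```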
Here's one of the things that often trips people up with Entity Framework: when you're using it in development, everything feels very, very responsive. We looked at that web page earlier and talked about a 272-kilobyte memory grant, and you may have laughed at that — come on, Tim, 272 kilobytes? I have 16 gigabytes of memory on my personal development machine. That's absolutely true: with that amount you won't have any problems in development, and your server probably has even more. But how many users do you have — or want to have — in your application? If you start with 5, 10, 15, 20, it's not a problem, so these issues stay below the surface until your application becomes popular or until you hit a production environment with many users — and by many I mean dozens or hundreds; we're not even talking thousands or millions. Once a couple hundred people are using your app, it's a couple hundred people times 272 kilobytes of memory for one field, times four fields — suddenly you're talking about serious amounts of memory, and memory is a limiting factor in applications. So now you've designed the application, it's running, you've built it around Entity Framework, and it has worked well — up to a point — in production. What do you do then?
Is it too late to redesign your app? It's already in production, it's already written, and it has probably grown well beyond a small app by now, because it took a while for usage to really grow. Now you have a problem, because your app sits on a foundation that worked great in development, great in testing, and great with small groups of users — and when it got big is when you ran into trouble, and that's when you have to go back and do all of these optimization steps, hopefully making the application performant enough for its user load, and hope the load doesn't reach the point where other performance issues appear (we'll see some of those in a minute). So with this we now have our Address model set up to be a bit more optimal. Let's go to the Email model and do exactly the same thing: first [Required] — Ctrl+period to add the using — then [MaxLength], say 200 again. We'll leave it as nvarchar; I don't know whether email addresses can contain Unicode characters —
I think they have to be ASCII, but I'm not sure — otherwise we could do the same thing we did with ZipCode and use varchar to halve the memory usage. We'll leave this one as nvarchar, but I wanted at least one varchar example in the project to show you what that modification looks like. Okay, Person: FirstName needs [Required] — paste it in, Ctrl+period for the using — and a max length of, say, 50; same for LastName, 50. Age is just required, and we'll leave the Addresses and EmailAddresses lists alone. Now that I've modified our models, we go back to the Package Manager Console and say Add-Migration AddedValidation and press Enter. Once it builds, it creates a new migration — and jumping into the new migration script, we get a yellow warning: "An operation was scaffolded that may result in data loss." That's not a big deal if you're still in the development process, still building your application. However, suppose this were production, and you had found this performance problem and said: oh, nvarchar(max)? Let's change that to nvarchar(100), and let's change the zip code to varchar(10). That's great — right up until you get that yellow message, and then you start to worry, because you can't just blindly run this migration if it might lose data, given that you previously allowed nvarchar(max) in all of these columns.
Say we set FirstName to a max length of 50: what if there was already a name in there that was 60 characters long? Maybe it should be there, maybe it shouldn't, but whatever is there, you allowed it, because you were using nvarchar(max). What the migration will do is truncate that name to 50 characters — cut off the last 10 and discard them — and you lose data. Likewise, we changed ZipCode to varchar(10) after allowing nvarchar(max), so if a zip code has Unicode characters in it or is 100 characters long, the Unicode characters get replaced and the value gets truncated to 10 characters; everything else is lost. That's why you need to review these scripts before they hit production — otherwise you'll be doing a lot of testing first to see what data you'll lose, and that means a lot more SQL work. So let's look at this migration script: AlterColumn on LastName in the People table — max length is now 50, nullable is now false, and the old type is nvarchar(max). You can see what it was and what it is now, and notice it doesn't state a new type here, because the type is still nvarchar — only the length changed — so it stays nvarchar, just capped at 50. Now let's look down at the zip code — there it is — and this one did change the type... wait, sorry, I scrolled too far; here we go:
the new type is varchar(10), with a max length of 10, and the old type was nvarchar(max). So here you see the change in the type, not just the length. Where you don't see a type listed, it's implied there was no type change — like up above, where the old nvarchar type is implied and only the length changed.
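Those scaffolded changes look roughly like this (a sketch of the AlterColumn calls, using the column names and sizes from this walkthrough):

```csharp
protected override void Up(MigrationBuilder migrationBuilder)
{
    // Length and nullability change only -- the column stays nvarchar.
    migrationBuilder.AlterColumn<string>(
        name: "LastName",
        table: "People",
        maxLength: 50,
        nullable: false,
        oldClrType: typeof(string),
        oldType: "nvarchar(max)",
        oldNullable: true);

    // Type change as well -- from nvarchar(max) down to varchar(10).
    migrationBuilder.AlterColumn<string>(
        name: "ZipCode",
        table: "Addresses",
        type: "varchar(10)",
        maxLength: 10,
        nullable: false,
        oldClrType: typeof(string),
        oldType: "nvarchar(max)",
        oldNullable: true);
}
```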
Again, we have the Up script that makes these modifications and the Down script that reverts them. Now, "revert" sounds great — it sounds like going back to a safe point in time — but no: a rollback changes your database schema, your database architecture, back; it does not bring back data that was deleted. If your migration truncated names from 100 characters down to 50, rolling back won't restore those characters — they're gone. You have to go to your backups — your SQL backups — to get that data back. So be careful when you're thinking about a rollback: it only reverts the schema; it does not restore data. If your migration deletes data, it's gone; if your rollback deletes data, it's gone. Whatever you do, make sure that if you're going to lose data, you know it in advance, and either change what you're doing or accept the loss, because once you run the migration, either way, that data is gone and only your SQL backups can bring it back. Now, in our case we don't care about the data — there's actually nothing in the tables — so we're ready. Run Update-Database, and now if we look at the Addresses table in the designer it says nvarchar(200), nvarchar(100), nvarchar(50), and varchar(10). Great. We still have that nullable int PersonId that we could fix — I'm not sure whether I'll do it in this video. Basically, you'd come into the Address model, add a PersonId property, and make it non-nullable. But I'll leave it for now, because this video is not so much about teaching you how to use Entity Framework as about the best practices for using it, what to consider, and how to work with it.
Okay, now that we have our database set up and ready to go, let's use it. I'm going to collapse a few things so they don't get too messy, and then, in the Pages folder of our web project, in the Index page's page model, we're going to put some code that actually works with Entity Framework, just to show what happens. The first thing I'll do is import some sample data, just to make sure there's something in our Entity Framework database. What I did was use a service that generates test JSON data.
I'm going to bring that generated JSON into our application right now. It came from a JSON generator website, where I defined a schema based on my models and asked for random data: a random first name, last name, and age, with three random addresses and four random email addresses per person, and I asked for 100 of these objects. So now we have a hundred people with all their information — that's just to give us a little sample data; a hundred isn't many, but it's something. So let's create a private method called LoadSampleData. This method will run every time, but it will only load data if the database has no data yet — and that means I need to talk to the database.
I can go up here. I can say give me a people context or that's the context for our database checkpoint. to add our using statement, I'm going to call as DB and control point to create, initialize my field, so now I have dependency injection. I have my people context, I remember again and start we add the database context to our dependency injection and say. it's a people context, so you never asked for people context, you get our context that has access to our three tables, so now download sample data. I can say yes and that is why aunty editors are so popular.
DB people point count equals 0. I just do it. I just queried the people table and said, hey, if you don't have any records, do this right with that line of code. I didn't say open a connection, nothing, I already said, you know what, go to the context, go to the people. table and give me a count and that count is equal to zero, we're doing things so that's really convenient. I'll show you the downside in just a minute, so let's start a string file equal to system, dot, IO, dot, file, dot, read all the generated text. dot JSON and that is this generated JSON file that in the path we go to our properties to make sure that it is actually copying its newest copy, so this will be in our output, our build folder, so now we have read file all text in this variable again remember that a string variable can hold as much as you want so the string variable file has all of our JSON generated.
Then var people = JsonSerializer.Deserialize<List<Person>>(file) - Ctrl+. to add the using for System.Text.Json and the using for our Person model. So we take the JSON from that file and deserialize it into a list of Person objects, and that also deserializes the addresses and email addresses, because Person has a list of addresses and a list of email addresses. With luck we now have 100 objects loaded with all of their addresses and email addresses. Then I can say db.AddRange(people), and don't forget the last step: db.SaveChanges().
I'm not going to use the async versions; I'll call these directly - this is just a quick and easy demo. So what's going to happen is we load this temporary data from the JSON file if there are no people in the database yet (which there aren't right now), deserialize it into a list of Person models, add that list to the context, and save the changes. How does it know where to add these values? Well, it knows what a Person is, and a Person has addresses and email addresses inside it, so it takes care of adding all of that for us. I've effectively queued up inserts for a whole pile of items and saved them all to the database: a hundred inserts for the people - actually more than that, because each person has three addresses and four email addresses, so three hundred plus four hundred plus one hundred - eight hundred inserts, all with the IDs that link the addresses and email addresses back to their person. That's a lot of complicated work, and this was a really simple way to do it.
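Pulled together, the seeding code being described looks roughly like this - a sketch, where the exact class, file, and property names are assumptions based on the walkthrough:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text.Json;
using Microsoft.AspNetCore.Mvc.RazorPages;

public class IndexModel : PageModel
{
    private readonly PeopleContext _db;

    public IndexModel(PeopleContext db)
    {
        _db = db;   // PeopleContext is registered in Startup and supplied by dependency injection
    }

    private void LoadSampleData()
    {
        // Only seed when the People table is empty.
        if (_db.People.Count() == 0)
        {
            string file = File.ReadAllText("generated.json");

            List<Person> people = JsonSerializer.Deserialize<List<Person>>(file);

            _db.AddRange(people);  // the child Addresses and EmailAddresses get tracked too
            _db.SaveChanges();     // pushes roughly 800 inserts to SQL Server
        }
    }
}
```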
Let's call this method - we can call it every time we load the Index page, because that count check protects us. Now let me show you SQL Server Management Studio. Why? Because I want to monitor that database. I'm going to use the standard XEvent Profiler session here, and I'm going to limit it: I only want completed batches, so right-click the batch completed event, say filter by this value, and hit OK. There are a bunch of entries in there right now, so let's refresh and find our EFDemoDB activity. To generate something to look at, I opened a new query window - making sure I was pointed at the right database, EFDemoDB - and ran select * from dbo.People. Run that statement, come back to the profiler, scroll down, and there's the entry. Let's hit stop real quick and zoom in, because the text is tiny, and you can see it: select * from dbo.People. It captured the command I just ran, so with this we can do a little investigation into what is happening in our database.
Now I'm going to start the session again, filtered by database and by sql_batch_completed. Again, this is slightly more advanced SQL Server tooling - maybe you've never seen the XEvent Profiler - but if you want to use Entity Framework or EF Core, it's valuable to know, because when you're diagnosing what EF is doing on your server you need to know what queries it's actually running, and this will tell you. So let's set it up again - it's starting, a little slow and laggy - and run that select query one more time. Now let's change the columns: remove the client name column, and to add one, select a record, look at the detail values down below - that's where the database name lives - right-click it, and choose show column in table. Then I'll add a filter.
Right-click the database name and say filter by this value, so now it's only EFDemoDB. Let's also get rid of these select @@SPID entries: add a filter that says the batch text is not equal to that @@SPID query, the database name is EFDemoDB, and the name is batch completed - that's the only event I want to listen to. Now it's filtering down to just the demo database, we have at least a few less commands, and we could clear the captured data and start over. But first let's filter a bit more, because I want this as clean as possible, with nothing extra if we can help it - so I'm filtering out the entries that are pretty common and keep coming up again and again, saying no, I don't want those, until I get a more reasonable list.
It's still a little long, so let's get rid of this one, and this one, and this one - they're obviously not all the same - and I'll leave the transaction count one alone. That's what I have right now as my captured commands. Next I'm going to start my application. I'll start it off screen because I want to set a breakpoint right here on the count check. What's going to happen is that the first time we load the page, it's going to hit that breakpoint. Come down to the profiler and notice there are no events yet for what we're looking for.
We haven't captured anything yet, so let's step past this count check. That took a couple of seconds, and now if we look here we'll see a new command. Let's zoom in so you can see it: SELECT COUNT(*) FROM People AS p. That's the command Entity Framework just used to find out how many people - how many rows - are in the People table, and it's a really efficient call. Now you might say, don't use COUNT(*), use COUNT(1) - that used to be my default - but SQL has gotten better and smarter, and SELECT COUNT(*) is no different from SELECT COUNT(1); it's just as efficient. We also get some numbers on each entry: logical reads, writes, and duration. Those numbers aren't as intuitive or useful as you'd like on their own - they're somewhat made-up numbers - but they give you a reference point when you compare one command or call to another. Now that we have that, let's hit F5 and let the rest of it run - it goes off and creates the 100 records - and watch what happens in our command window: nothing. I think the reason is that the filter right now is only on batch completed. Let's pause the session and delete that clause for a minute. Now we have some additional commands - some RPC completed events - and the reason is that these are stored procedure calls: Entity Framework is calling a stored procedure.
You might say that's okay, because everyone says stored procedures are great - or at least Tim does - so what is it actually calling? Let's find out. I'll copy the statement - copy should grab the whole value - open my trusty Notepad, and paste it in. Wow. So what are we doing here? We are merging into the People table, followed by a bunch of other commands further down, and these parameter names are not human friendly: the values go @p0, @p1, and so on all the way up to @p299. It's not ideal, but what this does is insert the whole batch, and the way it does it is by building one call with a huge number of parameters and pasting in all the values - 299 of them, or really 300, since the first one is @p0. Three hundred values at a time get sent to SQL Server. But it didn't actually create a stored procedure in the database. Normally we would create a stored procedure to do a bulk insert like this, but EF doesn't know whether this is a one-time thing or something that happens all the time, so instead this is what it does for inserts in general. And that leads to another thing I don't like.
It's calling sp_executesql. This is a stored procedure you should never use. Okay - saying "never" is a little ironic, because yes, there are exceptions to every rule, but I want this to be an absolute exception: don't use it, it's dangerous. Now, Entity Framework isn't stupid and its developers aren't stupid; this is a very smart tool, and like I said before, the tooling is great. So why does it use something Tim says never to use? Here's why it's dangerous. Entity Framework is a closed system, and we've talked before about SQL injection and not allowing users to send their own SQL to our database, because they could inject whatever they want - drop tables, change permissions, cause chaos in our database.
sp_executesql enables exactly that: whoever calls it can pass in whatever string they want, and it will execute that string as a SQL statement. Now, since Entity Framework built this call, it's safe for EF to use sp_executesql - EF knows what the data is, it protects against injection attacks and bad data coming in, and we're building and testing this in what is hopefully a development environment. However, in order for Entity Framework to call sp_executesql at all, the user in your connection string has to be allowed to run it. Remember, right now I'm using a trusted connection, which means my own logged-in credentials, but in production we'll probably use a username and password.
Whichever it is - a logged-in user or the username and password you put in the connection string - those credentials must have the ability to run sp_executesql. What does that mean? It means that if you have a desktop application that uses Entity Framework, you have to get the connection string onto the user's computer somehow - it's almost impossible not to. You can talk about encryption all you want, but the user owns one end of that encryption, so encrypting it is practically worthless. And if the user owns one end of the connection, they have the ability to use those credentials without going through your application. So you've given that user - whoever has your WPF app or your UWP app or whatever - the ability to run sp_executesql against your SQL Server and do whatever they want.
I don't like that - I hate it - so keep in mind that this is a real danger. One of my big dislikes with Entity Framework is that you can't really lock your database down; it's pretty much wide open, and those credentials have practically complete access to your database. I don't like giving a user access to my entire database and saying "please don't mess it up." Obviously your typical end user will have no idea how to dig those credentials out, especially if you obfuscate them, but not everyone is a typical end user. One disgruntled employee, or someone trying to play hacker, can ruin you. Not a fan. Now, for web applications it's a totally different story, because you put the connection string on the server and the server is normally completely locked down from standard user access. Yes, your server admin still has access, but your admin probably already has the keys to your SQL database anyway. So it's a different story on the web, but for desktop apps especially this is more dangerous. If you watched my stored procedures video, I talked about using Dapper with stored procedures and how you can lock database access down to only the stored procedures - no sp_executesql, no select star from a table, not even a list of tables - just the stored procedures your app needs to call. Then, if the user does get your credentials, it's no big deal: they only have access to what the application already gave them access to. They might be able to bypass some of the restrictions your C# code enforces - say, that a name should have five characters - because you probably can't stop them from calling the stored procedure directly, but that's not nearly as serious as being able to drop tables or read personal information that isn't theirs to read.
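For context, the locked-down pattern that video describes looks something like this with Dapper - the stored procedure name, connection handling, and model here are illustrative assumptions, not code from this demo:

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Linq;
using Dapper;

public class PersonData
{
    private readonly string _connectionString;

    public PersonData(string connectionString)
    {
        _connectionString = connectionString;
    }

    public List<Person> GetPeopleByLastName(string lastName)
    {
        using IDbConnection connection = new SqlConnection(_connectionString);

        // Parameterized call to a stored procedure - no dynamic SQL,
        // and the app login only needs EXECUTE rights on this procedure.
        return connection.Query<Person>(
            "dbo.spPerson_GetByLastName",
            new { LastName = lastName },
            commandType: CommandType.StoredProcedure).ToList();
    }
}
```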
Now the good side is that it reduced all of this down to a few calls with a few hundred parameters each. What are those 300 parameters? If you look down here you'll find them: @p0 is 33, @p1 is Strickland, @p2 is Boone, and so on - everything is loaded up, and by doing it this way it's doing a bulk insert, inserting multiple rows at once instead of one row per call. That is more efficient, so there's an efficiency benefit to this, but it requires using sp_executesql. And that was only one of them; we still have the other two calls. There's this one right here: copy the cell, go back to the text file, clear it, paste it in, and it looks similar - a bunch of parameters, in fact parameters starting at 300 and running up to something like 2397 - again a bulk insert, and it looks like this one is for the email addresses; I think the other one was the addresses. Then we have one more, this one here, and pasting it in, this also looks like email addresses being merged - there's still another merge in there. So again, a lot of inserts happening, built up into three calls in total. Now, if I were doing this with Dapper, I'd probably loop through all eight hundred rows and insert them one at a time, so that's 800 calls versus these three calls. Is three calls better than 800 calls? Yes, probably massively better - though not as much as it seems, because either way you're sending roughly the same amount of information over the wire. Actually a bit more with the one-at-a-time approach, because let's be honest: I'd be sending that "insert into Person these four columns" statement over and over - the same string a hundred times for the people, three hundred times for the addresses, and four hundred times for the email addresses. So yes, there's more data going over the wire to create those 800 rows one at a time compared to these three calls. That's the downside on the Dapper side.
With Dapper it's more work for me. I could create a stored procedure that allows bulk inserts, which would greatly improve my efficiency, and do something similar to this without sp_executesql - but that requires me to write the insert statement for the bulk insert and also create a table-valued parameter, or maybe three of them: one for addresses, one for email addresses, and one for people. So yes, there is more work involved in using Dapper here, and in that sense Entity Framework's approach is a little more efficient to put together.
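As a rough sketch of what that extra work looks like - assuming a user-defined table type and a stored procedure that both have to be created in SQL Server first; the names and columns here are made up for illustration:

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using Dapper;

public void BulkInsertPeople(IEnumerable<Person> people, string connectionString)
{
    // Shape the data to match the user-defined table type dbo.PersonTableType.
    var table = new DataTable();
    table.Columns.Add("FirstName", typeof(string));
    table.Columns.Add("LastName", typeof(string));
    table.Columns.Add("Age", typeof(int));

    foreach (var p in people)
    {
        table.Rows.Add(p.FirstName, p.LastName, p.Age);
    }

    using IDbConnection connection = new SqlConnection(connectionString);

    // dbo.spPerson_InsertSet would do a set-based INSERT ... SELECT from the parameter.
    connection.Execute(
        "dbo.spPerson_InsertSet",
        new { people = table.AsTableValuedParameter("dbo.PersonTableType") },
        commandType: CommandType.StoredProcedure);
}
```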
But how often do you actually do bulk inserts in C# applications? Not very often - and if you do, I would suggest writing optimized insert stored procedures with bulk insert capability, and those will be more efficient than even the Entity Framework code. So it's up to you, but in this case, comparing apples to apples with the simple way of doing things, Entity Framework would be more efficient at the insert - at the cost of the security implications of using sp_executesql and requiring that permission. Now all of our data is in there, so let's bring Visual Studio back up, stop right here, and leave LoadSampleData as it is.
Next I'm going to write a very simple query: var people = db.People. On its own that would only load the People table, but I want it to also include the addresses and email addresses, which means going to those other two tables and loading them as well. This is where a tool like Entity Framework really shines, because I can just say .Include - Ctrl+. to add the using statement - .Include(p => p.Addresses), then .Include(p => p.EmailAddresses), and at the end .ToList(). What this will do is go out to the database, look up all the people, join on the addresses and the email addresses, and give me the full result set back as people.
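Here's that query as it would look in the page model (the navigation property names are assumed from the models described earlier):

```csharp
using Microsoft.EntityFrameworkCore;   // for the Include extension method

var people = _db.People
    .Include(p => p.Addresses)
    .Include(p => p.EmailAddresses)
    .ToList();   // ToList() is what actually sends the query to SQL Server
```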
I'm not doing any filtering here; I'm just saying give me every one of the hundred people with their associated addresses and email addresses. The .ToList() at the end is where the query actually executes - everything before it is just building up a query, but ToList says okay, now we actually want to pull this down, we actually want the data on the client side. Again, the tooling is great and it makes the query about as efficient as it can. And if you were to add a Where clause here, that Where would execute on the SQL Server side, not the C# side, even though it looks like C# - I'm writing LINQ, but the LINQ works by building a query for the server. For now, though, let's start with this really simple query.
Now let's run it. I'll put a breakpoint right after the query and start the app again. It hits the people count check first; it sees there's more than one record, so it doesn't run the seed again. So once more we see that SELECT COUNT call - zoom in, there's the count - and we don't insert again because it isn't necessary; we already have more than zero records. But now we also get this full batch - and batch completed is useful because that's where it gives us all of the metrics. I'll copy this query and paste it into Notepad in just a minute, but first come down here and look at the numbers: the duration is 89,533, which seems high, especially compared to our earlier select star which was only around 4,790 - again, kind of a made-up number, but it gives us a comparison, and this is a lot more work. Logical reads are 331, and the row count is one thousand two hundred and four. What is that row count? That's the number of rows that came back from SQL Server to our C# Entity Framework code. And if you've been doing the math you'll say, no Tim, it returned 100 people - so why does it say 1,204? Because it really did return 1,204 rows. Before we dig into that, let's look at the people themselves and make sure we actually got the data we need. Pull up the first one: first name Strickland, last name Boone, the age is there, and there are four email addresses - expand one and there's an email address, great. The addresses - we have three of them; look at the cities: Martinsville, Waterford, and Waikele, in the Federated States of Micronesia - okay, not in the US, but fine. It looks like it's all the right data, and it was very easy to get.
I mean, that's all I had to do, and I got back 100 records with fully populated models - that's awesome, and it honestly beats Dapper on convenience. With Dapper it would take a few different calls to populate all of these values: I'd probably make one call for the people, another for the addresses, another for the email addresses, then link them together using LINQ and hand the data back. So three different calls to get all the records - possibly fewer with more complicated queries, and I could probably have focused it a little better - but three calls plus a bit of work on the C# side before I'd get to this point, whereas this is only four lines of code.
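For comparison, a sketch of that Dapper approach - three result sets (batched here with QueryMultiple, though they could just as well be three separate calls) stitched together in C#. Table names, foreign-key columns, and models are assumptions:

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Linq;
using Dapper;

public List<Person> GetAllPeople(string connectionString)
{
    using IDbConnection connection = new SqlConnection(connectionString);

    using var results = connection.QueryMultiple(
        "select * from dbo.People; select * from dbo.Addresses; select * from dbo.EmailAddresses;");

    var people = results.Read<Person>().ToList();
    var addresses = results.Read<Address>().ToList();
    var emails = results.Read<EmailAddress>().ToList();

    // Stitch the child records onto their parent in C#.
    foreach (var person in people)
    {
        person.Addresses = addresses.Where(a => a.PersonId == person.Id).ToList();
        person.EmailAddresses = emails.Where(e => e.PersonId == person.Id).ToList();
    }

    return people;
}
```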
Let's stop here and talk about this query. I'm going to copy it, open a new query window - oops, wrong database, switch to EFDemoDB - and paste it in. It's all on one line, which is not great for readability, so let's format it. Here's our query: a SELECT with brackets around everything - lots of brackets, which is actually recommended - asking for Id, Age, FirstName, LastName, then Id, City, PersonId, State, StreetAddress, and so on: every column from every table, FROM People AS p. It even aliased the table as p to keep things short. I told you this tooling is great.
Now, if you're not a SQL person: a LEFT JOIN says give me everything from the People table, and where you find matches in the Addresses table, give me those values too; otherwise leave them null. It's joined on the person's Id equaling the address's PersonId, and there's a second LEFT JOIN doing the same thing for the email addresses. So, do you know what the problem with this query is? Here's a little hint: for one person, how many addresses do they have? Is it one, or is it more?
It's more than one - it's a one-to-many relationship, so there are many addresses for one person. So what happens when you join on the person's Id in the address table? SQL says, okay, Id number one, go to the Addresses table - oh, I have three addresses that match that person Id. How does it know which one to put on the row? It doesn't: it duplicates the person, listing them three times with a different address on each row. The same thing happens with the email addresses. But what happens now that we have multiple addresses and multiple email addresses at the same time?
How does that all work out? Let's run it and find out: about 1,200 rows. Zoom in: we have Strickland Boone - and there are 12 records for Strickland Boone. His Id is returned 12 times, his age 12 times, his first name 12 times, his last name 12 times. Then we have one of his cities four times, the next city four times, and the other city four times - and the same for state, person Id, street address, and zip code; all duplicated. Why four times each? Different email addresses: Strickland has four email addresses, and those four are lined up against each of his three addresses - four, and four, and four. That's what happens as you add more one-to-many relationships to this left join: it creates duplicate records.
Now, you didn't see any of this in C#, so what happened? What Entity Framework did was say: I probably can't write a more efficient single query than this, so I'll run it, get all this information back knowing there will be tons of duplicates, and then strip the duplicates out. It goes: oh, Strickland Boone - I have that person 12 times, that's probably one record; Boone, 33 years old, one Id - create that object once. I see three distinct addresses - create one for each. Four distinct email addresses - one for each. And so on: it takes all of that data and compresses it down into 100 records with their child objects attached. Now, if I made separate queries - give me all the people, all the addresses, and all the email addresses, three different queries, like I said I'd do with Dapper - I would get 100 people, 300 addresses, and 400 email addresses: 800 rows in total. Entity Framework returned 1,200 rows. You could say that's not a big deal - it's only 50% more - but it's worse than that, because when you query the people on their own,
you get each person's record once - Id, age, first name, last name - whereas Entity Framework transmitted those values twelve times each over the network. So even though it only transmitted 400 more rows, it carried a lot more data: in my little example we went from 800 smaller rows to 1,200 rows that are each fully populated - notice, none of them are sparse; they're all full. That's a lot more data going over the wire. Now, again, this is a small example; it still performs fine, it runs fast, and it isn't hurting my network, because 1,200 rows is nothing. But imagine what happens when you have five or six different left joins in there, because each one multiplies against all the others. Let's imagine that for a minute.
Suppose I keep notes on people: every time I meet with a person I attach a note to their account - "met with Bob, we had a great talk, you're doing great" - and the next time, "Bob is really struggling with if statements in C#" - and over time I have hundreds of notes per person. Now I do a quick query that says give me all the people, and let's go ahead and attach their notes too. That could be thousands, tens of thousands, even hundreds of thousands of rows for just a few hundred people, and in a real database that's a real performance problem. Now again, you can say, yes Tim, but there are ways around it - and there are. If we go back to our query: if I don't need the addresses and the email addresses, I can comment out those Includes. Run it now and look at the list once it loads: it has no addresses or email addresses - the address count is zero, the email address count is zero; it didn't load them. And if we go look at the profiler at what just ran, the row count is one hundred: it only pulled the hundred rows from the People table, no joins, which is much more efficient. So if you don't need those addresses and email addresses, don't include them. That's a big deal with Entity Framework - if you don't know that, and you just say "I want the fully populated object" every time, that's a big performance hit.
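In code, that's nothing more than dropping the Include calls - sketched here the way the demo describes it, by commenting them out:

```csharp
// Only the People table is queried now: 100 rows instead of 1,204.
var people = _db.People
    //.Include(p => p.Addresses)
    //.Include(p => p.EmailAddresses)
    .ToList();
```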
So let's talk about the difference. One of the reasons I'm such a big fan of Dapper, performance-wise, is that it puts exactly what you're doing in front of you every time. The only way to write a query that returns 1,200 rows is to do it intentionally - you have to say yes, this is what I want. With Entity Framework it's kind of hidden: you don't see what's going on unless you really understand how Entity Framework builds its queries, and this is a very, very simple example. So again, this boils down to: it will work fine in development, and it will keep working fine until you put it against a production database, in a production environment, with multiple people hitting it. Even if you take a copy of your production database, clean out the sensitive information, and use that as your development database - which is a good thing to do - so that you have all those joins and a realistic idea of what these queries are going to do, in development you're probably only running one query at a time, as one person at a time. If this query is on the index page of your site, though - maybe your home page lists all the users of the system and you can click on one to get more detail - then every single visitor is running this query. This is why you see big load times on web applications. Say you're a university or a school with a registration period: registration opens on Friday, and you'd better be there to get the class you want. Guess what's going to happen?
Your website gets hammered all at once. For most of the year only a small percentage of your students are on the site at any given time, but during registration you have nearly all of them at once: if you have 10,000 students, you might normally see three or four hundred concurrent users, but during registration it might be 8,000 at the same time. If your page is even a little heavy - like it is with those two Include lines uncommented - that will crash your website, or at least greatly increase the likelihood that it falls over. Add nvarchar(max) columns on top of that and you'll really run into memory issues. This is where you need to understand how Entity Framework
is building your migrations and how it builds these SQL scripts, to make sure your application holds up. I can't emphasize it enough: all of this will work fine in development. It's only when it reaches production that you're going to have a problem. And say you're that college - I used to work at a college - if my web application crashes during registration, I don't have a year to fix it; I have minutes. I have to get registration working, or we're losing thousands of dollars, causing massive problems, jamming the help lines, and frustrating students - which may mean they don't come back. There are a lot of repercussions when your software fails at a critical moment. That's a big, career-threatening problem. So plan for those cases; don't just plan for the fact that it works great in development.
I've really only scratched the surface of how to use Entity Framework and what to watch out for - there's a lot more to cover - but I think we've gone far enough for this video. I do want to show you one more thing, which is a tooling improvement - again, the tooling is awesome. This was improved in EF Core 3.0, and we're on 3.1 now. Let's say I have a method: private bool PassesAge(int age), which returns age >= 18 && age <= 65 - basically, is the age between 18 and 65; true if it is, false if not. That's PassesAge.
Now, that check could go into a query parameter, but I'm going to use it in a Where clause: .Where(x => PassesAge(x.Age)) - I want to limit this list of people to only the people whose age passes, between 18 and 65. I could instead have written .Where(x => x.Age >= 18 && x.Age <= 65) inline, but the method version seems a little easier to read: instead of putting the comparison inline, we've broken it out into a nice method. Nice for C#, that is - not for Entity Framework - and here's why. That Where clause has to become T-SQL and run on SQL Server. Remember, everything before the ToList is trying to run on SQL Server, and the ToList is what actually brings back the results. But PassesAge is C# code - how would you run C# code on the SQL server? You can't. So what would happen is what happened in Entity Framework
6, and this is actually a big problem. You might think those two versions are equivalent - they're not. EF 6 would say, "oh, I have to run C# code as part of this query," so it would run the SQL part of the query, download the results, and then run this method on the client to filter the list. So if you were expecting to retrieve, say, 50 of the 100 users, the SQL call would still download all 100 users and only then filter them. Instead of roughly 600 rows for those 50 users, you're downloading all 1,200 rows just to keep 50 of them, because you're doing the filtering after the download and throwing half the data away. And Entity Framework 6 wouldn't tell you this was a problem. Let's see what happens now - at the moment Visual Studio isn't flagging it as a problem at design time.
I think there is a way to turn on a warning so it tells you at design time - maybe under Tools, Options... Entity Framework... no; premium features... no, it's not there; maybe it's in a preview version of Visual Studio, and I'm not on a preview right now - this is 16.4.2. So currently it's not telling me there's a problem here, but let's run it and see what happens. I'll put a breakpoint right at the end - and before the breakpoint is even hit: boom. "The LINQ expression could not be translated. Either rewrite the query in a form that can be translated, or switch to client evaluation explicitly by inserting a call to AsEnumerable, AsAsyncEnumerable, ToList, or ToListAsync." So what is that?
"This cannot be translated" - and as a user you may not know what that means. Okay, let's do what it suggests: I'll move things around and put a ToList before the Where clause. Run it again with my breakpoint in place - and it worked like a charm. Come over to the profiler and we have our batches: the select count as before, then a single batch, and notice the row count down here is 1,200. So it did exactly what I said it would do: it downloaded all of the data and then applied the Where clause on the client. If we expand the people collection, it looks like there are around 55 items in it, and it's an IEnumerable now.
I can call ToList on it - I could have put one at the end - and it has 56 people in it, entries zero through 55. That's different from the 100 I downloaded, but it's what I expected: roughly half of them should fall in that age range, so I got the values I was after. And here's the thing: if I didn't look at the SQL and see what it was doing, I would never know I had just written a horribly inefficient query. I even did exactly what the exception message said to do, and that's what made it horribly inefficient. Okay, now let's get rid of that ToList in the middle, uncomment the two inline-comparison lines, and comment out the bad version. Run it again - with the ToList at the end this time - and I get my count: the count is 56.
Great - and if I go over to the profiler, come down here and zoom in on the row count, it's 673. That's very different from 1,200 - almost half - and why? Because we only got 56 people back, and those 56 people each came with their four email addresses and three addresses, so it's fewer rows than for the full hundred. So by changing how I wrote my Where clause - by not breaking it out into a method - I get a more efficient query. Just because the exception says "put a ToList before this Where clause" doesn't mean you should - not if you can avoid it - because you're pulling all of those records down to C# and then doing the work there. If you can help it, let SQL do what SQL is best at: storing and retrieving data. Yes, C# can filter, and yes, there are times when C# needs to do some filtering of its own, but most of the filtering and querying work should be done on the SQL server. So if you can write your query this way instead of calling a separate method, it's important that you do, because then the filter runs on the server.
In fact, let's go look at that WHERE clause real quick. I'll copy the captured query, open Notepad again, paste it in, and lay it out: there's our SELECT, our FROM, our LEFT JOIN, another LEFT JOIN, then our WHERE clause, and the ORDER BY. The WHERE clause says where the age is greater than or equal to 18 and the age is less than or equal to 65. So it took our C# and translated it into a T-SQL query that runs on the server, and we didn't have to download all those records. So one key best practice: when you're building your LINQ statements, don't
call C# methods inside them - put the code directly in the Where clause if possible, so it can run on the server. At least EF Core now tells you there's a problem; if you try this in Entity Framework 6 it won't tell you anything - it will just run the code by downloading the data first and then filtering with your method. That's where you really need to understand how to look at these queries, see what's happening, and make sure that's the query you actually want to run. So, in summary: you need to know how your migrations work and make sure what they generate is correct and efficient, which means decorating your models appropriately - changing nvarchar to varchar where it makes sense, and setting maximum lengths where they're needed. That's the design side. The next thing is to make sure that when you write your queries you don't Include related data unless it's absolutely necessary, and that you know the implications and cost of that Include.
You should also make sure you don't call C# methods in your queries in a way that forces the filtering to happen after you've downloaded the data - and keep in mind that if you only filter after the download, you're downloading much bigger chunks than you need. Filter before you download whenever possible. You also need to know how to use the XEvent Profiler or something similar on SQL Server - some way of seeing what commands are being run against your server, how efficient they are, and what you can do to improve them - because that is very, very important. Now, as we wrap up, let's talk about why you would use Entity Framework - what the benefits are. (Whether you call it Entity Framework or EF Core is up to you.) The first benefit I hear a lot is faster development speed. I get it - this was obviously super quick to build; I got all my models populated and it just worked. It was quick to develop. But it was slow in production.
Yes, there are things you can do to improve the performance, but that means writing your own stored procedures, which defeats the purpose, or skipping the Includes and so on, which again takes away the benefits you got from Entity Framework - or you just live with the fact that, yes, it's more data, and hope you never need that extra performance. So is development faster? Absolutely - it's fast to get Entity Framework up and running. However, as we've seen, it slows down a lot once you have to make sure it performs. Yes, you can shoot yourself in the foot quickly, but maybe you shouldn't shoot yourself in the foot. Maybe you should check what your creation scripts and table layout look like and make sure they're correct - that takes time. Maybe you should decorate your objects and make sure they have the right attributes - that takes time too. And now you've lost some of the benefit you got from Entity Framework, because you're spending it on getting the performance right. So: faster development speed, yes - but if you want good performance, that development-speed advantage shrinks a lot.
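As a concrete illustration of the kind of model decoration being talked about - without attributes like these, string properties default to nvarchar(max); the exact attributes and lengths below are assumptions, not the demo's actual model:

```csharp
using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;

public class Address
{
    public int Id { get; set; }

    [Required]
    [MaxLength(200)]                      // nvarchar(200) instead of nvarchar(max)
    public string StreetAddress { get; set; }

    [MaxLength(100)]
    public string City { get; set; }

    [Column(TypeName = "varchar(10)")]    // varchar instead of the default nvarchar
    public string ZipCode { get; set; }
}
```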
The next benefit I hear is that you don't have to know SQL. Well, I think I've shown in this video that that's not really true, because if you don't know SQL, you're shooting yourself in the foot. Would you have known to come in here, and would you have known what nvarchar(max) meant - or would you just have known "it works"? Would you have understood what this table layout means, and known that 272 kilobytes is pretty bad and we should make a change - stop using nvarchar(max) and use nvarchar(64) or nvarchar(256) instead?
Probably not - you wouldn't even know how, because you wouldn't understand the different data types on SQL Server. If you don't know SQL, would you know how to use the XEvent Profiler to see how many rows each query is pulling down, or how to optimize a query by looking at that output and adjusting your C#? You wouldn't know how to do that either. So if you don't know SQL, you really shouldn't be using Entity Framework, because you'll get burned. Now again: if you're building small apps, apps that will never go to production, demos, or apps for very small businesses with very few users, you'll probably never see a problem with Entity Framework. But if you ever intend for your app to grow to any real size, it will be a problem and you'll have to optimize - which means you'll have to learn SQL, and your development speed goes down.
The same goes for our scripts. Remember the migration we scaffolded earlier: it immediately warned that an operation was scaffolded that may result in data loss and that we should review the migration for accuracy. How would you review that if you don't know SQL? How do you even know what's happening? Evaluating that data loss means running some queries: we're truncating these columns - is there data in those columns longer than the new length? You'd have to go into SQL Server Management Studio, run some queries, figure out what to do with the data, and probably make sure you have good backups.
Have you done any of that? Probably not, if you don't know SQL - and even if you have, you've now slowed your development speed, because you have to be responsible with your data. So I don't really see those first two benefits as being that valuable. Entity Framework is absolutely faster to develop with, and you don't need to know SQL to get started - but as we've seen, if both of those are true, then your database is in danger, and I don't like building databases that are in danger. Production databases can hold thousands, millions, even billions of records. If you're not careful with how you query, and you don't optimize the performance of those queries, you will slow your application down, and it will happen in production at the worst possible time - whether you're an e-commerce site whose big sale is exactly when the site crashes, or a university that falls over during registration, or whatever the equivalent is in your industry. And when it happens, you're on the hook and there's no quick fix. Now, you might say, "Tim, I've been doing this for 20 years and I've never had a big problem." That means one of two things: either your apps aren't that big, or you're throwing far more resources at them than you should need. Either you aren't running tons of queries and retrieving a lot of data, or you're allocating a lot of extra server resources to keep performance up. Neither of those is terribly efficient: either the app isn't doing much, or you're paying for resources you don't actually need.
Now you've heard me say that I like Dapper and if you've seen any videos. I prefer dapper, okay, so let's count the benefits of dapper goodness, there we go, so benefit number one is faster in production, that will be just one case, okay, dapper is practically as fast as do dotnet. a do dotnet is pretty much bare metal calls to the database, so you're not going to be faster than graceful in almost any case, now there may be cases where you get the entity framework as fast or practically as fast, but again, that's if you slow down development and don't make any sequels and you know. how to optimize your entity framework or if you are an advanced user of entity framework core again if you are an advanced user of entity framework in core and you know how to optimize it, use absolutely any framework or is it just that people don't does Kamala door knowing the core of the entity framework there are huge books there are tons of video content there are many blog posts on how to improve the performance of the entity framework how to use the entity framework in general and how to work with this within your application the reason why there is a lot to know if you know it, if you are excellent at it, if you are an expert, great, you will do a great job and achieve elegant performance.
Yes, development will be slower, but the result will be faster in production, and you'll know you've optimized for production - so you don't have to worry as much about whatever scale your app reaches, because you've already optimized the database side of that scaling up front. Development speed is paid once; production speed is paid every day. I value production speed far more than my development speed. I love how quickly you can develop with EF Core, but I want faster production speed for my application.
Benefit number two: it's easier to work with - for a SQL developer. With Dapper, in about four lines of code you can call a stored procedure or parameterized SQL and get your data back very quickly. I have a little snippet I created that turns it into one line of code - one line, and I get data from the database, load it into a model, and have my list of models. That does mean I have to write a stored procedure or a T-SQL statement and pass it in - yes, that means I have to know SQL - but with that, I'm faster in production and I know exactly what data is coming back.
I know exactly what my query is doing performance-wise, and I can change it very, very easily. If you have to make this Entity Framework query more efficient, what do you do? Seriously - you'd have to rework how it's built, make some odd changes to the LINQ just to coax out different SQL, and that's not ideal, because you have to dig really deep into LINQ and Entity Framework to understand how it creates queries before you can change them. If it's a stored procedure, I just change the T-SQL inside it - simple as that. Okay, the third benefit I see: Dapper is easy to design for loose coupling.
Notice that I put my Entity Framework code in a class library, but it still fought me: if we look at the web project's dependencies and NuGet packages, I have the Entity Framework Core tools installed there, and in Startup I'm using Microsoft.EntityFrameworkCore, UseSqlServer, and the database context directly. My front end knows that I'm using Entity Framework and knows about my data access details, which is rather less than decoupled for my taste. With Dapper I can create a class library that just hands models back - the caller has no idea how it got them - and that is much more loosely coupled as an application. It means I can build a data access library and hand it to different applications more easily, whereas it takes a somewhat more involved setup with Entity Framework to get the same separation - just a little more complex. Now, this is not an EF Core versus Dapper video.
I'm just pointing out the highlights of why I still focus more on Dapper than on EF Core. The other part is that this channel is focused on helping developers learn C#. I do teach Entity Framework Core in my C# foundation course series, but I wait until module eight, and only after I've taught data access directly using Dapper for SQL Server, SQLite, MongoDB, and others. Then I teach EF Core, walk through some of these same examples, and talk about why we don't reach for this tool all the time - because it can be dangerous - and that's what I illustrate in the course: have you ever seen those big earth movers?
They're really big machines, and that's Entity Framework Core: a really powerful, incredible machine. But to get into the cockpit of a machine like that and operate it, you don't just decide one day to do it. You get trained on it, you get licensed, you have an instructor who teaches you how to use it, and it takes time before you can climb into that cab and drive. That's what EF Core is - that big earth mover. It's dangerous to use if you don't have much experience with it, it can really bite you, and the worst part is that the danger doesn't show up until it's too late. So that's why I say Entity Framework is a great tool.
My problem is with who uses it: this is an advanced tool that, in my opinion, only advanced Entity Framework Core developers should be using. Again, I'm not trying to stop you - I'm trying to protect you. For myself, I'd rather have that control with Dapper. Yes, that means being a SQL developer - and if you're not, I have a course that teaches you how to become one; it's not difficult - but that way I can see exactly what's going on, make changes directly, and design for speed in production and for loose coupling. Really, my Dapper code is one line to write or read data and then a call into SQL, so it's not much slower to develop with than Entity Framework, and the results are faster in production, easier to work with, and more loosely coupled. So that's my opinion on Entity Framework and those are my problems with it. As for best practices if you do use it - the basics, and the pitfalls to avoid: really understand how your SQL works, really understand how Entity Framework creates your SQL, and know how to diagnose what SQL is being called and how to modify it.
Okay - if you do use Entity Framework, I definitely encourage you to become an expert in it in practice. Pay close attention to what's going on, watch the statements it generates, monitor them, understand the performance trade-offs of everything you do, and know how to fix the performance problems that are going to come up. With that said, I'd love to hear your thoughts in the comments. I'd love to know what you think I missed as far as performance, issues with Entity Framework, or benefits I didn't mention. Let me know what you think about using Dapper versus Entity Framework - and again, please keep it civil.
This is here for us to grow - I love that we're all getting better. Again, I'm not a big Entity Framework person, so I'm sure I missed things. If you want to say "hey, this would improve your query" or improve something else, that's great; I'd really appreciate it. It also proves my point a little, because if it isn't obvious that something would improve the query, it's not something a beginner or intermediate developer is going to do. I would genuinely love for you to become an expert in Entity Framework Core, because I'd love for you to be able to use the power of this thing.
It is a pretty powerful tool, and it's pretty easy to get started with, so if you have suggestions, that would be great - I'd love to hear them. Now, I also have a blog post; let me bring it up. I wrote it back in August - I rewrote it from a post on my old site - and it's what I think about whether you should use Entity Framework; my answer is usually no. In it I point out that "real programmers" often think I'm an idiot and DBAs often applaud.
Well, I've been in front of both audiences. I've spoken at conferences in front of developers who ask, "why don't you use Entity Framework? It's just the obvious, simple choice," and I've spoken at conferences in front of database developers who say, "oh, great, thank you - please don't use Entity Framework." There are two sides, but there's a lot more nuance than either side wants to admit, so in the post I talk about those nuances. Give it a read - I think it raises some good questions, like: does your team understand how to use Entity Framework well?
That's more than just knowing how to write LINQ queries. Has your team diagnosed problems where code or queries didn't perform correctly? Do you have to protect client credentials? If you're shipping WPF, WinForms, or console applications, you're potentially giving away your credentials, which can be a real problem - and for comparison, there's that stored procedures video where I talk about database security and how you can lock the database down even without encrypting your credentials, since, as I said, encryption doesn't buy a desktop app much. I also talk about wiring up SQL properly using Dapper. I'll link the article in the description below; read it if you want to learn more.
I didn't cover all of that in this video, but it could also be useful to you. So again, give me your opinion - I love talking about this with you, and I want to hear your take. Thanks for watching, I appreciate you putting up with this longer video, and as always, I'm Tim Corey.
