Database design and ASP.NET application performance

Posted: (EET/GMT+2)


Most of the web applications I work with use a traditional SQL Server database to store data. In such applications, performance is greatly affected by database design. In school, we're taught to normalize our databases so that each piece of information is stored only once and the contents of each table are clearly defined.

However, when it comes to query (database read) performance (and, in turn, application performance as perceived by the user), such normalization can be a major cause of slow query execution. This happens when you need to find information from multiple tables at once. In practice, table joins are a relatively expensive operation in any SQL database, including those hosted by SQL Server.

Denormalization is the process of combining data so that all necessary information can be found in a single table. To implement it properly, you also need to know the (ASP.NET) application design: what data is needed, and how it's processed inside the application.

Let's take an example. Say you were using your C# skills and Entity Framework to write a web application where the user can search for customer orders and the products included in them with a single query. In a typical implementation, you'd have tables for orders, order lines, and products. To search across all of these at once, you'd need to join the three tables in a single query.
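To make the join concrete, here's a minimal sketch of that normalized schema and the three-table search query. The table and column names are hypothetical, and SQLite (via Python's sqlite3 module) stands in for SQL Server so the example is self-contained; the same SQL shape applies either way.

```python
import sqlite3

# Illustrative normalized schema (names are hypothetical);
# SQLite stands in for SQL Server to keep the sketch runnable.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE Products   (ProductId INTEGER PRIMARY KEY, Name TEXT);
    CREATE TABLE Orders     (OrderId INTEGER PRIMARY KEY, Customer TEXT);
    CREATE TABLE OrderLines (OrderId INTEGER REFERENCES Orders,
                             ProductId INTEGER REFERENCES Products,
                             Quantity INTEGER);
""")
conn.execute("INSERT INTO Products VALUES (1, 'Widget'), (2, 'Gadget')")
conn.execute("INSERT INTO Orders VALUES (10, 'Acme Corp')")
conn.execute("INSERT INTO OrderLines VALUES (10, 1, 5), (10, 2, 2)")

# Searching orders and their products requires joining all three tables.
rows = conn.execute("""
    SELECT o.OrderId, o.Customer, p.Name, l.Quantity
    FROM Orders o
    JOIN OrderLines l ON l.OrderId = o.OrderId
    JOIN Products p   ON p.ProductId = l.ProductId
    WHERE p.Name LIKE 'Wid%'
""").fetchall()
print(rows)  # → [(10, 'Acme Corp', 'Widget', 5)]
```

Every matching row has to be assembled from all three tables, and on a large database the join cost dominates the query.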

Depending on the size of your database, such a query can take tens of seconds to execute, which probably isn't good enough for the user. If instead you combine all the searchable fields into a single table (note: a real table, not a view), you can run the query against that one table, possibly giving you blazing performance.
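Continuing the sketch above, the same search against a hypothetical denormalized table might look like this: one row per order line, with the order and product columns copied in so no join is needed at query time.

```python
import sqlite3

# Hypothetical denormalized search table: the order and product
# fields are duplicated into each row so the search touches
# a single table (and can be served by a single index).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE OrderSearch (
        OrderId     INTEGER,
        Customer    TEXT,
        ProductName TEXT,
        Quantity    INTEGER
    )
""")
conn.execute("INSERT INTO OrderSearch VALUES (10, 'Acme Corp', 'Widget', 5)")
conn.execute("INSERT INTO OrderSearch VALUES (10, 'Acme Corp', 'Gadget', 2)")

# The same search, now with no joins at all.
rows = conn.execute("""
    SELECT OrderId, Customer, ProductName, Quantity
    FROM OrderSearch
    WHERE ProductName LIKE 'Wid%'
""").fetchall()
print(rows)  # → [(10, 'Acme Corp', 'Widget', 5)]
```

The result is identical to the joined query, but the engine only has to scan (or index-seek) one table.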

The only drawback is that it then becomes your responsibility to keep the data accurate in two places: the original normalized tables and the denormalized search table.
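One way (of several) to handle that synchronization is a database trigger that writes into the search table whenever an order line is inserted; SQL Server also offers indexed views and application-level dual writes for the same purpose. A minimal trigger sketch, again using SQLite and the hypothetical table names from above:

```python
import sqlite3

# Keep the denormalized copy in sync with a trigger: every insert
# into OrderLines also produces a matching OrderSearch row.
# (Names are hypothetical; SQL Server triggers work similarly.)
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE Products   (ProductId INTEGER PRIMARY KEY, Name TEXT);
    CREATE TABLE Orders     (OrderId INTEGER PRIMARY KEY, Customer TEXT);
    CREATE TABLE OrderLines (OrderId INTEGER, ProductId INTEGER, Quantity INTEGER);
    CREATE TABLE OrderSearch (OrderId INTEGER, Customer TEXT,
                              ProductName TEXT, Quantity INTEGER);

    CREATE TRIGGER SyncSearch AFTER INSERT ON OrderLines
    BEGIN
        INSERT INTO OrderSearch
        SELECT o.OrderId, o.Customer, p.Name, NEW.Quantity
        FROM Orders o, Products p
        WHERE o.OrderId = NEW.OrderId AND p.ProductId = NEW.ProductId;
    END;

    INSERT INTO Products VALUES (1, 'Widget');
    INSERT INTO Orders VALUES (10, 'Acme Corp');
    INSERT INTO OrderLines VALUES (10, 1, 5);
""")
rows = conn.execute("SELECT * FROM OrderSearch").fetchall()
print(rows)  # → [(10, 'Acme Corp', 'Widget', 5)]
```

Note that a complete solution also needs UPDATE and DELETE triggers (or equivalent application logic), since changes to orders and products must propagate to the search table too.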

Happy coding!