
Fat Vox

Normalization and Denormalization of a Database

by fat vox

Certain steps must be taken to ensure that a database operates at its full potential. The developer must decide on the right level of normalization. Once a database is fully normalized, denormalization can be used to improve performance. Denormalization has drawbacks, but applied carefully it can make a database faster to read and easier to query.

Developers ask several questions to determine the appropriate level of normalization for a database. For example:

  • What data should be stored in the database?
  • How will users access the database?
  • What privileges do users require?
  • How should the data be grouped?
  • Which information is accessed most often?
  • How is the information in the database related?
  • What measures will keep the data accurate?

Normalization, a concept developed by E. F. Codd, is the method of eliminating redundancy in the fields and tables of a database. The normalization process usually involves dividing larger tables into smaller tables. Once the smaller tables are established, the relationships between them are defined.

Each level of normalization has a different purpose. The goal of first normal form (1NF) is to divide the base data into logical units called tables, with a single atomic value in each field; once each table has been designed, a primary key is assigned to it. Second normal form (2NF) moves data that depends on only part of the primary key into another table. Third normal form (3NF) removes data that does not depend directly on the primary key. After the database is normalized to the appropriate level, a developer can denormalize data as needed.
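The table-splitting described above can be sketched with Python's built-in sqlite3 module. The customers/orders schema below is hypothetical, invented only for illustration: the flat table repeats each customer's city on every order (a 3NF violation, since city depends on the customer, not the order), and normalization moves those facts into their own table.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Unnormalized: customer details repeat on every order row, and
# customer_city depends on customer_name rather than on order_id.
cur.execute("""CREATE TABLE orders_flat (
    order_id INTEGER PRIMARY KEY,
    customer_name TEXT,
    customer_city TEXT,
    product TEXT)""")
cur.executemany(
    "INSERT INTO orders_flat VALUES (?, ?, ?, ?)",
    [(1, "Alice", "Boston", "Lamp"),
     (2, "Alice", "Boston", "Desk"),
     (3, "Bob", "Austin", "Chair")])

# Normalized: customer facts live in their own table, and orders
# reference them through a primary key.
cur.execute("""CREATE TABLE customers (
    customer_id INTEGER PRIMARY KEY,
    name TEXT UNIQUE,
    city TEXT)""")
cur.execute("""CREATE TABLE orders (
    order_id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customers(customer_id),
    product TEXT)""")
cur.execute("""INSERT INTO customers (name, city)
               SELECT DISTINCT customer_name, customer_city FROM orders_flat""")
cur.execute("""INSERT INTO orders (order_id, customer_id, product)
               SELECT f.order_id, c.customer_id, f.product
               FROM orders_flat f JOIN customers c ON c.name = f.customer_name""")

# Each customer's city is now stored exactly once.
print(cur.execute("SELECT COUNT(*) FROM customers").fetchone()[0])  # → 2
```

Three flat rows collapse into two customer rows and three order rows; updating Alice's city now touches a single row instead of two.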

Denormalization is the process of optimizing database performance by adding redundant data or by grouping data. The goal of denormalization is to eliminate inefficiencies in a database. Examples of denormalization include materialized views that store the “many” objects in a one-to-many relationship as an attribute of the “one” relation, and the star schema, which consists of one or more fact tables referencing any number of dimension tables.
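Both examples can be sketched in a few lines of sqlite3. The sales schema below is hypothetical: a small star schema (one fact table referencing two dimension tables), plus a materialized-view-style summary table that precomputes aggregates of the “many” fact rows as attributes of the “one” dimension row.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Star schema: one fact table referencing dimension tables.
cur.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, day TEXT)")
cur.execute("""CREATE TABLE fact_sales (
    sale_id INTEGER PRIMARY KEY,
    product_id INTEGER REFERENCES dim_product(product_id),
    date_id INTEGER REFERENCES dim_date(date_id),
    amount REAL)""")
cur.executemany("INSERT INTO dim_product VALUES (?, ?)", [(1, "Lamp"), (2, "Desk")])
cur.executemany("INSERT INTO dim_date VALUES (?, ?)", [(1, "2021-01-01")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(1, 1, 1, 20.0), (2, 1, 1, 25.0), (3, 2, 1, 100.0)])

# Materialized-view-style denormalization: store the count and sum of
# the "many" sales rows as attributes of the "one" product row.
cur.execute("""CREATE TABLE product_summary AS
               SELECT p.product_id, p.name,
                      COUNT(s.sale_id) AS sale_count,
                      SUM(s.amount)    AS total_amount
               FROM dim_product p JOIN fact_sales s USING (product_id)
               GROUP BY p.product_id, p.name""")

for row in cur.execute(
        "SELECT name, sale_count, total_amount FROM product_summary ORDER BY name"):
    print(row)
```

Reports can now read `product_summary` directly instead of aggregating the fact table on every query; the trade-off is that the summary must be rebuilt or maintained as new sales arrive.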

However, denormalization carries certain risks. It can only be accomplished with good knowledge of the database, and it should only be undertaken when performance problems indicate that it is needed. One must also consider the effort required to denormalize the database and to propagate updates to the redundant data.

Certain disadvantages of denormalization exist as well. Database updates typically become slower, because redundant copies of the data must be kept in sync. Also, because denormalization is specific to a particular database design, it must be reevaluated whenever that design changes. Lastly, denormalization can increase the size of tables, slowing some retrieval operations.

However, there are also significant advantages to denormalizing a database. Denormalization improves performance by cutting down on the need for joins, and it speeds up reads because frequently accessed data is readily available in one place.
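The join-avoidance advantage, and its update cost, can be seen in one sketch (again using a hypothetical customers/orders schema in sqlite3): a redundant copy of the customer's city is stored on each order, so reads skip the join, but every change to the city must now be applied in two places.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, city TEXT)")
cur.execute("""CREATE TABLE orders (
    order_id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customers(customer_id),
    customer_city TEXT)""")  # redundant copy of customers.city
cur.execute("INSERT INTO customers VALUES (1, 'Boston')")
cur.execute("INSERT INTO orders VALUES (10, 1, 'Boston')")

# Normalized read path: a join is required.
joined = cur.execute("""SELECT o.order_id, c.city
                        FROM orders o JOIN customers c USING (customer_id)""").fetchall()

# Denormalized read path: the same answer, no join.
direct = cur.execute("SELECT order_id, customer_city FROM orders").fetchall()
assert joined == direct  # both return [(10, 'Boston')]

# The cost: every change to customers.city must also touch orders.
cur.execute("UPDATE customers SET city = 'Chicago' WHERE customer_id = 1")
cur.execute("UPDATE orders SET customer_city = 'Chicago' WHERE customer_id = 1")
```

On a toy table the difference is invisible, but on large tables eliminating a join on the hot read path is exactly the payoff denormalization is after.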


