September 22, 2021



 What is normalization, and what role does it play in the database design process?

Data normalization is the process of evaluating and correcting table structures to minimize data redundancy. Poorly structured tables can introduce serious problems into a database, including data anomalies (errors that arise when inserting, updating, or deleting rows). To reduce data anomalies, the following techniques can be applied in sequence:

    1. A table is in first normal form (1NF) when it contains no repeating groups. The key ideas are:
      • Repeating group: multiple entries of the same type exist for a single key attribute occurrence
      • A relational table must not contain repeating groups
      • Normalizing the table structure reduces data redundancy
      • Conversion to 1NF follows three steps:
        • Step 1: Eliminate the repeating groups and eliminate nulls, so that each repeating-group attribute contains an appropriate data value
        • Step 2: Identify the primary key; it must uniquely identify each row, and a new composite key may need to be formed
        • Step 3: Identify all dependencies; dependencies are depicted with a dependency diagram
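The three steps above can be sketched in code. The example below uses a hypothetical course/student dataset (the table, column names, and values are illustrative, not from the original post): the "students" column holds a repeating group, which Step 1 flattens so that the composite key (course, student) from Step 2 uniquely identifies each row.

```python
# Hypothetical unnormalized data: the "students" column is a repeating
# group (several values of the same type stored in one row).
unnormalized = [
    {"course": "DB101", "instructor": "Lee", "students": ["Ana", "Bo", "Cy"]},
    {"course": "DB102", "instructor": "Kim", "students": ["Bo", "Dee"]},
]

def to_1nf(rows):
    """Step 1: eliminate the repeating group -- emit one row per
    (course, student). Step 2: the composite primary key is now
    (course, student), which uniquely identifies each row."""
    flat = []
    for row in rows:
        for student in row["students"]:
            flat.append({"course": row["course"],
                         "instructor": row["instructor"],
                         "student": student})
    return flat

table_1nf = to_1nf(unnormalized)
# Every attribute now holds a single value; Step 3 would then chart the
# dependencies, e.g. (course, student) -> instructor.
```

Note that the 1NF table still repeats the instructor once per enrolled student; removing that redundancy is what the higher normal forms address.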

    2. A table is in second normal form (2NF) when it is in 1NF and it contains no partial dependencies: no attribute depends on only a portion of the primary key.
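Continuing the hypothetical course example (names and values are illustrative): in a 1NF table keyed by (course, student), "instructor" depends only on "course", which is just part of the composite key. That is a partial dependency, and one way to sketch the 2NF decomposition is:

```python
# 1NF table keyed by (course, student); "instructor" depends only on
# "course" -- a partial dependency.
table_1nf = [
    {"course": "DB101", "instructor": "Lee", "student": "Ana"},
    {"course": "DB101", "instructor": "Lee", "student": "Bo"},
    {"course": "DB102", "instructor": "Kim", "student": "Dee"},
]

def to_2nf(rows):
    # COURSE table: the partially dependent attribute moves out, keyed
    # by the part of the key it actually depends on (course).
    courses = {r["course"]: {"course": r["course"],
                             "instructor": r["instructor"]}
               for r in rows}
    # ENROLLMENT table: keyed by the full composite key (course, student).
    enrollment = [{"course": r["course"], "student": r["student"]}
                  for r in rows]
    return list(courses.values()), enrollment

courses, enrollment = to_2nf(table_1nf)
# "Lee" is now stored once in COURSE rather than once per enrolled
# student, so changing an instructor touches a single row.
```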

    3. A table is in third normal form (3NF) when both of the following are true: it is in 2NF, and it contains no transitive dependencies (no non-key attribute depends on another non-key attribute).
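A transitive dependency can be sketched the same way. In the hypothetical COURSE table below (again, illustrative data), "office" depends on "instructor", which in turn depends on the key "course" — so "office" depends on the key only transitively, and 3NF moves it into its own table:

```python
# Hypothetical COURSE table with a transitive dependency:
# course -> instructor -> office.
courses = [
    {"course": "DB101", "instructor": "Lee", "office": "B12"},
    {"course": "DB102", "instructor": "Kim", "office": "C04"},
    {"course": "DB103", "instructor": "Lee", "office": "B12"},
]

def to_3nf(rows):
    # INSTRUCTOR table (instructor -> office) removes the transitive
    # dependency; "office" now depends directly on its table's key.
    instructors = {r["instructor"]: {"instructor": r["instructor"],
                                     "office": r["office"]}
                   for r in rows}
    course_table = [{"course": r["course"], "instructor": r["instructor"]}
                    for r in rows]
    return course_table, list(instructors.values())

course_table, instructor_table = to_3nf(courses)
# If Lee moves offices, only one INSTRUCTOR row changes -- no risk of
# updating some courses and missing others.
```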
