Data or data design elements that do not conform to the rules of data normalization. Denormalized data structures are often used in databases to provide rapid access for specific user needs. Denormalization usually results in some degree of data redundancy in a data record. A process of combining like data into a single entity (table or file); this combining creates duplicate data.
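As a rough illustration of this combining (a sketch only, using Python's built-in sqlite3 module and made-up customer/orders tables), the snippet below folds two normalized tables into a single denormalized table in which the customer's name is duplicated on every order row:

import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Normalized design: customer data is stored once and referenced by orders.
cur.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO customer VALUES (?, ?)", [(1, "Ada"), (2, "Grace")])
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(10, 1, 25.0), (11, 1, 40.0), (12, 2, 15.0)])

# Denormalized design: one combined table; the customer name is now repeated
# on every order row, trading redundancy for join-free reads.
cur.execute("""
    CREATE TABLE order_denorm AS
    SELECT o.id, o.amount, c.name AS customer_name
    FROM orders o JOIN customer c ON c.id = o.customer_id
""")

for row in cur.execute("SELECT * FROM order_denorm"):
    print(row)  # (10, 25.0, 'Ada'), (11, 40.0, 'Ada'), (12, 15.0, 'Grace')

conn.close()

The duplicated name is the redundancy these definitions refer to: a change to a customer's name must now be applied to every one of that customer's order rows.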
(RM) Intentionally "breaking the rules" of normal forms.
An intentional violation of the rules of normalization done to increase the performance of a database. It typically occurs in varying degrees during all phases of physically implementing a database. Database designs are often denormalized to accomplish a specific performance-related goal. Denormalization cannot be done without a thorough understanding of the data and the needs of the customer.
The opposite of data normalization (almost). In a denormalized database, some duplicated data storage is allowed. The benefits are quicker data retrieval and/or a database structure that is easier for end users to work with.
The technique of placing normalized data in a physical location that optimizes the performance of the system.
Initial database designs are often normalized so that information is recorded once and only once, and cross-related to all related data. This gives a very flexible design, but one that can sometimes perform poorly. Denormalization is the design process by which controlled replication of information, derived columns and, in rare cases, repeating data are introduced to enable the system to meet a specific performance goal.
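The derived columns mentioned above can be sketched in the same way; the example below (again sqlite3, with hypothetical orders and line_item tables) stores an order total that could always be recomputed from the line items, accepting the cost of keeping it in sync in exchange for reads that need no join or aggregation:

import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL DEFAULT 0)")
cur.execute("CREATE TABLE line_item (order_id INTEGER, qty INTEGER, price REAL)")

cur.execute("INSERT INTO orders (id) VALUES (1)")
cur.executemany("INSERT INTO line_item VALUES (?, ?, ?)",
                [(1, 2, 10.0), (1, 1, 4.5)])

# The derived (redundant) column must be refreshed whenever line items change;
# this maintenance cost is the price paid for the faster read.
cur.execute("""
    UPDATE orders
    SET total = (SELECT SUM(qty * price) FROM line_item
                 WHERE line_item.order_id = orders.id)
    WHERE id = 1
""")

print(cur.execute("SELECT total FROM orders WHERE id = 1").fetchone())  # (24.5,)

conn.close()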
The process of restructuring a normalized data model to accommodate operational constraints or system limitations.
Denormalization is the process of attempting to optimize the performance of a database by adding redundant data. It is sometimes necessary because current DBMSs implement the relational model poorly. A true relational DBMS would allow for a fully normalized database at the logical level, while providing physical storage of data that is tuned for high performance.