The founding fathers of the relational database, using a combination of mathematics and black magic, came up with a set of rules defining the make-up of a pure relational database. Nearly all of these rules have been bent or ignored by relational database vendors and implementers, and for good reason: a compromise between purity and performance is essential in the real world. A good example of such a compromise is the degree of normalization implemented in a database. Normalization is measured on a scale of one to five "Normal Forms" - a kind of Richter Scale of database purity.

The first normal form states that one value in a table should be just that - it is acceptable to have a well name column with one value per row, but not a well names column with a group of well names in a single field. The second normal form requires that each row in a table be uniquely identifiable, while the third eliminates redundancy of non-key information. While these three requirements combine to make for robust database design and simple maintenance, they can have a negative effect on performance. Because of this they are frequently only partially applied.
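As a minimal sketch of the first three normal forms described above - the table and column names here are hypothetical illustrations, not taken from the article - consider an in-memory SQLite database:

```python
import sqlite3

# Hypothetical schema for illustration only. An in-memory SQLite
# database keeps the sketch self-contained.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Violates first normal form: a "well_names" column holding a group of
# well names in a single field makes per-well queries awkward.
cur.execute("CREATE TABLE field_denormalized (field TEXT, well_names TEXT)")
cur.execute("INSERT INTO field_denormalized VALUES ('Brent', 'B-1,B-2,B-3')")

# First normal form: one well name per row. The primary key makes each
# row uniquely identifiable (the second normal form's requirement), and
# facts about a field would live in a separate field table rather than
# being repeated on every well row (the third normal form).
cur.execute("""
    CREATE TABLE well (
        well_name TEXT PRIMARY KEY,
        field     TEXT
    )
""")
cur.executemany(
    "INSERT INTO well VALUES (?, ?)",
    [("B-1", "Brent"), ("B-2", "Brent"), ("B-3", "Brent")],
)

# Per-well queries are now trivial:
cur.execute("SELECT well_name FROM well WHERE field = 'Brent' ORDER BY well_name")
wells = [row[0] for row in cur.fetchall()]
print(wells)  # → ['B-1', 'B-2', 'B-3']
conn.close()
```

The performance trade-off the article mentions is visible even here: the normalized design needs a join (or a second table lookup) to assemble field-level and well-level data together, which is exactly the cost implementers weigh against purity.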
© Oil IT Journal - all rights reserved.