30 September, 13:16

For a DBMS to be efficient and more functional, data redundancy needs to be eliminated. This can be accomplished by creating smaller tables, with stable data structures, using what process?

a.) Data administration

b.) Data dictionary

c.) Data normalization

d.) Data retrieval

Answers (1)
  1. 30 September, 13:24
    c.) Data normalization

    Explanation:

    Data normalization is the process of organizing the tables in a database so that data redundancy, and the anomalies that result from data manipulation, are eliminated. Normalization arranges the data into tables, removes redundant and duplicated values, and breaks a large table into smaller tables. These smaller tables can then be associated with one another through relationships.

    Normalization also reduces the problems caused by data modification, commonly called data anomalies: insertion, deletion, and update anomalies. For example, if a large table holds redundant or repeated data, that data wastes storage space, and updating it is error-prone because the same value must be changed in many rows, which can leave the data inconsistent or lead to data loss.

    The three basic normal forms are First Normal Form (1NF), Second Normal Form (2NF), and Third Normal Form (3NF); others include Boyce-Codd Normal Form (BCNF) and Fourth Normal Form (4NF). 1NF requires that the columns of a table (also called attributes) hold atomic values, 2NF removes partial dependencies, and 3NF removes transitive dependencies.
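    The splitting described above can be sketched in code. The following is a minimal illustration (using Python's built-in sqlite3 module; the table and column names are made up for the example): a flat table repeats each customer's name and city on every order row, while the normalized design stores customer data once and lets orders reference it by key, so an update touches exactly one row.

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()

    # Denormalized: customer name and city are repeated on every order row,
    # so changing a city means updating many rows (an update anomaly).
    cur.execute("""CREATE TABLE orders_flat (
        order_id INTEGER PRIMARY KEY,
        customer_name TEXT,
        customer_city TEXT,
        item TEXT)""")
    cur.executemany("INSERT INTO orders_flat VALUES (?,?,?,?)", [
        (1, "Alice", "Lahore", "pen"),
        (2, "Alice", "Lahore", "book"),
        (3, "Bob", "Karachi", "pen"),
    ])

    # Normalized: the large table is split into two smaller tables.
    # Customer data lives once; orders reference it through a key.
    cur.execute("""CREATE TABLE customers (
        customer_id INTEGER PRIMARY KEY,
        name TEXT,
        city TEXT)""")
    cur.execute("""CREATE TABLE orders (
        order_id INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customers(customer_id),
        item TEXT)""")
    cur.executemany("INSERT INTO customers VALUES (?,?,?)",
                    [(1, "Alice", "Lahore"), (2, "Bob", "Karachi")])
    cur.executemany("INSERT INTO orders VALUES (?,?,?)",
                    [(1, 1, "pen"), (2, 1, "book"), (3, 2, "pen")])

    # Updating Alice's city now changes exactly one row, and every order
    # sees the new value through the relationship (a JOIN).
    cur.execute("UPDATE customers SET city = 'Multan' WHERE name = 'Alice'")
    rows = cur.execute("""SELECT o.order_id, c.name, c.city, o.item
                          FROM orders o JOIN customers c
                            ON o.customer_id = c.customer_id
                          ORDER BY o.order_id""").fetchall()
    print(rows)
    ```

    Running this prints all three orders with Alice's updated city, even though only one row was modified, which is exactly the redundancy elimination the answer describes.
    
    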