Subject: Re: BI Portal site re-launches!
Newsgroups: lugnet.off-topic.geek
Date: Wed, 10 Jul 2002 12:15:47 GMT

In lugnet.build, Jude Beaudin writes:
In lugnet.build, Larry Pieniazek writes:
In lugnet.build, Jake McKee writes:
In lugnet.build, Ross Crawford writes:

As Oliver says, de-normalising the database may be an effective way of
reducing the server load. Store the tree total against each category, and
update it when a link is submitted.

Can you explain this a little more? I'm intrigued by the concept, and
completely confused. My (limited) knowledge of databases tells me that
normalizing the DB is a way to add efficiency. What am I missing in your
suggestion?

Normalising makes things more space-efficient (by reducing storage of
redundant information); denormalising makes things more efficient at runtime
(by reducing joins).

Storing the tree total against each category is an example of keeping
redundant data instead of calculating it on the fly. (I forget if this is
technically a kind of denormalization, and if it is, whether it breaks "first
normal form" or "second normal form" or what... I play a DBA at clients but
am not one, technically.) Rather than running a query to calculate the total
every time you're asked for it, you store the value and recalculate it only
when you know it would have changed (when someone adds or removes something).
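
For example, here's a rough sketch of the idea using Python's built-in
sqlite3 (the category/link schema and all the names are invented for
illustration):

import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE category (
        id         INTEGER PRIMARY KEY,
        name       TEXT,
        link_count INTEGER NOT NULL DEFAULT 0   -- the redundant, stored total
    );
    CREATE TABLE link (
        id          INTEGER PRIMARY KEY,
        category_id INTEGER REFERENCES category(id),
        url         TEXT
    );
""")

def add_link(category_id, url):
    # Update the stored total at the moment the data changes,
    # instead of COUNT(*)-ing the links on every page view.
    db.execute("INSERT INTO link (category_id, url) VALUES (?, ?)",
               (category_id, url))
    db.execute("UPDATE category SET link_count = link_count + 1 "
               "WHERE id = ?", (category_id,))
    db.commit()

# The read side then becomes a cheap single-row fetch:
#   SELECT name, link_count FROM category WHERE id = ?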

Other kinds of time/space tradeoffs include things like moving code-table
values into the main rows, so reads don't have to join against the lookup
table.
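
A quick sketch of that one too (again, the table names are made up):

import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    -- Normalized: the label lives once in a code table; reads need a join.
    CREATE TABLE status_code (code INTEGER PRIMARY KEY, label TEXT);
    CREATE TABLE item_norm (id INTEGER PRIMARY KEY,
                            status INTEGER REFERENCES status_code(code));
    -- Denormalized: the label is copied into every row; reads skip the join.
    CREATE TABLE item_denorm (id INTEGER PRIMARY KEY, status_label TEXT);
""")
# Normalized read pays for a join at runtime:
db.execute("SELECT i.id, s.label FROM item_norm i "
           "JOIN status_code s ON s.code = i.status")
# Denormalized read is a single-table select:
db.execute("SELECT id, status_label FROM item_denorm")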

Try this URL:

http://www.palslib.com/Fundamentals/Database_Design.html

particularly this article

http://www.dbpd.com/vault/9804date.htm

Note that fully normalized databases are the most compact and have no
redundant information (and thus have the highest data integrity), but they
are also the slowest to query, since reads have to reassemble data across
more joins.

The downside to denormalizing is that you have to remember to update your
data in every place it's stored, whenever it changes.

Depending on how complex you want to make your code, it can be worth it to
denormalize in some places to allow quicker searches. Databases are as much
art as science. You're better off designing the data model in third normal
form and then figuring out where to denormalize it than designing it
denormalized to begin with. (Follow the rules before you use the exceptions.)
This way you can have the best of both worlds: efficient size and speed.

It also depends on the database usage: if updates are frequent, de-normalised
databases start losing their efficiency, since every update has to touch the
redundant copies as well. But with a mostly-read database such as this, that
shouldn't be a big issue.

You can make the update code simpler by using stored procedures and/or
triggers. You can write a trigger on "insert instruction" to add 1 to the
category count, etc., so that whenever a new link is submitted the category
total is always updated correctly.
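
Something like this, sketched here against SQLite (trigger syntax varies by
database, and the table and column names are just for illustration):

import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE category (id INTEGER PRIMARY KEY,
                           link_count INTEGER NOT NULL DEFAULT 0);
    CREATE TABLE link (id INTEGER PRIMARY KEY, category_id INTEGER);

    -- The triggers keep the stored total right no matter which code
    -- path inserts or deletes the row, so nothing has to "remember".
    CREATE TRIGGER bump_count AFTER INSERT ON link
    BEGIN
        UPDATE category SET link_count = link_count + 1
        WHERE id = NEW.category_id;
    END;

    CREATE TRIGGER drop_count AFTER DELETE ON link
    BEGIN
        UPDATE category SET link_count = link_count - 1
        WHERE id = OLD.category_id;
    END;
""")
db.execute("INSERT INTO category (id) VALUES (1)")
db.execute("INSERT INTO link (category_id) VALUES (1)")
print(db.execute("SELECT link_count FROM category").fetchone()[0])  # prints 1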

ROSCO


