Subject: Re: BI Portal site re-launches!
Newsgroups: lugnet.build, lugnet.off-topic.geek
Followup-To: lugnet.off-topic.geek
Date: Tue, 9 Jul 2002 15:25:14 GMT
  
In lugnet.build, Larry Pieniazek writes:
In lugnet.build, Jake McKee writes:
In lugnet.build, Ross Crawford writes:

As Oliver says, de-normalising the database may be an effective way of
reducing the server load. Store the tree total against each category, and
update it when a link is submitted.

Can you explain this a little more? I'm intrigued by the concept, and
completely confused. My (limited) knowledge of databases tells me that
normalizing the DB is a way to add efficiency. What am I missing in your
suggestion?

Normalising makes things more space efficient (by reducing storage of
redundant information), denormalising makes things more efficient at runtime
(by reducing joins).

Storing the tree total against each category is an example of keeping
redundant data instead of calculating it on the fly (I forget if this is a
kind of denormalization or not and if it is I forget whether it's "first
normal form" or "second normal form" or what... I play a DBA at clients but
am not one, technically). Rather than doing a query to calculate the total
when you are asked to, you store the info and recalculate when you know the
value would have changed (when someone adds or removes something).
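A minimal sketch of that idea in Python with SQLite (the table and column names here are made up for illustration, not taken from the actual portal's schema). The `link_count` column is the redundant, denormalised total; it gets updated in the same transaction as the insert, so it can't drift out of sync with the real data:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE category (
        id INTEGER PRIMARY KEY,
        name TEXT,
        link_count INTEGER NOT NULL DEFAULT 0  -- redundant, denormalised total
    );
    CREATE TABLE link (
        id INTEGER PRIMARY KEY,
        category_id INTEGER REFERENCES category(id),
        url TEXT
    );
    INSERT INTO category (id, name) VALUES (1, 'Castle');
""")

def add_link(con, category_id, url):
    # Update the stored total in the same transaction as the insert,
    # so the redundant count never disagrees with the link rows.
    with con:
        con.execute("INSERT INTO link (category_id, url) VALUES (?, ?)",
                    (category_id, url))
        con.execute("UPDATE category SET link_count = link_count + 1 "
                    "WHERE id = ?", (category_id,))

add_link(con, 1, "http://example.com/castle-instructions")
add_link(con, 1, "http://example.com/drawbridge")

# Reads are now a cheap single-row lookup instead of a COUNT(*) query:
(count,) = con.execute(
    "SELECT link_count FROM category WHERE id = 1").fetchone()
print(count)  # 2
```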

Other kinds of time/space tradeoffs include things like moving code table
values into the main rows.
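For example (again with illustrative names, not the real schema): instead of storing only a status code and joining to a code table for the human-readable label, you copy the label into the main row and skip the join at read time, at the cost of repeating it in every row:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE status_code (code TEXT PRIMARY KEY, label TEXT);
    INSERT INTO status_code VALUES ('A', 'Approved'), ('P', 'Pending');

    -- Normalised: store only the code, join for the label.
    CREATE TABLE submission (id INTEGER PRIMARY KEY, status_code TEXT);

    -- Denormalised: carry the label in the row itself.
    CREATE TABLE submission_wide (id INTEGER PRIMARY KEY,
                                  status_code TEXT, status_label TEXT);
""")
con.execute("INSERT INTO submission VALUES (1, 'A')")
con.execute("INSERT INTO submission_wide VALUES (1, 'A', 'Approved')")

# Normalised read needs a join:
row = con.execute("""SELECT s.id, c.label FROM submission s
                     JOIN status_code c ON c.code = s.status_code""").fetchone()
# Denormalised read does not:
wide = con.execute("SELECT id, status_label FROM submission_wide").fetchone()
print(row, wide)  # (1, 'Approved') (1, 'Approved')
```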

Try this URL:

http://www.palslib.com/Fundamentals/Database_Design.html

particularly this article

http://www.dbpd.com/vault/9804date.htm

Note that fully normalized databases are the most compact and have no
redundant information (and thus have the highest data integrity), but they
are also the slowest to query.

The downside to denormalizing is you have to remember to update your data in
several places whenever you update it.

Depending on how complex you want to make your code, it can be worth it to
denormalize in some places to allow quicker searches. Databases are as much
art as science. You are better off designing the data model in third normal
form and then figuring out where to denormalize it than designing it
denormalized to begin with. (Follow the rules before you use the exceptions.)
This way you can have the best of both worlds, efficient size and speed.

Jude

XFUT o-t.geek


