Thursday, April 22, 2004
TQuery vs TTable components: why upgrade?
Problem/Question/Abstract:
Using TTable was absolutely normal in the recent past, when the data resided locally, but it is a common (and serious) mistake in a modern, networked environment.
Why should I use Query components instead of Tables? Is changing existing projects worth the effort?
Answer:
Until recently, many programmers were told to use Table datasets extensively in their programs instead of queries.
In a single-user environment, where all the data resides locally, this can be an easy and rapid solution for basic needs, and until recently most programs seemed to function perfectly well with this approach.
But a network environment is NOT a single-user environment: let's take a brief look at the differences between them.
First of all, tables shared on a network are often far bigger than a single user's, since an organization naturally accumulates more data than any one person does: for example, I have seen Paradox tables of more than 80 MB, truncated every six months because they were too big to be used with the BDE.
The BDE simply cannot manage huge quantities of data with the local drivers (Paradox, dBase). The problem is that with tables you have to load ALL the data into memory: if you are lucky, the programs are merely slow and take dozens of megabytes of RAM. If you are not so lucky, index corruption problems affect your databases several times a day as well, as a consequence of concurrent access to big physical tables.
Secondly, you have no control over which fields you fetch: think of a catalog.db that also contains a big BLOB field holding a TIFF boxshot of each product. If you use tables, you have to load it into memory even when you don't need it; for example, an accountant or the CEO may be interested only in the essential columns of the catalog (SKU, DESC, PRICE).
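With a TQuery you simply ask for the columns you need. A minimal sketch (the table name CATALOG and the column names are assumptions for illustration, as is the Query1 component on the form):

```pascal
// Hypothetical form method: Query1 is a TQuery with its DatabaseName
// alias already set. Only three small columns travel over the network;
// the TIFF BLOB column never leaves the server or file share.
procedure TForm1.LoadCatalogSummary;
begin
  Query1.Close;
  Query1.SQL.Clear;
  Query1.SQL.Add('SELECT SKU, DESCRIPTION, PRICE FROM CATALOG');
  Query1.Open;
end;
```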
Table components load every field into memory, which can fill the RAM of the client PCs, so it is a real cost for the company, too. BE ADVISED: adding persistent field definitions to the table does NOT solve the problem! The undefined fields are still loaded into the PC's RAM, and their data still flows across the network, creating a great traffic jam on it!
Thirdly, not only do you have to load ALL THE FIELDS into memory, even the ones you don't want; you also have to load ALL THE ROWS of the table from the network share. Most middleware (the BDE included) does this even if you try to "filter" the table, because the filter is applied on the client after the rows have already crossed the wire.
Finally, if you have to calculate average, minimum, or maximum values per category inside a database, with a Table component you have to write that code yourself; with Query components you can use aggregate functions instead, making the server work for you and radically simplifying development. Using TQueries, you can produce advanced statistics in a snap.
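For instance, the per-category statistics just mentioned collapse into a single statement (a sketch; the CATALOG table and its CATEGORY and PRICE columns are assumed names):

```pascal
// Hypothetical example: per-category price statistics computed by the
// database engine instead of by hand-written Delphi loops over a TTable.
Query1.Close;
Query1.SQL.Text :=
  'SELECT CATEGORY, AVG(PRICE), MIN(PRICE), MAX(PRICE) ' +
  'FROM CATALOG GROUP BY CATEGORY';
Query1.Open;
```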
Moreover, if you use a database server such as InterBase or MS SQL Server, the server takes charge of filtering the result set, reducing the CPU, bandwidth, and RAM requirements of the client by up to 95% and giving the client program a quicker response to the user's actions.
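Server-side filtering is just a WHERE clause, ideally with a parameter so the statement can be prepared once and reused (again, the table and column names are assumptions):

```pascal
// Hypothetical example: only the rows matching the condition are sent
// back by the server; the client never sees the rest of the table.
Query1.Close;
Query1.SQL.Text :=
  'SELECT SKU, DESCRIPTION, PRICE FROM CATALOG WHERE PRICE > :MinPrice';
Query1.ParamByName('MinPrice').AsFloat := 100.0;
Query1.Open;
```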
Despite the (small) effort needed to change existing projects, I suggest that all programmers, in particular those with TTable components in their programming DNA, use Queries in old projects too, and not only in new ones; this is by far the best way to achieve better performance in your EXEs.
Remember: using Queries can lead to incredible performance gains in CPU, bandwidth, and RAM usage, even on older machines, extending their productive life and reducing costs; the savings achieved are well worth the (small) effort needed.