So you are a DB2 DBA in a production environment and got stuck with a packaged app... You cannot change the application, nor can you change the database schema... oops...
Maybe the application is using non-parameterized queries!!! Or maybe it is sorting far too much data just to fetch the first few records, say on the basis of some date column... Or too much data is being stored in the same table, and the record count forces a table scan!!! :(
...and many more problems that could have been avoided at design time, but you cannot help it now... You have to pay for any changes... :-o
What to do???
A few tips... (*make sure you are on the latest DB2 version)
1) If you find that the application is not using parameterized queries and the same query is being compiled again and again, DB2 9.7 has something special for you: the database configuration parameter STMT_CONC (the statement concentrator). This makes the query compiler replace the literals with bind variables; if a similar query is identified in the package cache, its access plan is reused instead of the statement being compiled again...
update db cfg using STMT_CONC LITERALS
Those of you who are coming from an Oracle background can think of CURSOR_SHARING.
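For instance (the table and literal values here are made up for illustration), with STMT_CONC set to LITERALS the following two statements share a single package cache entry:

```sql
-- Both statements are internally rewritten to something like
--   SELECT * FROM orders WHERE order_date = :L0
-- so the second reuses the access plan compiled for the first.
SELECT * FROM orders WHERE order_date = '2011-01-01';
SELECT * FROM orders WHERE order_date = '2011-02-15';
```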
2) If your application fetches only the first few rows, but uses ROW_NUMBER for windowing and then filters for the top results... please do not do that... Use OPTIMIZE FOR N ROWS or FETCH FIRST N ROWS ONLY instead. Those of you who do so for ANSI compatibility, or are simply stuck with a packaged application, can go for setting OPTIMIZEFORNROWS in db2cli.ini.
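As a sketch (table and column names are invented), the rewrite looks like this:

```sql
-- Instead of windowing and then filtering:
--   SELECT * FROM (
--     SELECT t.*, ROW_NUMBER() OVER (ORDER BY txn_date DESC) AS rn
--     FROM transactions t
--   ) WHERE rn <= 10
-- simply ask for the first N rows:
SELECT *
FROM transactions
ORDER BY txn_date DESC
FETCH FIRST 10 ROWS ONLY
OPTIMIZE FOR 10 ROWS;
```

And when you cannot touch the SQL at all, the CLI keyword goes in db2cli.ini under your data source section (section name is an example):

```
[MYDB]
OPTIMIZEFORNROWS=10
```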
3) Well, this one is a bit of a trade-off... If you feel that the application is using queries which could have been written in a much better way, and the vendor is not ready to accept a change without funds... set the optimization level a little higher to allow DB2 to use more brain on the queries....
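For example (7 here is just an illustrative value; the default optimization class is 5, and higher classes mean more compilation time):

```sql
-- Database-wide default:
UPDATE DB CFG USING DFT_QUERYOPT 7;
-- Or per session, without touching the configuration:
SET CURRENT QUERY OPTIMIZATION 7;
```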
4) If you see that the issue is with only a few queries, do not change the optimization level; rather, use an optimization profile.
It is something like Oracle hints, but not exactly the same... I cannot explain the whole concept here, but do go through the article on developerWorks...
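Just to give a flavour (the schema, table, index name and statement below are all hypothetical), an optimization profile is an XML document you register in the SYSTOOLS.OPT_PROFILE table; it matches a statement by its text and attaches guidelines to it, for example forcing an index scan:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<OPTPROFILE VERSION="9.7.0.0">
  <STMTPROFILE ID="Force index access on ACCOUNTS">
    <STMTKEY><![CDATA[SELECT * FROM MYSCHEMA.ACCOUNTS WHERE ACC_TYPE = ?]]></STMTKEY>
    <OPTGUIDELINES>
      <IXSCAN TABLE="MYSCHEMA.ACCOUNTS" INDEX="ACC_TYPE_IDX"/>
    </OPTGUIDELINES>
  </STMTPROFILE>
</OPTPROFILE>
```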
5) If you feel the table should have been broken down into several entities (say, an ACCOUNTS table could have been divided into CREDIT ACCOUNTS and DEBIT ACCOUNTS, but clubbing them together has caused an enormous row count)... you can either actually break the table and create a view on it having the same (older) table name... or, a better way, use the DB2 range partitioning feature (oh baby, you gotta upgrade to v9).
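A minimal sketch of range partitioning (the table, columns and partition boundaries are invented for illustration):

```sql
CREATE TABLE accounts (
  acc_id    INTEGER NOT NULL,
  acc_type  CHAR(1) NOT NULL,   -- 'C' = credit, 'D' = debit
  open_date DATE    NOT NULL
)
PARTITION BY RANGE (acc_type)
  (PARTITION credit_accounts STARTING ('C') ENDING ('C') INCLUSIVE,
   PARTITION debit_accounts  STARTING ('D') ENDING ('D') INCLUSIVE);
```

Queries that filter on the partitioning column then touch only the relevant data partition, so the application keeps seeing one table while the scans get cheaper.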
6) You see that a table grows to a high number of records within a single transaction, and the same part/module of the application then deletes all the rows. Now the problems with this are:
i) at any given point of time the table has 0 records, unless this module is running
ii) whenever RUNSTATS runs (either automatic or your cron job), the table has 0 records
iii) when your module populates data, the stats are not refreshed immediately, but a fetch (probably in the same module) may happen immediately
iv) the optimizer thinks the table is empty and generates a wrong (costly) plan, avoiding any index scan
Well, this is a problem you will face with temporary processing tables. The actual business logic could have been achieved with global temporary tables, or maybe by retaining the rows until the next run of the module (rows deleted at the start and not at the end, hence your RUNSTATS always picks up a non-zero record count)... But that cannot be done now...
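Had a change been possible, a declared global temporary table would have kept this scratch data out of the catalog statistics altogether (names here are invented; a user temporary tablespace must exist first):

```sql
DECLARE GLOBAL TEMPORARY TABLE session.work_rows (
  id      INTEGER,
  payload VARCHAR(100)
) ON COMMIT PRESERVE ROWS NOT LOGGED;
```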
A trick... say:
alter table tabname volatile
This marks the cardinality of the table as volatile, so the optimizer prefers an index scan (if the table is indexed) regardless of what the statistics say.
7) Use the query compiler registry variables (set via db2set) to influence how queries are optimized before they get executed.
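For example (DB2_REDUCED_OPTIMIZATION is just one such variable; whether it helps depends entirely on your workload, so test before rolling out), registry variables are set at the instance level and take effect at the next restart:

```shell
db2set DB2_REDUCED_OPTIMIZATION=YES
db2stop
db2start
```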
8) Use DB2_WORKLOAD to set the proper category for your application. It sets a whole group of registry variables at once, and hence facilitates better performance with that application; e.g. it can be set for Cognos or SAP, etc...
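e.g. (SAP shown here; check the documented list of DB2_WORKLOAD values for your product):

```shell
db2set DB2_WORKLOAD=SAP
```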
Well, now I'm feeling sleepy... I will update the post if I can recollect more tips for DBAs STUCK with packaged apps...
- Singapore
- I am currently working as a DB Consultant with Ashnik. We are into Open Source and Cloud based enterprise class solutions. Prior to this I was working as a DBA @ Misys India (Banking Services), working on IBM DB2, Oracle and Microsoft SQL Server. Selected as IBM Champion for Data Management 2011. I like a bit of photography, cycling, poetry, and I just love gadgets!!