Placing Data for Performance. Fuhgeddaboutit!


Bill Cole – Competitive Sales Specialist, Information Management, IBM

I like racing. Not watching it on television. That’s like watching a peach rot. It’s not quite so bad in person, though. I raced cars on ovals and road tracks and drag strips – not that you care – and one thing every racer knows is position counts, even in drag racing. Every bowler knows the same thing, too. Put the ball in the right place and you’ve got the best shot at a strike. It’s the same for every sport, right? Lob the ball to the strike zone and duck!

Data is much the same way. (Bet you wondered where I was going.) It’s not all the same. Access patterns for each row of data are different. They change over time and we just don’t have the time or resources to make the changes that would keep database performance where it should be so the throughput we deliver meets the business needs. And those needs aren’t static, are they? What was adequate yesterday is deadly slow today. So we’re always chasing the needle, right?

This means that we’re spending time on the same tasks over and over. Grab a performance report (pick your favorites; I tend to favor nmon and Data Console) and set out on the expedition of re-discovering the same information. A few queries aren’t tuned correctly, memory needs a bit of tweaking here and there, and some disks are taking a severe beating. Same old things. We know how to fix each of them.

So why are we spending any time at all chasing disk access issues? Sure, there’s a little period of adjustment when a new application settles in and we learn its performance characteristics/anomalies. But the honeymoon has long been over and we’re still chasing the same things for applications and environments that are old frenemies.

I just sat through an hour-long dissertation by a red product manager that purported to explain how that database manages (for an extra license fee, of course) data heat. That is, understanding the access characteristics of data objects and then deploying them to the appropriate media. The reason for the license fee is the use of some different forms of compression. Did I mention that the DBA has to manage the whole scenario? Or that two of those forms of compression will slow you down? Drastically? [Rant off]

DB2 has been doing this for a few releases now, with no extra license fees and no DBA involvement. The whole process of managing data placement is done in the background: the database determines the access pattern and then moves the data to the appropriate media. No new bizarre compressions, no performance hits. Just set it and forget it. Sorry, that’s an old ad jingle. Just set up tablespaces on media with different characteristics.

The setup could be as mundane as RAID 5 and RAID 10, or as esoteric as flash devices for very high-performance data. And DB2 moves the data around for you based on access patterns – much easier than trying to determine yourself which tables need to be moved every month/week/day. No symbolic links to maintain. Let the database do the walking, as it were. After all, the database knows more about those patterns than we can ever discern through all those reports. And all of this is just part of the license fee for DB2. You get sleep as part of the deal. And who can argue with a few hours of extra sleep?
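For the curious, the building blocks here are DB2 storage groups (introduced in DB2 10.1) and the tablespaces you attach to them. A minimal sketch of the setup – the paths, storage group names, and tablespace name below are invented for illustration:

```sql
-- Define storage tiers on media with different characteristics
-- (paths and names are hypothetical).
CREATE STOGROUP sg_hot  ON '/db2/ssd';     -- flash for hot data
CREATE STOGROUP sg_warm ON '/db2/raid10';  -- fast spinning disk
CREATE STOGROUP sg_cold ON '/db2/raid5';   -- capacity tier

-- Start a tablespace on the hot tier.
CREATE TABLESPACE ts_orders USING STOGROUP sg_hot;

-- When that data cools off, reassign the tablespace;
-- DB2 rebalances the containers online, in the background.
ALTER TABLESPACE ts_orders USING STOGROUP sg_cold;
```

The point of the ALTER is that there’s no unload/reload and no symbolic links to juggle: the rebalance to the new media happens while the tablespace stays available.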

Oh, and compression is there because you have DB2. Lots of different kinds of compression to save space and improve performance. No exotic (read that “silly”) compression algorithms to decode. Just the best compression you can find for your database. Easy. Free. Fast. Any questions?

Finally, my wife won’t let me race any more. Says it scares her. But I miss the thrill of the race. Getting myself into position to win. It’s all about position. Where you can go the fastest. It’s always clean air when you’re in front. Just like your data. Life is so much easier when you’re in front of the performance curve rather than chasing the needle.

And please download the DB2 database poster and pin it to your wall. It’ll answer some questions and make you look even smarter!

Follow Bill Cole on Twitter: @billcole_ibm

Learn more about how simple DB2 with BLU Acceleration is to use.
