Inside IBM Insight 2014: See How Clients Are Revolutionizing Their Business With DB2

By Cassandra Desens
IBM Software Group, Information Management  

This will be my second year at Insight and I must say, it is my favorite time of the work year. As frenzied and busy as the planning is, the return on intangible investments is worth it. Ask anyone – after just a few short days in Vegas you walk away with a renewed sense of excitement around IBM, its products, lots of free stuff, and usually a head cold (I’m no doctor, but casinos are definitely germ incubators).

Last year I was amazed at the hundreds of events and opportunities the conference offered: demos, a city-sized EXPO floor, labs, 1-on-1s, large general and breakout sessions, product-specific and networking events…the list goes on and on. There are so many different ways to spend your short week with us at Mandalay Bay. The question is – what will you choose to attend?
Allow me to suggest the client-led sessions.

As I said in my last blog entry, “Balluff loves BLU Acceleration too”: “Hearing customer stories puts our product claims into perspective. Success stories give us the ultimate answer to the elusive question ‘How does this relate to me and my business?’”

There is nothing that gets me more excited than hearing how a customer was able to revolutionize their business with DB2 for Linux, UNIX and Windows, so today I want to recommend five Insight 2014 sessions where you can hear how companies around the world are leveraging DB2. I encourage you to weave these into your busy schedule next week. I promise you won’t be disappointed!

1) IWA-4145A: Keeping Data Analysts in the Pink: IBM DB2 with BLU Acceleration on Power System at BlueCross BlueShield of TN
Date & Time: Monday October 27th, from 2:00-3:00pm
Speakers: Randy Wilson, Lead DB2 for LUW DBA at BlueCross BlueShield of Tennessee, and Danny Arnold, Software Client Leader at IBM
Abstract: When BlueCross BlueShield of Tennessee (BCBS of TN) data analysts ran certain queries, the queries took hours to complete. In need of a solution, the company decided to test IBM DB2 10.5 with BLU Acceleration. The initial results were astounding, showing queries completing in just seconds, not hours. The company then installed DB2 10.5 with BLU Acceleration on a large test environment on the IBM POWER7 system. It worked with its data analyst team of IBM Cognos and SAS users to identify queries that could benefit from the BLU Acceleration technology. In this session, you learn about BCBS of TN’s experience with DB2 BLU and how it allows the company to make better business decisions.

2) IWS-5729A: IBM BLU Acceleration: Adoption Guidelines and Lessons Learned at Coca-Cola Bottling Company
Date & Time: Monday October 27th, from 3:30-4:30pm
Speakers: Andrew Juarez, Lead SAP Basis DBA at Coca-Cola Bottling Co., and Naresh Chainani, DB2 Development at IBM
Abstract: IBM DB2 with BLU Acceleration can provide performance improvements that range from 10 times to 25 times and beyond for analytical queries, with minimal tuning. If you are eager to understand how your business can take advantage of this technology for your warehouse or data marts, attend this session. You learn how quick and easy it is to get started with BLU. You see a specific real-world example based on the Coca-Cola Bottling Company’s experience with DB2 BLU, from installation to performance tests. You also see how DB2 BLU fits into the SAP BW environment.

3) IWS-5338B: Why the University of Toronto Loves BLU: Faster, Smaller, and Simpler in Only a Few Hours
Date & Time: Tuesday October 28th, 10:00-11:00am
Speakers: Brenda Boshoff, Senior Database Administrator at University of Toronto, and Adam Storm, Senior Technical Staff Member and Master Inventor at IBM
Abstract: It’s not every day that a new database technology promises revolutionary performance gains, improved compression and radically reduced time to value. It’s even less often that such aggressive claims can be validated by customers. This session takes you through the adoption cycle of IBM BLU Acceleration at the University of Toronto. You learn how an aggressive one-day proof of concept achieved dramatic performance gains in less than three hours. You gain insight into the process of taking BLU Acceleration through the QA cycle and into production. If you’re considering moving to BLU Acceleration, attend this session to see how the University of Toronto successfully adopted the technology in its environment and achieved dramatic results.

4) IWS-5787A: Simplify your SAP Landscape – Superior SAP BI Performance @ Yazaki with DB2 BLU
Date & Time: Wednesday October 29th, from 11:15am-12:15pm
Speakers: Juergen Laudien, Infrastructure Manager at Yazaki Europe; Bernd Ertl, Competitive Sales Lead at IBM; Karsten Stoehr, Competitive Database Sales Specialist at IBM; and Carsten Dieterle, SAP Infrastructure Solution Architect (SOLAR) – SAP Growth Program at IBM
Abstract: This session is about simplifying your SAP BW landscape by taking SAP BWA and HANA appliances out of the IT infrastructure and replacing them with DB2 BLU. Many DB2 clients have done this, and their experience in proofs of concept as well as production environments shows that DB2 BLU needs up to 85% fewer cores, less storage and less RAM compared to SAP BWA or HANA appliances. These hardware savings and the reduced software license costs allow clients to run their business faster and more efficiently on DB2 BLU.

5) IDB-4606A: Ameriprise Financial’s Deployment Experiences with IBM DB2 High Availability Disaster Recovery
Date & Time: Wednesday October 29th from 4:30-5:45pm
Speakers: Chip MicAllister, UDB Database Technical Lead at Ameriprise, and Dale McInnis, STSM and Chief Architect, IM Cloud Availability at IBM
Abstract: In this session, you learn about the high availability disaster recovery (HADR) feature in IBM DB2. Then you hear about Ameriprise Financial’s experience deploying DB2 HADR in its production environment.

Safe travels everyone and see you soon!

Balluff loves BLU Acceleration too

By Cassandra Desens
IBM Software Group, Information Management  

BLU Acceleration is a pretty darn exciting advancement in database technology. As a marketing professional, I can tell you why it’s cool:

  • BLU provides instant insight from real-time operational data
  • BLU provides breakthrough performance without the constraints of other in-memory solutions
  • BLU provides simplicity with a load-and-go setup
  • …etcetera, etcetera – you get the point.

You can read our brochures and watch our videos to hear how DB2 with BLU Acceleration will transform your business. We think it’s the best thing since sliced bread because we invented it. But is it all it’s cracked up to be? The answer is YES.

Clients all over the world are sharing how BLU Acceleration made a huge, positive difference to their business. Hearing customer stories puts our product claims into perspective. Success stories give us the ultimate answer to the elusive question “How does this relate to me and my business?” That’s why I want to share with you one of our most recent stories: Balluff.

Balluff is a worldwide company with headquarters in Germany. They have over 50 years of sensor experience and are considered a world leader and one of the most efficient manufacturers of sensor technology.  Balluff relies on SAP solutions to manage their business, including SAP Business Warehouse for their data analysis and reporting.

Over the last few years Balluff experienced significant growth, which resulted in slowed data delivery. As Bernhard Herzog, Team Manager Information Technology SAP at Balluff, put it: “Without timely, accurate information we risked making poor investment decisions, and were unable to deliver the best possible service to our customers.”

The company sought a solution that would transform the speed and reliability of their information management system. They chose DB2 with BLU Acceleration to accelerate access to their enormous amount of data. With BLU Acceleration, Balluff achieved:

  • Reduced reporting time for individual reports by up to 98%
  • Reduced backup data volumes by 30%
  • Improved batch-mode data processing by 25%
  • A swift transition with no customization needed; Balluff transferred 1.5 terabytes of data within 17 hours with no downtime

These improvements have a direct impact on their business. As Bernhard Herzog put it, “Today, sales staff have immediate access to real-time information about customer turnover and other important indicators. With faster access to key business data, sales managers at Balluff can gain a better overview, sales reps can improve customer service and the company can increase sales”.

Impressive, right? While you could argue it’s no sliced bread, it certainly is a technology that is revolutionizing reporting and analytics, and worth a try. Click here for more information about DB2 with BLU Acceleration and to take it for a test drive.

_________________________________________________________________

For the full success story, click here to read the Balluff IBM Case Study
You can also click here to read Balluff’s success as told by ComputerWoche (Computer World Germany). Open in Google Chrome for a translation option.

Exclusive Opportunity to Influence IBM Product Usability: Looking for Participants for Usability Test Sessions – Data Warehousing and Analytics

By Arno C. Huang, CPE
Designer, IBM Information Management Design
The IBM Design Team is seeking people with a variety of database, data warehousing and analytics backgrounds to participate in usability test sessions. We are currently looking for people who work in one of the following roles: DBA, Architect, Data Scientist, Business Analyst or Developer. As a test participant, you will provide your feedback about current or future designs we are considering, thus making an impact on the design of an IBM product and letting us know what is important to you.

Participating in a study typically consists of a web conference or on-site meeting scheduled around your availability. IBM will provide you with an honorarium for your participation. There are several upcoming sessions, so if you’re interested, we’ll help you find a session that best suits your schedule. If you are interested, please contact Arno C. Huang at achuang@us.ibm.com.

Rapid Insight With Results: Harnessing Analytics in the Cloud

By Basiruddin Syed
DB2 Social Marketing Manager

“It’s Now or Never” is a popular song recorded by Elvis Presley, and its lyrics seem to appeal to us now more than ever. Every day we are making decisions in our lives, both personal and financial. We are often contemplating which holidays to book, whether this is the right time to invest, and so on. We are constantly making decisions, and our decisions are based on information, user reviews, and recommendations.

When individuals are facing such pressure to make quick decisions, imagine how much harder it must be for larger organizations.

Decision-making can be regarded as the cognitive process resulting in the selection of a belief or a course of action among several alternative possibilities.

Individuals face a lot of pressure to make rapid decisions and choose the best course of action within a short span of time, and it is much more difficult for organizations, whose needs are much greater and whose time frames are even more compressed.

Today’s data-driven organization is faced with magnified urgency around data volume, user needs and compressed decision time frames. To address these challenges, organizations are exploring cloud-based BI and analytics technology to accelerate decision making and enhance business performance.

Find out more on how organizations are making this possible: “Rapid Insight With Results: Harnessing Analytics in the Cloud.”

 


Make Your Apps Highly Available and Scalable

By Vinayak Joshi
Senior Software Engineer, IBM

The IBM premium data-sharing technologies offer unmatched high-availability and scalability to applications. If you are a JDBC application developer wanting to explore how these benefits accrue to your application and whether you need to do anything special to exploit these benefits, my article – “Increase scalability and failure resilience of applications with IBM Data Server Driver for JDBC and SQLJ” – is a great source of information.

In the article, I explain how turning on a single switch on the IBM Data Server Driver for JDBC and SQLJ opens up all the workload balancing and high availability benefits to your JDBC applications. There is very little required for an application to unlock the workload balancing and high availability features built into the DB2 server and driver technologies.
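As a sketch of what that single switch looks like in practice (the property name follows the IBM Data Server Driver for JDBC and SQLJ conventions; the hostname, port, database and credentials below are placeholders, and actually connecting requires the driver jar on your classpath):

```java
import java.util.Properties;

public class WlbConnectionConfig {

    // Builds the connection properties; credentials here are placeholders.
    public static Properties wlbProperties() {
        Properties props = new Properties();
        props.setProperty("user", "dbuser");
        props.setProperty("password", "secret");
        // The single switch: enables workload balancing (and with it,
        // automatic client reroute) against pureScale or sysplex servers.
        props.setProperty("enableSysplexWLB", "true");
        return props;
    }

    public static void main(String[] args) {
        Properties props = wlbProperties();
        // Type 4 connectivity URL; host, port and database are placeholders.
        String url = "jdbc:db2://db2host.example.com:50000/SAMPLE";
        System.out.println(url + " -> enableSysplexWLB="
                + props.getProperty("enableSysplexWLB"));
        // With the IBM driver on the classpath you would then call:
        // Connection con = java.sql.DriverManager.getConnection(url, props);
    }
}
```

Everything else (transport pooling, rerouting on member failure) happens inside the driver once that property is set.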

For those curious about how the driver achieves this in tandem with pureScale and sysplex server technologies, the article should provide a good end-to-end view. While all the nuts-and-bolts explanations are provided, all of it happens under the covers; beyond a bare-minimum understanding, application developers and DBAs need not concern themselves with it too much if they do not wish to.

The aspects a developer needs to keep in mind are highlighted, and recommendations on configuring and tuning applications are provided. We’ve made efforts to keep the article technically accurate while keeping the language simple enough for a non-technical audience to grasp.

Any and all feedback will be much appreciated and taken into account. Take a look at the article by clicking here, and feel free to share your thoughts in the comment section below.

Data Analytics, or How Much Info for a Buck?

Bill Cole

Bill Cole – Competitive Sales Specialist, Information Management, IBM

Leave only footprints; take only pictures.  Have you seen that slogan in a national park?  My wife (she’s now an ex) didn’t believe the signs that told us to leave everything exactly where it was.  She didn’t want to just enjoy the beauty.  She wanted to take some home with us.  The flashing light of the Park Ranger car told me we were in trouble for picking up a few rocks along the side of the road.  The nice man in the Smokey hat told me to put the rocks back.  The scenery is for consumption with your eyes, your camera, not for taking home.  I did as instructed, happy to be leaving with my wallet in one piece.

I’ve always produced data and then turned it into information by adding other bits of data together and adding some context.  My users guided me for a while and then I both guided and pushed them.  This seemed to be the natural order of things, sort of like factories and the folks who buy the goods from those factories.

The IT/BI/DA teams accumulate and store the data and then massage to build what are essentially standard reports.  Standard reports are good for standard thinking, of course.  If you know the answer you’re looking for, a standard report probably has it in there somewhere, like those old balance sheets and ledgers that I ran so long ago.  But there was nothing in those reports that would help think outside of the data on those reports.  In fact, there was so little insight in them that one of the plant managers actually asked me what good these reports were.  There’s really not a good response to that one.

Insights are gained when the lines of business can chase an idea through all sorts of non-standard iterations.  Almost like chasing one of those happy mistakes from science, like penicillin, or those ubiquitous not-very-sticky note sheets that we all stick all over everything so we can easily keep track of passwords, etc.  LOL, like you haven’t done that.

So how do we get to this idea-chasing sort of thing?  This place where the data analysts or, better still, the line of business user can see something interesting and start chasing it?  This is a custom-developed solution, a virtual pair of bespoke shoes made for your situation and only your situation.  The person in the next cubicle needn’t look over your shoulder.  It would do them no good after all.  There’s a scene in the Maureen O’Hara/John Wayne movie “The Quiet Man” in which John asks directions and the local says “Do you see that road over there?  Don’t take it, it’ll do you no good.”  Insights are like that.  You need to know not to walk down a road that will do you no good.

The trick, it seems to me, is having the right tools.  Let’s start with the database (you know I’m a practicing DBA and that means all discussions start with the database).  DB2 BLU is exactly the right repository for your decision-making data.  After all, it offers both row- and column-oriented models in a single database!  This means you’re getting performance no matter which way your data chooses to be represented.  Moreover, there are different kinds of compression to ensure you save space and improve performance.  What could be better?  And all for the price of an upgrade!  Easy.  No-brainer.

There’s a neat coda to this, too.  You’re not confined to the old solution of finding a server, building it and installing the software, then building the database.  Let’s talk choices, folks.  Lots of choices.  Maybe every choice.  On premise, just like we’ve always done, works.  Maybe your own cloud would be better.  Build your BI/DA system in a PureFlex or PureApp or PureData cloud hosted in your own data center.  There’s a simple solution with lots of benefits, including workload management.  Set it and forget it and go on about your business.  Maybe DBaaS works better.  Virtualize the workload and database in an existing private cloud to make use of those “excess” mips.  (Parkinson’s Law says that any organization grows to fill all the space available.  I think the demand for mips grows to fill the available servers, thus negating the concept of “excess mips.”)  There’s SoftLayer for either a public or private cloud.  Remember, they’ll go all the way to bare metal if that’s what you need.  Finally, and maybe best, is DB2 BLU available in the cloud.  I championed this a while back and it’s now reality.  A pre-configured database that IBM manages and maintains, including backups and upgrades.  Talk about easy!  Go ahead, get some sleep.  We’ve got this one.

One last thought about the tools.  InfoSphere Analytics Server will do the analysis for you and present your users with suggested insights right out of the box.  And it will help the folks find their own insights by helping them look, filter and massage the data in any way that suits them.  It’s a cool tool for those times when you need the freedom to find your own way through the forest of data.

Finally, I’ve always kept two Robert Frost poems on my wall.  Perhaps “The Road Not Taken” (“Two roads diverged in a yellow wood…”) is the one for this post.  We in IT need to give the folks in the lines of business the right tools to chase down the new roads, new insights.  We’ll give them the GPS for the roads less traveled by.  Good luck on your journeys of exploration!

The other poem is “Stopping By Woods On a Snowy Evening,” of course.  We all have miles to go before we sleep, before our work is complete, and using the right tools makes those miles ever so much more productive.  Bundle up on those snowy evenings and enjoy the ride.

Follow Bill Cole on Twitter : @billcole_ibm

Visit the IBM BLU HUB to learn more about the next gen in-memory database technology!

It’s Obvious. It’s in the Data.

Bill Cole

Bill Cole, Competitive Sales Specialist, Information Management, IBM

You’ve had that experience, right?  Somebody says that the answer is in the data so you look harder and all you see is stuff.  There’s not a pattern within a grenade blast of this data.  Maybe if you had a bit more time you’d find it.  Or maybe having the data in the right format would make a difference.

We all know the traditional relational database isn’t a great platform for analyzing mass quantities of data.  Your OLTP relational database is built for processing small-ish transactions, maintaining data integrity in the face of an onslaught of concurrent users, all without regard to disk space or processor utilization.  Abuse the resources to get the performance you need!  To paraphrase Admiral Farragut: Ignore the checkbook, full speed ahead!

So we learned to build special-purpose structures for our non-transactional needs, and then manage the fallout as we tried to find anything that even smelled like (consistent) performance.  Each step forward in the data warehouse arena was a struggle.  We demanded resources or explained away failures with a wave of a disk drive or processor.

This situation was clearly not good for our mission of analyzing great chunks of data in a reasonable time.  Subsets of data – data marts – were used to work around our limitations.  But this meant we were either replicating data or losing some data that might be useful in other queries.  Clearly not the best of situations.

Our friends out in Almaden studied the problem and found that column-oriented tables were the best basis for a solution.  After all, we were gathering up large quantities of raw data and analyzing it, not processing OLTP transactions.  There would be little need for those annoying special-purpose structures.  Nor would we need any indexes.  All this would save lots of space and reduce processing time, too, so we could achieve not only predictable performance but VERY good performance.  The kind of performance our friends in the business needed to build better relationships with suppliers and customers.

The implementation of the new analytics platform is in DB2 10.5 with BLU Acceleration.  (The answer to why it’s called “BLU” is in an earlier blog entry.)  The very cool thing is that BLU is an option you can choose for either the entire database or just the analytics tables.  So you can have your traditional row-oriented tables and the column-oriented tables in a single database if that suits your design.  No need to learn and maintain a whole new technology just for your analytics.
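In DB2 10.5 that per-table choice is made with the ORGANIZE BY clause (the table and column names below are illustrative, not from any real schema):

```sql
-- Optional: make column organization the default for new tables
-- by setting the registry variable before creating the database:
--   db2set DB2_WORKLOAD=ANALYTICS

-- Analytics table: column-organized for BLU Acceleration
CREATE TABLE sales_fact (
    sale_date DATE,
    store_id  INTEGER,
    amount    DECIMAL(12,2)
) ORGANIZE BY COLUMN;

-- OLTP table: traditional row organization, in the same database
CREATE TABLE orders (
    order_id    INTEGER NOT NULL PRIMARY KEY,
    customer_id INTEGER,
    status      CHAR(1)
) ORGANIZE BY ROW;
```

Both tables live side by side, and queries can join across them.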

And we can’t forget the synergy with Cognos.  After all, the two products are developed just a few miles from each other.  Turns out the Cognos folks help the DB2 team by sharing typical analytics queries and the DB2 team uses those examples to tune the query engine.  Nice!  Of course, this helps out with the queries we build ourselves or through – gasp!  – other products.  Oh well, DB2 is there to make us all look good.

A quick refresher on column-oriented data.  The easiest way for me to think about it is that we’ve stood the database on its side so that instead of seeing everything in rows we’re seeing the data in columns grouped together.  A typical description of a table has the column names running across the top of the page, which is analogous to the way data is stored in most relational databases.  However, the column-oriented table has the data for a column grouped together, and the rows are built by assembling the data from the columns.  Not ideal for OLTP but excellent for processing gobs of data that’s particular to a group of columns.  (There’s a fuller discussion of this in a previous blog post.)  No need for indexes since we’re not looking for individual rows.
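A toy sketch of that layout difference (nothing like BLU’s actual storage format, which adds encoding, compression and register-friendly packing, but it shows why a single-column aggregate never touches the other columns):

```java
import java.util.Arrays;

public class ColumnVsRow {
    // Row-oriented: each record's fields stored together -- good for OLTP,
    // where you fetch or update one whole row at a time.
    static Object[][] rowStore = {
        {2013, "EMEA", 100.0},
        {2014, "APAC", 250.0},
        {2014, "EMEA", 175.0},
    };

    // Column-oriented: each column's values stored together -- an aggregate
    // over one column reads only that column's storage, with no index needed.
    static int[]    years   = {2013, 2014, 2014};
    static double[] amounts = {100.0, 250.0, 175.0};

    static double sumColumn(double[] column) {
        double total = 0;
        for (double v : column) total += v;
        return total;
    }

    public static void main(String[] args) {
        // In a row store, one row is already contiguous:
        System.out.println("row 1 = " + Arrays.toString(rowStore[1]));
        // In a column store, a row is rebuilt by taking position i
        // from every column array:
        System.out.println("row 1 = " + years[1] + ", " + amounts[1]);
        // But a column aggregate scans just one array:
        System.out.println("sum(amount) = " + sumColumn(amounts));
    }
}
```

The row rebuild is the price a column store pays; the single-array scan is the payoff for analytics.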

The sort of performance users have reported with DB2 and BLU Acceleration is nothing short of amazing.  Double-digit improvements in throughput.  And it’s this reliably predictable performance that allows us to build those applications that require sub-second analysis.  You know the ones I’m talking about.  While you are on the phone or a web site, the agent or the site offers you options based on YOUR previous interactions, not just options for any random caller or user.  The options are specific because we can analyze the data in the time you’re on the phone or on the site.

Finally, I’m told the mark of genius is being able to connect seemingly random dots into a pattern.  You know those folks who are at the conclusion while the rest of us are still just looking at the dots.  You don’t need a genius if you’ve got BLU!  You’ll find that pattern/information gem in record time, too.  And you’ll show the business that you’re delivering the data they need when they need it.

Learn more about the innovative technology in BLU Acceleration through this video series on YouTube!