Revolutionizing analytics with IBM DB2 for SAP

by Sajan Kuttappa, Content Marketing Manager

In-memory computing has gained significant traction as a way to meet the demands of use cases such as real-time analytics. By some estimates, close to 75 percent of in-memory computing use cases involve real-time analytics.

Consider the example of Knorr-Bremse, a leading manufacturer of braking systems for rail and commercial vehicles, which adopted in-memory technology to scale seamlessly. As Knorr-Bremse grew its global presence through acquisitions and joint ventures and expanded its product portfolio for the automotive industry, it realized that speed was essential when integrating the acquired entities into existing business structures. The sharp increase in data volumes was hurting analytics capabilities and system throughput, while infrastructure costs were trending upward. To achieve productivity improvements, the company had to dramatically speed up the delivery of performance reports. Standardizing processes across the group without slowing decision-making was also a critical consideration.

Knorr-Bremse chose IBM DB2 with BLU Acceleration to address its need for faster analytics. The company started using BLU Acceleration without changing or expanding its existing IT infrastructure or system administration workflows. By implementing the new data compression features available in DB2, Knorr-Bremse cut the volume of already-compressed data by up to 64 percent across all its SAP Business Warehouse solutions, thereby reducing storage costs. By activating BLU Acceleration, the company reduced the size of its largest analytics databases by a further 10 percent. Certain reports that used to take 160 seconds to generate are now available in 9 seconds (roughly 17 times faster). Other key business benefits include insights delivered up to 55 times faster, enabling real-time response to changing markets, and 2.4 times quicker analytics in SAP Strategic Enterprise Management.

IBM DB2 has a vibrant community of SAP users spanning different industries, businesses of all sizes, and government agencies. With IBM DB2 for SAP environments, IBM is in a unique position to offer enterprise clients proven, scalable, high-throughput, highly available database software that supports a broad range of platforms, operating environments and SAP workload types. With the release of DB2 11.1, IBM has added several enhancements to the BLU Acceleration technology, including the introduction of Massively Parallel Processing (MPP) scale for BLU.

The annual SAPPHIRE NOW conference will be held in Orlando, Florida from May 16-18, 2017. Stop by IBM booth #400 to see how we are helping clients like Knorr-Bremse solve business and technology challenges while transforming their companies. And don't forget to explore the IBM sessions at SAPPHIRE NOW.

Webinar: Why IBM DB2 Database Software Leaves the Competition in the Dust

We all know that IBM DB2 database software is an industry leader when it comes to performance, scale, and reliability. But how does it compare to the competition, specifically Oracle and SAP HANA?

IBM’s Chris Eaton joined IDUG’s DB2 Tech Talk to give an update on IBM DB2 and show how DB2 goes above and beyond our competitors to provide reliable functionality while saving businesses money.

During the presentation, Chris walked the audience through DB2's latest release, DB2 10.5 (the "Cancun Release"), and the four innovations that set BLU Acceleration apart from the competition: next-generation in-memory processing, the ability to analyze compressed data, CPU acceleration, and data skipping.
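
To make those four innovations a little more concrete, here is a minimal sketch of working with a column-organized (BLU) table, assuming a DB2 10.5 or later server and the ibm_db Python driver; the connection details, table and column names are placeholders, not anything from the webinar.

```python
import ibm_db

# Placeholder connection string; adjust for your own environment.
conn = ibm_db.connect("DATABASE=SAMPLE;HOSTNAME=localhost;PORT=50000;"
                      "PROTOCOL=TCPIP;UID=db2inst1;PWD=passw0rd;", "", "")

# ORGANIZE BY COLUMN stores the table in BLU's columnar format, which is
# what enables actionable compression and data skipping under the covers.
ibm_db.exec_immediate(conn, """
    CREATE TABLE sales_fact (
        sale_date DATE,
        store_id  INTEGER,
        amount    DECIMAL(12,2)
    ) ORGANIZE BY COLUMN
""")

# A typical analytic query: BLU evaluates predicates on still-compressed
# data and skips extents that cannot contain qualifying rows.
stmt = ibm_db.exec_immediate(conn, """
    SELECT store_id, SUM(amount) AS revenue
    FROM sales_fact
    WHERE sale_date >= '2014-01-01'
    GROUP BY store_id
""")
row = ibm_db.fetch_assoc(stmt)
while row:
    print(row)
    row = ibm_db.fetch_assoc(stmt)

ibm_db.close(conn)
```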

You can watch the entire presentation on the IDUG website, and you can review the tweets from the event on the IBM DB2 Storify page.

Still have additional questions? Feel free to leave them in the comment box below and we’ll get the answers to you shortly.

About Chris Eaton – Chris is a Worldwide Technical Sales Specialist for DB2 at IBM, primarily focused on planning and strategy for DB2 on Linux, UNIX and Windows. Chris has been working with DB2 on the LUW platforms for over 21 years. From customer support to development manager to Externals Architect to Product Manager for DB2, Chris has spent his career listening to customers and working to make DB2 a better product. Chris is also the author of The High Availability Guide for DB2; DB2 9 New Features; Break Free with DB2 9.7: A Tour of Cost-Slashing New Features; and Understanding Big Data: Analytics for Enterprise Class Hadoop and Streaming Data. Follow Chris on his blog.

When Your Database Can’t Cut It, Your Business Suffers

By Larry Heathcote
Program Director, IBM Data Management


Your database is critical to your business. Applications depend on it. Business users depend on it. And when your database is not working well, your business suffers.

IBM DB2 offers high performance support for both transactional processing and speed-of-thought analytics, providing the right foundation for today’s and tomorrow’s needs.

We’ve all heard the phrase “garbage in, garbage out,” and this is so true in today’s big data world. But it’s not just about good data; it’s also about the infrastructure that captures and delivers data to business applications and provides timely and actionable insights to those who need to understand, to make decisions, to act, to move the business forward.


It's one thing to pull together a sandbox to examine new sources of data and write sophisticated algorithms that draw out useful insights. But it's another matter to roll this out into production, where line-of-business users depend on good data, reliable applications and insightful analytics. This is truly where the rubber meets the road – the production environment…and your database had better be up to it.

Lenny Liebmann, InformationWeek Contributing Editor, and I recorded a webinar recently titled “Is Your Database Really Ready for Big Data.” And Lenny posted a blog talking about the role of DataOps in the modern data infrastructure. I’d like to extend this one more step and talk about the importance of your database in production. The best way I can do that is through some examples.


1: Speed of Deployment

ERP systems are vital to many companies for effective inventory management and efficient operations. It is important to keep these systems well tuned, efficient and highly available, and, when a change is needed, to make it quickly. Friedrich ran the SAP environment for a manufacturing company, and he was asked to improve the performance of the applications used for inventory management and supply chain operations. More specifically, he needed to replace the production database with one that improved application performance while keeping storage growth to a minimum. Knowing that time is money, his mission was to deploy the solution quickly, which he did: up and running in a production environment in 3 hours, with more than 80 percent data compression and a 50x performance improvement. The business impact: inventory levels were optimized, operating costs were reduced and the supply chain became far more efficient.


2: Performance

Rajesh's team needed to improve the performance of an online sales portal that let his company's reps run sales and ERP reports from their tablets and mobile phones out in the field. Queries were taking 4-5 minutes to execute, and this simply was not acceptable – by the way, impatience is a virtue for a sales rep. Rajesh found that the existing database was the bottleneck, so he replaced it. With less than 20 hours of work, it was up and running in production with a 96.5 percent reduction in query times – from 4-5 minutes down to roughly 10 seconds. Can you guess the impact this had? Yep, sales volumes increased significantly, Rajesh's team became heroes and the execs were happy. And, since reps were more productive, they were also more satisfied, and rep turnover was reduced.


3: Reliability, Availability and Scalability

In today's 24x7x365 world, transaction system downtime is simply not an option. An insurance company was struggling with the performance, availability, reliability and scalability needed to support its rapidly growing portfolio of insurance applications. Replacing its database not only increased application availability from 80 to 95 percent; the company also saw a dramatic improvement in data processing times, even after a 4x growth in the number of concurrent jobs, and cut its total cost of ownership by 50 percent. Customer satisfaction and stickiness improved as well.

These significant results happened because these clients upgraded their core database to IBM DB2. DB2 offers high performance support for both transactional processing and speed-of-thought analytics, providing the right foundation for today’s and tomorrow’s needs.

To learn more, watch our webinar.

Follow Larry on Twitter at @larryheathcote


Join Larry and Lenny for a Tweet Chat on June 26 at 11 ET. Join the conversation using #bigdatamgmt. For the questions and more details, see: http://bit.ly/Jun26TweetChat

Speed & Simplicity


Larry Heathcote, Program Director, IBM Data Management

I've been thinking a lot about the database and data warehousing markets lately. Like "shower moment" thinking – you know, when you're really passionate about something, you think about it in the shower. Well, for me, yesterday I had a "highway moment."

I was driving down one of the major roads in Raleigh, NC on my way to have dinner with my wife. And traffic started to get a little heavy, so everyone had to slow down, and slow down, and slow down some more. We didn’t stop, but we were moving painfully slow. We were all going somewhere, just not very fast. Frustration set in.

And at that moment, it hit me. I felt just like Joe. Joe is a data warehousing architect I met a few weeks back on a speaking trip (Joe is not his real name). Joe and his team have been working on a really important project for the past couple of months, but they've run into some performance challenges that may jeopardize their delivery date. Every day they were making progress, but very slow progress. And Joe was getting frustrated. Just like me sitting in slow traffic.

Joe's company manufactures and sells products through a number of brick-and-mortar stores as well as online. A big percentage of their sales are to repeat customers. Recently, a couple of their products had quality issues, which swamped their call centers for a few weeks. The merchandising managers now want to find out whether the quality issues in one product line are having a ripple effect on other product areas. So they asked Joe to give them the data and the analytics they need to gain this insight. And they wanted answers, like now!

Joe and his team recently migrated one of the company's core customer databases onto a larger server and bought more storage. Then they integrated call center logs from their support centers, but they found that they were not getting the query performance they had expected. They fixed a few problems and cut query times significantly, but when stress-testing the system they just could not get the response times that the merchandising managers wanted.

And that’s when it hit me for a second time…Joe’s challenges are not unique. Increasing data volumes, upgrading infrastructures, mixing in new data types and doing new types of analytics – there are a lot of companies going through this right now to satisfy increasing demands from line of business managers.

What Joe needs is speed and simplicity – a next-generation database for the big data era. One that can handle all data types, SQL and NoSQL, transactional and mixed analytics workloads, and take advantage of modern technologies like in-memory columnar processing and other performance-accelerating techniques. And one that is easy and fast to set up, deploy and manage; one that takes fewer resources to run, so Joe and his team can focus on additional innovative projects that support his business managers.

What Joe needs – is IBM DB2 with BLU Acceleration. View the infographic here to find out why.


Introducing the IBM BLU Acceleration Hub


John Park, Product Manager – DB2, BLU Acceleration for Cloud

Hemingway once wrote, "There is nothing to writing. All you do is sit down at the typewriter and bleed." He also wrote, "The best way to find out if you can trust someone is to trust them." Hence the quotes.

So when Susan Visser "pinged" me on IBM's venerable Sametime system and asked me to blog for the launch of ibmbluhub.com, my immediate response was, "I don't blog; I only know how to post pictures of my pets and kid to Facebook." She responded, "It's easy, trust me."

So here I am. And who am I? Well, my name is John Park. I am an IBMer, I am a DB2 Product Manager, and as of today, I am a blogger (?).

My IBM life has revolved around DB2 and the analytics space, starting off as a developer building the engn_sqm component (think snapshot, event monitor and governor) – so if your stuff is broken, it's probably my fault.

Then I moved into the honorable realm of product management, leading the charge on products such as Smart Analytics Systems, PureData for Operational Analytics and now BLU Acceleration for Cloud … which is why I guess I’m here.

On a personal note, I like to build stuff – specifically, I like to build cool stuff, and BLU Acceleration is freakin' cool. When I think about the significance of this technology, I think back to fixing Version 7 of DB2, building V8 and writing my last piece of code in V9.5. All along the way, the DB2 team was building features and products that helped our customers and our users use DB2.

Personally, I see BLU as a convergence point, the pinnacle of where all the years of engineering and thought leadership have finally come to "eureka." Let me guide you through my thinking…

Autonomic features such as Automatic Maintenance, Self-Tuning Memory Management and Automatic Workload Management were all incremental steps across DB2 version releases; each fixed a problem DB2 users had and made DB2 easier to use.

DB2's compression story started with row compression and index compression, then moved to adaptive row compression and now to actionable compression, and each step in the compression story brought a better value proposition to the DB2 user. (Note that the word compression appears six times!)
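
To ground those terms, here is a minimal sketch of the row-organized compression options named above, assuming a DB2 10.5 or later server and the ibm_db Python driver; the table and index names are made up. (Column-organized BLU tables get actionable compression automatically, with no DDL clause required.)

```python
import ibm_db

# Placeholder connection string; adjust for your own environment.
conn = ibm_db.connect("DATABASE=SAMPLE;HOSTNAME=localhost;PORT=50000;"
                      "PROTOCOL=TCPIP;UID=db2inst1;PWD=passw0rd;", "", "")

# Classic (static) row compression: one table-level dictionary.
ibm_db.exec_immediate(conn,
    "CREATE TABLE orders_static (id INT, note VARCHAR(200)) "
    "COMPRESS YES STATIC")

# Adaptive row compression: adds page-level dictionaries on top of the
# table-level one, so compression adapts as the data changes.
ibm_db.exec_immediate(conn,
    "CREATE TABLE orders_adaptive (id INT, note VARCHAR(200)) "
    "COMPRESS YES ADAPTIVE")

# Index compression.
ibm_db.exec_immediate(conn,
    "CREATE INDEX ix_orders ON orders_adaptive (id) COMPRESS YES")

ibm_db.close(conn)
```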

DB2's performance optimization journey went from database partitioning and MPP to table (range) partitioning, continuous ingest, multi-temperature storage and workload management, making DB2 a leader in performance across all workloads.

Usability in its simplest form, value-driven compression and unprecedented performance are the three key tenets behind the development of BLU. These features improved DB2 incrementally between versions, and as the product grew, our engineers' experience and creativity expanded. With BLU, we see these features, and the knowledge gained from developing them, transform into and support the simplest statement I have ever seen in enterprise software: "Create, Load and GO." Simply amazing.
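
For the curious, here is what "Create, Load and GO" can look like end to end – a minimal sketch, assuming DB2 10.5 with the DB2_WORKLOAD=ANALYTICS registry setting (which makes new tables column-organized by default) and the ibm_db Python driver; all names are illustrative.

```python
import ibm_db

# Assumes the instance was prepared for analytics beforehand, e.g.:
#   db2set DB2_WORKLOAD=ANALYTICS
# which makes new tables column-organized by default.
conn = ibm_db.connect("DATABASE=SAMPLE;HOSTNAME=localhost;PORT=50000;"
                      "PROTOCOL=TCPIP;UID=db2inst1;PWD=passw0rd;", "", "")

# CREATE: no indexes, no aggregates, no tuning knobs.
ibm_db.exec_immediate(conn,
    "CREATE TABLE clicks (ts TIMESTAMP, user_id INT, url VARCHAR(500))")

# LOAD: feed in data (a real deployment would bulk-load from files).
stmt = ibm_db.prepare(conn,
    "INSERT INTO clicks VALUES (CURRENT TIMESTAMP, ?, ?)")
ibm_db.execute(stmt, (42, "http://example.com/products"))

# GO: query immediately; no index or MQT design step in between.
res = ibm_db.exec_immediate(conn,
    "SELECT user_id, COUNT(*) FROM clicks GROUP BY user_id")
print(ibm_db.fetch_tuple(res))

ibm_db.close(conn)
```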

Welcome to the world, ibmbluhub.com. Readers, enjoy the ride – this is the first step in a new direction for data management and technology. And I haven't even talked about BLU Acceleration for Cloud yet…

Until then, here is a picture of my cat.

John Park

Is Your Database Ready for Big Data?


Chris Eaton, Technical Specialist, IBM

A lot of people think of Big Data as simply Hadoop. But it’s so much more than that.

IBM, for example, has an entire Big Data Platform that covers analytics on data at rest (as with Hadoop), in-database analytics, and analytics on data in motion with streaming analysis. One way to think about big data is as a "tool" to increase the IQ of your company. There is data you already have in house, and data flowing through your corporate systems (much of which you simply throw away today), that could be leveraged to make your company smarter: smarter about your system behavior (the availability of your services to your clients, for example), smarter about your clients' buying behavior, and even smarter about how much influence some of your clients have on others in their social circles (some of whom may not currently be your clients).

So where do databases fit in, today and into the future? Of course they are an integral part of the big data platform. Currently you store shiny gold nuggets of information in your relational databases. You have spent time transforming the data in your relational systems to make sure it has integrity and consistency. Most would agree that relational data has a high value per byte (compared to a Hadoop or streaming solution, where much of the data has a low value per byte but is very valuable in aggregate).

So what's on the horizon, and is your database ready? As more and more data is analyzed in real time, there is and will be a need to store this data in a richer format (to shine it up, so to speak). This means there will be an increasing need for databases to ingest vast amounts of data in real time and, on the other side, to perform analysis of larger and larger quantities of data in seconds, not minutes or hours.

You may have seen the new INGEST utility in DB2 10, which handles continuous data feeds into the database more efficiently. What about the query side? Well, certainly DB2 DPF can handle massive amounts of data today by leveraging the power of large-scale scale-out clustering (much as Hadoop divides and conquers big data problems with lots of parallelism). But what about those cases where the amount of data isn't so huge that I need a scale-out solution, yet the analysis needs to be done in real time? There are lots of niche and not-so-niche vendors out there looking at in-memory techniques for speeding up queries, but most of them require the data to be pulled out of the database system and stored in repositories that are much less flexible (sometimes needing to store everything in memory, which can be very inflexible).
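
The INGEST utility itself is a DB2 command-line tool (in its basic form, something like INGEST FROM FILE feed.del FORMAT DELIMITED INSERT INTO sales_events). As a rough sketch of the same continuous-feed idea from application code, assuming the ibm_db Python driver and a hypothetical sales_events table:

```python
import ibm_db

# Placeholder connection string; adjust for your own environment.
conn = ibm_db.connect("DATABASE=SAMPLE;HOSTNAME=localhost;PORT=50000;"
                      "PROTOCOL=TCPIP;UID=db2inst1;PWD=passw0rd;", "", "")

# A prepared statement keeps per-row overhead low while events stream in.
stmt = ibm_db.prepare(conn,
    "INSERT INTO sales_events (event_ts, store_id, amount) "
    "VALUES (CURRENT TIMESTAMP, ?, ?)")

def read_feed():
    # Stand-in for a real event source (message queue, log tailer, etc.).
    yield (101, 19.99)
    yield (102, 5.49)

for store_id, amount in read_feed():
    ibm_db.execute(stmt, (store_id, amount))  # each event lands immediately

ibm_db.close(conn)
```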

It seems to me that high-speed in-memory analytics, especially for line-of-business workloads, will take off over the next several years faster than we have seen in the past. Giving line-of-business users the power to analyze large quantities of data in real time, without forcing IT to build one-off repositories, is, I think, only going to grow. The flexibility of in-database processing merged with in-memory analytics will let IT leverage existing database assets and therefore build line-of-business solutions much faster.

So keep your eyes on this space, and make sure you're on a path to exploit your existing databases and thereby make your company smarter.

To find out more about managing big data, join IBM for a free event: http://ibm.co/BigDataEvent