Learn about DB2 at the Kolkata, India DB2 user event

There is nothing more exciting than hearing how to revolutionize your business with DB2 for Linux, UNIX and Windows, so here is your chance to unlock the best practices and learn from the experts. I encourage you to weave this event into your busy schedule this week. I promise you won’t be disappointed!
Join technical experts from TCS, Capgemini, and IBM to learn how to maximize your IT opportunities using the keys to mastering DB2 LUW locking.
During this half-day event, you will learn how to make the right decisions for your current and future architecture.

There is NO REGISTRATION FEE to attend this non-IBM event, and LUNCH on the event day will be SPONSORED by IBM.
When: 15th Nov 2014 (Saturday), 9:30 AM to 3:30 PM
Venue: Techno India Campus, Salt Lake, Sector V, Kolkata, India
Who can join: Anyone with an interest in DB2 or anyone working on DB2
How to book your seat: Send an email from your official email ID to kidug.india@gmail.com with the subject line “I will attend”

Leading Speakers from: Capgemini, MJunction, TCS, IBM

Mastering the DB2 10.1 Certification Exam – Part 2: Security

It’s hard to argue against the benefits of becoming a DB2 Certified Professional. Aside from gaining a better understanding of DB2, it helps keep you up to date with the latest versions of the product.  It also gives you professional credentials that you can put on your resume to show that you know what you say you know.

But many people are reluctant to put in the time and effort it takes to prepare for the exams. Some just don’t like taking tests, others don’t feel they have the time or money to prepare. That’s where we come in – the DB2 team has put together a great list of resources to help you conquer the certification exams.

We caught up with Anas Mosaad and Mohamed El-Bishbeashy, who are part of the DB2 team that developed the DB2 10.1 Fundamentals Certification Exam 610 Prep – a six-part tutorial series aimed at helping DBAs prepare for the certification exam.

What products are focused on in this tutorial?

In this tutorial we’ve focused completely on DB2 10.1 LUW.

Tell us a little about what students can hope to learn in this tutorial.

It is the second in a series of six tutorials designed to help you prepare for the DB2 Fundamentals Exam (610). It puts in your hands all the details needed to successfully pass the security-related questions in the exam. It introduces the concepts of authentication, authorization, privileges, and roles as they relate to DB2 10.1. It also introduces granular access control and trusted contexts.

Why should a DBA be interested in this certification?

IBM professional certifications are recognized worldwide, so you will get recognized! In addition, this one is the first milestone in the advanced DB2 certification paths (development, DBA, and advanced DBA). It acknowledges that you are knowledgeable about the fundamental concepts of DB2 10.1. It shows that you have an in-depth knowledge of the basic to intermediate tasks required in day-to-day administration, know basic SQL (Structured Query Language), understand which additional products are available with DB2 10.1, understand how to create databases and database objects, and have a basic knowledge of database security and transaction isolation.

Do you have any special tips?

Absolutely, here are a few of our favorite tips for preparing for the certification exam:

  • Practice with DB2
  • If you don’t have access to DB2, download the fully functional DB2 Express-C for free
  • Read the whole tutorial before taking the exam
  • Be a friend of DB2 Knowledge Center (formerly infocenter)
  • When in doubt, don’t hesitate: post and collaborate in the forums.

For more information:

DB2 10.1 fundamentals certification exam 610 prep, Part 2: DB2 security

The entire series of tutorials for Exam 610: DB2 Fundamentals includes the following:
Part 1: Planning
Part 2: DB2 security
Part 3: Working with databases and database objects
Part 4: Working with DB2 Data using SQL
Part 5: Working with tables, views, and indexes
Part 6: Data concurrency

About the authors:
Anas Mosaad, a DB2 solutions migration consultant with IBM Egypt, has more than eight years of experience in the software development industry. He is a member of IBM’s Information Management Technology Ecosystem Team focusing on enabling and porting customer, business partner, and ISV solutions to the IBM Information Management portfolio, which includes DB2, Netezza, and BigInsights. Anas’ expertise includes portal and J2EE, database design, tuning, and database application development.

Mohamed El-Bishbeashy is an IM specialist for IBM Cairo Technology Development Center (C-TDC), Software Group. He has 12+ years of experience in the software development industry (8 of those are with IBM). His technical experience includes application and product development, DB2 administration, and persistence layer design and development. Mohamed is an IBM Certified Advanced DBA and IBM Certified Application Developer. He also has experience in other IM areas including PureData Systems for Analytics (Netezza), BigInsights, and InfoSphere Information Server.

Balluff loves BLU Acceleration too

By Cassandra Desens
IBM Software Group, Information Management  

BLU Acceleration is a pretty darn exciting advancement in database technology. As a marketing professional, I can tell you why it’s cool:

  • BLU provides instant insight from real-time operational data
  • BLU provides breakthrough performance without the constraints of other in-memory solutions
  • BLU provides simplicity with a load-and-go setup

Etcetera, etcetera… you get the point.

You can read our brochures and watch our videos to hear how DB2 with BLU Acceleration will transform your business. We think it’s the best thing since sliced bread because we invented it. But is it all it’s cracked up to be? The answer is YES.

Clients all over the world are sharing how BLU Acceleration made a huge, positive difference to their business. Hearing customer stories puts our product claims into perspective. Success stories give us the ultimate answer to the elusive question “How does this relate to me and my business?” That’s why I want to share with you one of our most recent stories: Balluff.

Balluff is a worldwide company with headquarters in Germany. They have over 50 years of sensor experience and are considered a world leader and one of the most efficient manufacturers of sensor technology.  Balluff relies on SAP solutions to manage their business, including SAP Business Warehouse for their data analysis and reporting.

Over the last few years Balluff experienced significant growth, which resulted in slowed data delivery. As Bernhard Herzog, Team Manager Information Technology SAP at Balluff, put it: “Without timely, accurate information we risked making poor investment decisions, and were unable to deliver the best possible service to our customers.”

The company sought a solution that would transform the speed and reliability of their information management system. They chose DB2 with BLU Acceleration to accelerate access to their enormous amount of data. With BLU Acceleration Balluff achieved:

  • Reduced reporting time for individual reports by up to 98%
  • Reduced backup data volumes by 30%
  • Improved batch-mode data processing by 25%
  • A swift transition with no customization needed; Balluff transferred 1.5 terabytes of data within 17 hours with no downtime

These improvements have a direct impact on their business. As Bernhard Herzog put it, “Today, sales staff have immediate access to real-time information about customer turnover and other important indicators. With faster access to key business data, sales managers at Balluff can gain a better overview, sales reps can improve customer service and the company can increase sales”.

Impressive, right? While you could argue it’s no sliced bread, it certainly is a technology that is revolutionizing reporting and analytics, and it is worth a try. Click here for more information about DB2 with BLU Acceleration and to take it for a test drive.

_________________________________________________________________

For the full success story, click here to read the Balluff IBM Case Study
You can also click here to read Balluff’s success as told by ComputerWoche (Computer World Germany). Open in Google Chrome for a translation option.

Exclusive Opportunity to Influence IBM Product Usability: Looking for Participants for Usability Test Sessions – Data Warehousing and Analytics

By Arno C. Huang, CPE
Designer, IBM Information Management Design

The IBM Design Team is seeking people with a variety of database, data warehousing and analytics backgrounds to participate in usability test sessions. We are currently looking for people who work in one of the following roles: DBA, Architect, Data Scientist, Business Analyst or Developer. As a test participant, you will provide your feedback about current or future designs we are considering, thus making an impact on the design of an IBM product and letting us know what is important to you.

Participating in a study typically consists of a web conference or an on-site meeting scheduled around your availability. IBM will provide you with an honorarium for your participation. There are several upcoming sessions, so we’ll help you find one that best suits your schedule. If you are interested, please contact Arno C. Huang at achuang@us.ibm.com.

Troubles Are Out of Reach With Instant Insights

By Radha Gowda
Product Marketing Manager, DB2 and related offerings

Bet you have been hearing a lot about shadow tables in the DB2 “Cancun Release” lately.  Umm… do shadow and Cancun remind you of “On the Beach” by Cliff Richard and the Shadows?  Seriously, DB2 shadow tables can make you dance rock ’n’ roll on the beach, because you will be trouble free with real-time insights into your operations and, of course, lots of free time.

What is a shadow table?

Shadow tables have been around since the beginning of modern computing – primarily for improving performance.  So what does the DB2 shadow table offer? The best of both the OLTP and OLAP worlds!  You can now run your analytic reports directly in the OLTP environment with better performance.

Typically, organizations have separate OLTP and OLAP environments – either due to resource constraints or to ensure the best OLTP performance.  The front-end OLTP environment is characterized by very small but high-volume transactions, with indexes created to improve performance.  In contrast, the back-end OLAP environment has long-running complex transactions that are relatively few in number; indexes are created here too, but they may differ from the OLTP indexes.  Of course, an ETL operation must transfer data from the OLTP database to the OLAP data mart/warehouse at intervals that may vary from minutes to days.

DB2 can help you simplify your infrastructure and operations with shadow tables. A shadow table is a column-organized copy of a row-organized table within the OLTP environment, and it may include all or a subset of the source table’s columns.  Because the table is column organized, you get the enhanced performance that BLU Acceleration provides for analytic queries.
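To make this concrete, here is a minimal sketch of how a shadow table is declared; the table and column names are hypothetical, and in practice you would also declare a primary key matching the source table and set up the replication subscription separately:

```sql
-- Column-organized shadow of the row-organized SALES table.
-- DATA INITIALLY DEFERRED / REFRESH DEFERRED: the table is populated and
-- kept in sync by replication, not by the REFRESH TABLE statement.
CREATE TABLE sales_shadow AS
  (SELECT order_id, cust_id, amount, order_date FROM sales)
  DATA INITIALLY DEFERRED
  REFRESH DEFERRED
  ENABLE QUERY OPTIMIZATION
  MAINTAINED BY REPLICATION
  ORGANIZE BY COLUMN;

-- Take the shadow table out of set-integrity-pending state so that
-- replication can begin maintaining it.
SET INTEGRITY FOR sales_shadow ALL IMMEDIATE UNCHECKED;
```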

How do shadow tables work?


A shadow table is implemented as a materialized query table (MQT) that is maintained by replication. IBM InfoSphere Change Data Capture for DB2, available in the advanced editions, maintains shadow tables through automatic and incremental synchronization with their row-organized source tables.

While all applications access the row-organized table by default, the DB2 optimizer performs latency-based routing to determine whether a query should be routed to the shadow table or to the row-organized source.
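Latency-based routing is controlled per session through special registers: a query is eligible for routing to a shadow table only when the staleness you declare acceptable exceeds the current replication latency. A hedged sketch (the duration value is illustrative):

```sql
-- Allow the optimizer to consider replication-maintained MQTs
SET CURRENT MAINTAINED TABLE TYPES FOR OPTIMIZATION REPLICATION;

-- Accept up to 5 minutes of replication latency (timestamp-duration
-- format, so 500 = 5 minutes 0 seconds). Analytic queries in this
-- session may now be routed to shadow tables at most this stale.
SET CURRENT REFRESH AGE 500;

-- This query can now be transparently routed to SALES_SHADOW
SELECT cust_id, SUM(amount) FROM sales GROUP BY cust_id;
```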

A truly flexible and trouble-free OLTP world

Shadow tables offer the incredible speed you have come to expect from BLU Acceleration while the source tables remain row-organized to best suit OLTP operations.  In fact, with shadow tables, the performance of analytical queries can improve by 10x or more, with equal or greater transactional performance*.

With instant insight into “as it happens” data for all your questions, and all the free time you’ll have with no more indexing and tuning, what’s not to like? Try DB2 today.

* Based on internal IBM testing of sample transactional and analytic workloads by replacing 4 secondary analytical indexes in the transactional environment with BLU Shadow Tables. Performance improvement figures are cumulative of all queries in the workload. Individual results will vary depending on individual workloads, configurations and conditions.

Is Your Database a Hero or a Hindrance?

Kelly Schlamb
DB2 pureScale and PureData Systems Specialist, IBM

Here’s a big question for you – Is your database a hero or a hindrance? In other words, is your database environment one that’s helping your organization meet your performance, scalability, and availability needs or is it holding you back from meeting your SLAs and keeping up with ever changing business needs?

Join me for an InformationWeek webinar on this topic next week — Thursday, September 4th at 12pm EDT — where I’ll be talking about these types of challenges faced by IT organizations and how DB2 has the capabilities to address them.  News about some of these capabilities will be hot off the press, so you won’t want to miss it.

Click here to register


Steps toward the Future: How IBM DB2 is Changing the Game

Tori McClellan
Super Awesome Social Media Intern

 

Welcome to the New Age of database technology!

IBM DB2 with BLU Acceleration changes the game for in-memory computing.  Due to the importance of in-memory computing, we created a dedicated website to take you through all the details, references, and more: www.ibmbluhub.com!  This website is in place to help clients and prospects understand what next-gen in-memory computing can do for them and why IBM BLU is the ideal in-memory database to deliver fast answers.

A few examples of how IBM BLU has helped other clients find their ideal balance between speed and quality:

  1. Regulatory reporting is a huge challenge for all banks - Handelsbanken, one of the most profitable banks in the world, currently does reports monthly but is expected to do them daily in the near future. DB2 with BLU Acceleration has helped Handelsbanken analysts get the data they need for daily reports via the columnar store. Learn more by watching this video: http://bit.ly/1u7urAA
  2. Deploying DB2 with BLU Acceleration is simple - with only a handful of commands, you can turn on analytics mode, create a new database or auto-configure an existing one to make the best use of your hardware for analytics, and then load the data. Learn more from this IBM Redbook, which introduces the concepts of DB2 with BLU Acceleration from the ground up and describes the technologies that work hand-in-hand with BLU Acceleration: Architecting and Deploying IBM DB2 with BLU Acceleration in Your Analytical Environment.
  3. Get the FACTS and stay current by subscribing to the ibmbluhub.com newsletter.
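The “handful of commands” mentioned in point 2 can be sketched roughly as follows; the database, table, and file names here are hypothetical, and this is a simplified outline rather than a complete runbook:

```
# Put the instance into analytics mode before creating the database;
# DB2 then auto-configures memory and defaults for BLU analytics.
db2set DB2_WORKLOAD=ANALYTICS

# Create the database -- new tables now default to ORGANIZE BY COLUMN
db2 "CREATE DATABASE BLUDB"

# Create a table and load the data
db2 "CONNECT TO BLUDB"
db2 "CREATE TABLE sales (order_id INT, amount DECIMAL(12,2))"
db2 "LOAD FROM sales.csv OF DEL REPLACE INTO sales"
```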

- IBM DB2 with BLU Acceleration is a revolutionary technology and delivers breakthrough performance improvements for analytic queries by using dynamic in-memory columnar technologies.

- Unlike other vendor solutions, BLU Acceleration allows the unified computing of online transaction processing (OLTP) and analytics data inside a single database, thereby removing barriers and accelerating results for users. With an observed hundredfold improvement in query response time, BLU Acceleration provides a simple, fast, and easy-to-use solution for the needs of today’s organizations; quick access to business answers can be used to gain a competitive edge, lower costs, and more.

- Subscribe to the newsletter to continue learning about this hot in-memory database.  You will receive a periodic iNews email, which links to what’s new.  Just click and learn: http://www.ibmbluhub.com/blu-inews/


If this information suits your needs, be sure to follow @IBM_DB2 on Twitter to get the information as it is published.

How to Revolutionize Analytics with Next-Generation In-Memory Computing

By Les King
Director, Big Data, Analytics and Database Solutions – Information Management, Software Group

 

We are now in the era of cognitive analytics. These are analytic processes that provide useful information with a timeliness which qualifies as “speed of thought”. More and more clients are leveraging the next generation of analytic computing to address business challenges which could never be handled before.

To understand this idea, here’s a fun video that explains this theory a little better and gives a real business example of exactly this: What do chicken dinners have to do with IBM?

As another example, just recently a few friends and I were looking for a coffee shop which had both WiFi and a table which was near a working power outlet. We were surprised to discover that a coffee shop in the area was analyzing the information from our mobile devices and was able to let us know that they had what we were looking for. Coffee shops are all over the place, but, that real time analytics and communication with us was what made the difference. The coffee shop doing this real-time analytics ended up getting our business.

What do the two business examples above have in common? They both require analyzing large volumes of information and being able to take action on it very quickly. One of the key technologies that allows clients to accomplish this is in-memory computing. Hardware can handle an ever-increasing volume of memory and processing power, and there have also been amazing strides in the area of data compression. Vendors who provide the ability to analyze data in memory, while compressed, will have a huge advantage with these analytic workloads.

An example of this is IBM’s DB2 with BLU Acceleration, which provides an average of 10x (90%) compression. This means 1 TB of data can be stored in about 100 GB of space. DB2 with BLU Acceleration stores data in memory in its compressed form, using less memory to store vast amounts of business data. More importantly, DB2 with BLU Acceleration can analyze this data while it is compressed. This combination of capabilities positions DB2 with BLU Acceleration as a key technology in the era of big data and cognitive analytics.

When you consider the business examples above, you can see the competitive advantage these companies will have. These next generation analytic infrastructures, which leverage in-memory computing, will allow these companies to grow their business and take clients from their competitors.

To hear another example of how this modernization of a company’s analytic infrastructure is helping solve real-world business challenges, check out this upcoming webinar, “How to Revolutionize Analytics with Next-Generation In-Memory Computing”, taking place on Sept 25 at 12:00 EDT.


Tweetchat on Fraud Prevention in Banking

By Radha Gowda
Product Marketing Manager, DB2 and related offerings

On August 7, 2014, at 11 AM EDT, the IBM Data Management team is privileged to have Robert L. Palmer, James Kobielus, and Wilson Davis join us in a tweetchat to share their expertise on #FraudPrevention in banking.  Some topics on which we shall be soliciting your opinions are:

  • Q1: Are fraudulent activities in banking increasing or decreasing? Why?
  • Q2: What are some key business impacts of fraud?
  • Q3: What measures can be taken to identify potential fraudulent transactions?
  • Q4: What analytics do you need to detect fraud?
  • Q5: What data sources can contribute to the analytics?
  • Q6: How can your systems analyze transactions as they occur?
  • Q7: How can new technologies such as in-memory analytics help in fraud detection?
  • Q8: Where can I learn more?

Here’s what you need to do to join our conversation to contribute or just listen:

  • Go to twubs.com or tweetdeck.com
  • Sign in with your twitter handle
  • Search on #FraudPrevention
  • A new window will open that makes it easy for you to follow and contribute.

If you plan to contribute to our tweetchat, please review the tips at slideshare since the chat can be very fast paced. Suggested resources relevant to the topic include:

  1. How to Mitigate Fraud and Cyber Threats with Big Data and Analytics
  2. IBM data management for banking
  3. Best practices to deploy IBM Banking Data Warehouse model to #IBMBLU for production
  4. Attract and retain customers with always-on digital mobile banking services
  5. Fight against fraud in real-time and save on operating expenses
  6. Customize offers to your clients with the data already at your fingertips
  7. World’s top 5 most secure bank is becoming more strategic and more profitable
  8. Regulatory reporting headaches? See how @Handelsbanken solved their reporting challenges

More about our panelists:

Robert L. Palmer (@bigdatabusiness) Global Banking Industry Marketing, Big Data, IBM

Bob’s expertise is applying B2B software to optimize key business processes.  He is a subject matter expert in financial services, and writes about business challenges, Big Data, analytics, CRM, cognitive computing, and information management.

James Kobielus (@jameskobielus) Senior Program Director, Big Data Analytics, IBM

James is a popular speaker and thought leader in big data, Hadoop, enterprise data warehousing, advanced analytics, business intelligence, data management and next best action technologies.

Wilson Davis (@wilsondavisibm) Executive Technical Consultant – Counter Fraud iCoC, IBM

Wilson’s specialties include financial and operational data analytics, counter-fraud and anti-money laundering, straight-through-processing, and game changing improvements in business processes and application systems for the financial services industry.

The data advantage: Creating value in today’s digital world

The IBM Institute for Business Value is looking to understand how organizations around the globe are creating business value from analytics. If you can spare a few minutes to participate in the survey, you’d be among the first to receive a copy of the study when it is released in October 2014: 2014 Analytics Survey

Follow Radha on Twitter @rgowda

Data Analytics, or How Much Info for a Buck?

Bill Cole – Competitive Sales Specialist, Information Management, IBM

Leave only footprints; take only pictures.  Have you seen that slogan in a national park?  My wife (she’s now an ex) didn’t believe the signs that told us to leave everything exactly where it was.  She didn’t want to just enjoy the beauty.  She wanted to take some home with us.  The flashing light of the Park Ranger car told me we were in trouble for picking up a few rocks along the side of the road.  The nice man in the Smokey hat told me to put the rocks back.  The scenery is for consumption with your eyes, your camera, not for taking home.  I did as instructed, happy to be leaving with my wallet in one piece.

I’ve always produced data and then turned it into information by adding other bits of data together and adding some context.  My users guided me for a while and then I both guided and pushed them.  This seemed to be the natural order of things, sort of like factories and the folks who buy the goods from those factories.

The IT/BI/DA teams accumulate and store the data and then massage it to build what are essentially standard reports.  Standard reports are good for standard thinking, of course.  If you know the answer you’re looking for, a standard report probably has it in there somewhere, like those old balance sheets and ledgers that I ran so long ago.  But there was nothing in those reports that would help anyone think outside of the data in them.  In fact, there was so little insight in them that one of the plant managers actually asked me what good these reports were.  There’s really not a good response to that one.

Insights are gained when the lines of business can chase an idea through all sorts of non-standard iterations.  Almost like chasing one of those happy mistakes from science, like penicillin, or those ubiquitous not-very-sticky note sheets that we all stick all over everything so we can easily keep track of passwords, etc.  LOL, like you haven’t done that.

So how do we get to this idea-chasing sort of thing?  This place where the data analysts or, better still, the line-of-business user can see something interesting and start chasing it?  This is a custom-developed solution, a virtual pair of bespoke shoes made for your situation and only for your situation.  The person in the next cubicle needn’t look over your shoulder.  It would do them no good after all.  There’s a scene in the Maureen O’Hara/John Wayne movie “The Quiet Man” in which John asks directions and the local says “Do you see that road over there?  Don’t take it, it’ll do you no good.”  Insights are like that.  You need to know not to walk down a road that will do you no good.

The trick, it seems to me, is having the right tools.  Let’s start with the database (you know I’m a practicing DBA and that means all discussions start with the database).  DB2 BLU is exactly the right repository for your decision-making data.  After all, it offers both row- and column-oriented models in a single database!  This means you’re getting performance no matter which way your data chooses to be represented.  Moreover, there are different kinds of compression to ensure you save space and improve performance.  What could be better?  And all for the price of an upgrade!  Easy.  No-brainer.
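Since I mentioned that DB2 BLU offers both models in one database: the row/column choice is made per table with a single clause. A minimal illustration (table definitions are hypothetical):

```sql
-- Row-organized: suits high-volume OLTP point queries and updates
CREATE TABLE orders (
  order_id  INT NOT NULL PRIMARY KEY,
  cust_id   INT,
  amount    DECIMAL(12,2)
) ORGANIZE BY ROW;

-- Column-organized: BLU Acceleration kicks in for analytic scans
CREATE TABLE orders_history (
  order_id  INT,
  cust_id   INT,
  amount    DECIMAL(12,2)
) ORGANIZE BY COLUMN;
```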

There’s a neat coda to this, too.  You’re not confined to the old solution of finding a server, building it and installing the software, then building the database.  Let’s talk choices, folks.  Lots of choices.  Maybe every choice.  On premise, just like we’ve always done, works.  Maybe your own cloud would be better.  Build your BI/DA system in a PureFlex or PureApp or PureData cloud hosted in your own data center.  There’s a simple solution with lots of benefits including workload management.  Set it and forget it and go on about your business.  Maybe DBaaS works better.  Virtualize the workload and database in an existing private cloud to make use of those “excess” mips.  (Parkinson’s Law says that any organization grows to fill all the space available.  I think the demand for mips grows to fill the available servers, thus negating the concept of “Excess mips.”)  There’s SoftLayer for either a public or private cloud.  Remember, they’ll go all the way to bare metal if that’s what you need.  Finally, maybe best, is DB2 BLU available in the cloud. I championed this a while back and it’s now reality.  A pre-configured database that IBM manages and maintains, including backups and upgrades.  Talk about easy!  Go ahead, get some sleep.  We’ve got this one.

One last thought about the tools.  InfoSphere Analytics Server will do the analysis for you and present your users with suggested insights right out of the box.  And it will help the folks find their own insights by helping them look, filter and massage the data in any way that suits them.  It’s a cool tool for those times when you need the freedom to find your own way through the forest of data.

Finally, I’ve always kept two Robert Frost poems on my wall.  Perhaps “Two roads diverged in a yellow wood” is the one for this post.  We in IT need to give the folks in the lines of business the right tools to chase down the new roads, the new insights.  We’ll give them the GPS for the roads less traveled by.  Good luck on your journeys of exploration!

The other poem is “Stopping By Woods On a Snowy Evening,” of course.  We all have miles to go before we sleep, before our work is complete, and using the right tools makes those miles ever so much more productive.  Bundle up on those snowy evenings and enjoy the ride.

Follow Bill Cole on Twitter : @billcole_ibm

Visit the IBM BLU HUB to learn more about the next gen in-memory database technology!
