Network with peers at the Informix Users Group Conference 2017


by Rajesh Govindan, Portfolio Marketing Manager – Informix

When we asked David Link from Nebraska what he liked most about the Informix Users Group Conference in 2016, he told us that networking was high on his list.

“I had an excellent time,” he said, “The content was great, the people were great, the social events were great…and it’s a great place to network and meet with other people.”

In other words, the IIUG Conference was great!

If you want to network with other professionals while you improve your Informix skills and learn about upcoming product enhancements, you’ll want to attend the 2017 event.

The International Informix Users Group conference will be held at the Marriott City Center, April 23 to April 27, 2017 in Raleigh, North Carolina, US. It will feature three full days of educational sessions for Informix DBAs, developers and managers, plus social events where you can meet with others who’ve encountered and resolved issues that you may be dealing with in your job.

The IIUG conference is the world’s largest gathering of Informix users, and attracted over 400 professionals last year from throughout the world. Enthusiastic participants from Bangalore to Omaha came, learned, and shared at an event geared specifically to their work and their information needs.

If you’d like to know more about this year’s IIUG event, visit our web site. Those who register before January 31st will save their company money with an Early Bird Special that offers $375 off the regular fee.

Also, if you attended last year’s event – like David Link – we’d love to know what you liked most about it. Please leave a comment to tell others about your experience.

 

Simply Powerful and Flexible – IBM Informix on Cloud


by Pradeep Muthalpuredathe

To make sense of the avalanche of data generated today, you need to put that data to work; the more you do, the better the outcome. But the cost of storing and managing data on premises grows quickly, along with infrastructure and operational costs. Add the cost and challenge of scaling your infrastructure to match your needs, and pretty soon it all starts to feel overwhelming.

To address these challenges, IBM recently launched Informix on Cloud, a hosted cloud service on IBM Bluemix running on IBM SoftLayer infrastructure. Available in configurations (sizes) to suit every business need, IBM Informix on Cloud builds on the tried and tested capabilities of the Informix Advanced Enterprise server. Besides the OLTP and OLAP capabilities of Informix, you can take advantage of industry-leading technologies like TimeSeries, Spatial and NoSQL, along with high performance, all delivered at a very low total cost of ownership.

You can have your current on-premises applications connect to Informix on Cloud and continue your current operations while gaining the benefits of the cloud. You can also build analytics applications using other Bluemix services that integrate seamlessly with Informix on Cloud, delivering data visualization, executive dashboards, predictive analytics and streaming analytics capabilities, all on the same platform. The following synergy areas describe this in more detail:

  • Predictive Analytics: Predictive analytics solutions for industry use cases including manufacturing, transportation, insurance and weather. Use historical data to model patterns and predict behavior and potential failures or outages, so you can plan maintenance, eliminate unplanned downtime, and avoid accidents and disasters.
  • Watson Analytics: Data visualization, dashboards, infographics and automated predictive analytics. Gain and share insights from the data in your Informix repositories, trigger actions based on events, and find answers to business questions.
  • Spark Analytics: Complex, lightning-fast in-memory analytics on hybrid data stored in Informix, with machine learning capabilities. Spark provides big-data analytics for deeper analysis of data in Informix on Cloud and real-time insights on data streaming into Informix.
  • DataWorks: Move data from multiple sources to the Informix on Cloud service, then cleanse and visualize it to improve data quality and deliver accurate insights. A proven way to move data from hybrid sources to Informix on Cloud.
  • Streaming Analytics: Real-time processing of massive amounts of unstructured and streaming time-series data into the Informix on Cloud service for operational data stores. Combines real-time analysis of data in motion with TimeSeries analytics in Informix on Cloud.

One of the most frequent concerns about hosting data in the cloud is security and privacy. With Informix on Cloud, your data is encrypted with industry-standard algorithms, both at rest in the database and in flight between the database and client applications.
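As a concrete illustration, a client typically requests in-flight encryption simply by adding an SSL keyword to its connection string. The sketch below is a minimal, hypothetical Python example: the hostname, port and credentials are placeholders, and the DSN keyword syntax follows the common ibm_db driver format rather than any specific Informix on Cloud plan.

```python
# Minimal sketch: assembling an encrypted (SSL) connection string for an
# Informix on Cloud database. All values here are placeholders.

def build_dsn(database, hostname, port, uid, pwd, use_ssl=True):
    """Assemble a DRDA-style DSN; SECURITY=SSL requests in-flight encryption."""
    parts = [
        f"DATABASE={database}",
        f"HOSTNAME={hostname}",
        f"PORT={port}",
        "PROTOCOL=TCPIP",
        f"UID={uid}",
        f"PWD={pwd}",
    ]
    if use_ssl:
        parts.append("SECURITY=SSL")
    return ";".join(parts) + ";"

dsn = build_dsn("mydb", "example.services.dal.bluemix.net", 50001, "user", "secret")
# With the ibm_db package installed, you would then connect with:
#   import ibm_db
#   conn = ibm_db.connect(dsn, "", "")
print(dsn)
```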

Informix is uniquely positioned as the database of choice for Internet of Things (IoT) solutions, with its native support for time series, spatial and NoSQL data types. Its zero administration requirements, auto configuration and autonomic capabilities make it deeply embeddable in edge-analytics solutions on IoT gateway devices. With Informix on Cloud, you can now build a complete end-to-end IoT solution that combines advanced analytics in the cloud with edge analytics, and take advantage of Informix’s industry-leading replication capabilities and its in-memory query acceleration with Informix Warehouse Accelerator (IWA).
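To make the idea of edge analytics concrete, here is a small illustrative sketch, in plain Python with invented sample data, of the kind of rollup (raw sensor readings aggregated into hourly averages) that a gateway device might perform before shipping results upstream; Informix TimeSeries performs this kind of aggregation inside the database.

```python
# Illustrative only: aggregate timestamped sensor readings into hourly
# averages, the sort of rollup Informix TimeSeries does server-side.
from collections import defaultdict
from datetime import datetime

readings = [
    (datetime(2017, 1, 5, 8, 10), 21.0),
    (datetime(2017, 1, 5, 8, 40), 23.0),
    (datetime(2017, 1, 5, 9, 5), 20.0),
]

def hourly_averages(samples):
    """Group timestamped readings by hour and average each bucket."""
    buckets = defaultdict(list)
    for ts, value in samples:
        buckets[ts.replace(minute=0, second=0, microsecond=0)].append(value)
    return {hour: sum(vals) / len(vals) for hour, vals in sorted(buckets.items())}

print(hourly_averages(readings))  # 08:00 bucket -> 22.0, 09:00 bucket -> 20.0
```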

Whether you are a DBA, application developer, CTO or CIO, Informix on Cloud provides a simple, secure option with a very low total cost of ownership (TCO) that more than meets your data management and analytics requirements. For more information, check out the Informix on Cloud website.

The value of common database tools and linked processes for Db2, DevOps, and Cloud


by Michael Connor, Analytics Offering Management

Today we released DB2 V11 for Linux, UNIX and Windows. The release includes updates to Data Server Manager (DSM) V2.1, Data Server Driver connectivity V11 and the Advanced Recovery Feature (ARF) V11. As many of you may be aware, two years ago we embarked on a strategy to completely rethink our tooling. The market was telling us to focus on a simplified user experience, a web console addressing both the power and casual user roles, and deep database support for production applications. In March 2015, we delivered the first iteration of Data Server Manager as part of 10.5. This year we have again extended this valuable platform, adding support across a number of IBM data stores including DB2, dashDB, DB2 on Cloud and BigInsights.

First let’s talk about some of the drivers we hear related to Database Delivery.

  1. The LOB and LOB developer communities want access to mission critical data and extend that data through new customer facing OLTP applications.
  2. Business analysts are using more data than ever – in generating and enhancing customer value through Analytic applications.
  3. These new roles need on demand access to data across all aspects of the delivery lifecycle from idea inception to production delivery and support.
  4. While timelines are shortened, data is expanded and the lifecycle is sped up, quality cannot suffer.

Therefore, the DBA, development, testing and production support roles are now participating in activities known as Continuous Delivery, Continuous Testing and DevOps, with the goal of improving customer service and decreasing cycle and delivery times without decreasing quality.

Some areas addressed by our broader solutions for Continuous Delivery, Continuous Testing and DevOps include:

  • High-performance unload of production data and selective test data environment restore with DB2 Recovery Expert
  • Simplified test data management addressing discovery, subsetting, masking and refresh with Test Data Management
  • Automated driving of application test and performance-based workloads with Rational Functional and Performance Tester
  • Release management and deployment automation with Rational UrbanCode

And finally, areas improved with our latest DB2 releases:

  • SQL Development and execution with Data Server Manager
  • Test and Deployment Data Server Monitoring with Data Server Manager
  • SQL capture and analysis with Data Server Manager
  • Client and application Data Access, Workload and Failover management with Data Server Drivers

The benefits of adopting a Continuous Delivery solution include reduced cycle times, lower risk of failure, improved application performance and reduced risk of downtime.

With the V11 Releases we have delivered enhancements including:

  • DSM: DB2 LUW V11 support, monitoring improvements for pureScale applications, and extended query history analysis
  • ARF: DB2 LUW V11 support and improvements for analytics usage with BLU Acceleration
  • DS Driver (also DB2 Connect): manageability improvements, performance enhancements, and extended driver support now for iMAC applications

Many of the improvements noted above are also available for our private cloud offering in preview, dashDB Local, which leverages DSM as an integral component of its dashboard, and for our public cloud offering, DB2 on Cloud.

Read more details about the announcement for further information:   http://www-01.ibm.com/common/ssi/ShowDoc.wss?docURL=/common/ssi/rep_ca/9/872/ENUSAP16-0139/index.html&lang=en&request_locale=en

Also check out the DB2 LUW Landing Page:  http://www.ibm.com/analytics/us/en/technology/db2/db2-linux-unix-windows.html

 

Blogger: Michael Connor, with Analytics offering management, joined IBM in 2001 and focused early in his IBM career on launching the z/OS development tooling business centered on Rational Developer for z. Since moving to Analytics in 2013, Michael has led the team responsible for core database tooling.

Join Us For Our Next DB2 CrowdChat – DB2 Into The Future!


By Jason Burns
ITSO Social Business Manager for DB2

In the 20 years since DB2 was launched on the Linux, Unix and Windows platforms, the product has seen amazing transformations! It is used by database professionals around the world and continues to evolve and increase in popularity.

As we look forward to the next 20 years, our clients and partners will continue to play an important role in shaping the future of DB2. What is it you like about DB2? What features are helping you achieve your goals, from the reliability and safety that are core to database technology to the power of accessing any kind of data faster than ever before?

In this month’s CrowdChat, we’re assembling a who’s who of IBM DB2 professionals to talk about what you like about DB2 and how you envision its future. Panelists include Sam Lightstone, Kelly Schlamb, Ember Crooks, Amit Patel, Lynn Chou and Iqbal Goralwalla.

Questions for this month’s CrowdChat include:

• FP5 was just released and included new features. Could you tell us more about this?
• BLU Acceleration is available on new platforms (Windows and zLinux). How will this broadened platform support help you and your clients?
• What benefits will I see if I use BLU Shadow Tables?
• What is new with the pureScale feature?
• How easy is it for a DB2 DBA to manage a pureScale environment?
• Security is a top business concern! How does native encryption help improve security in business?
• Does native encryption vary for operational reporting vs transactional data?
• In the past, it seemed that special hardware was required to run pureScale. Is that still the case?
• What are some of the best use cases you have seen for DB2 with BLU Acceleration?

What is a CrowdChat?
A CrowdChat is a free community virtual event platform that works seamlessly with Facebook, Twitter, and LinkedIn to allow users to engage in an online conversation held at a pre-arranged time around a topic using a specific #hashtag. You can sign in using your Twitter, Facebook or LinkedIn account. It’s easy (and fun!) to participate in a CrowdChat.

Use your Twitter, LinkedIn or Facebook credentials to log into http://www.crowdchat.net. The chat can also be followed on Twitter using the hashtag #DB2. Anyone and everyone is welcome to join, especially the developer community!

Date: Thursday, January 22
Time: 1:00 PM ET
Link to Chat: http://www.CrowdChat.net/db2
Chat hashtag: #db2

DB2Night Show Recap – An In-Depth Look at DB2 10.5 “Cancun Release”

Fans of the DB2Night show were treated to a great program on Friday as IBM Champions Scott Hayes and Iqbal Goralwalla talked in-depth about DB2 10.5 “Cancun Release”.

Iqbal has been testing the release since it became available and shared a wealth of knowledge with the audience, including combining transactional workloads with analytics, BLU shadow tables, and recommendations on how registry variables need to be set to get the desired performance effects.

If you didn’t get the chance to join us Friday, you can catch a replay of the show on the DB2Night website by clicking here.  And you can also see a recap of our live tweets from the show on the IBM DB2 Storify.

Mastering the DB2 10.1 Certification Exam – Part 1: Planning

Norberto Gasparotto Filho is a database specialist with more than 10 years of experience with database administration and is the author of the DB2 10.1 fundamentals certification exam 610 prep, Part 1: Planning.

He was the winner of the first edition of “DB2’s Got Talent” contest in 2011. He also worked as programmer using a variety of technologies, and has certifications in both programming and database administration areas. In his blog (“Insights on DB2 LUW database admin, programming and more”), Norberto shares lessons learned in the day-to-day database administration work, tips and knowledge. During his spare time, Norberto likes to run, ride a bike and have fun with his kids and wife. Learn more in Norberto’s profile in the developerWorks community.

We caught up with him to ask about the tutorial and what he feels are the most important takeaways for DB2 professionals looking to become certified:

Q) What products are discussed in this tutorial?
DB2 10.1 for Linux, Unix and Windows and some related tools like Data Studio, with mentions of DB2 for the z/OS world as well.

Q) What problem does it solve?
If you don’t have an entry-level certification in DB2, this tutorial should really help you! The Part 1 (Planning) tutorial covers everything in the Planning set of questions – so read it carefully! Along with information about the different distributions of DB2 and their features, you’ll know specifics about Data Studio, CLPPlus and other tools, notions about Data Warehouse/OLTP workloads and much more. Of course, you have to study all the tutorials in the series to be completely prepared for the test.

Q) Why should DB2 professionals be interested in getting certification?
When you need to buy something, don’t you prefer a product that has an endorsement? Like shoes or other sports goods: those advertised by famous athletes usually catch your eye more easily than the others, don’t they?

The same happens to professionals and their certifications. When you have one or more certifications, companies that are looking for candidates will prefer YOU. With your DB2 certification, IBM will be endorsing you… How fancy is that? 🙂

Q) Do you have any special tips?

  • Don’t limit your studying to just the tutorials. Nothing can replace hands-on experience.
  • Pay attention to which version of DB2 you are studying for and make sure it aligns with the test you are taking. We are talking about v10.1, even though v10.5 has already been released. The exam questions were written before new features of DB2 10.5 were introduced (and even before new features of latest fix packs of v10.1 were released).
  • When taking the exam, don’t spend too much time analyzing or thinking about any one question. If you are unsure, choose an answer (never leave it blank) and mark the question for review; if you have time at the end, you can come back to it and think more – maybe after answering other questions you’ll find hints for answering it.
  • After you’ve completed and passed the exam – let the world know about your accomplishment! Spread the word via social media (your LinkedIn profile, Twitter, Facebook, and of course, add it to your CV). IBM has a website where you can make public your list of certifications. Just go through ibm.com/certify (Member Site / My Profile / Public Profile Information). The information is updated automatically, and you can add your picture and more info about yourself – like links to your LinkedIn/Twitter profiles and more. Take a look at mine: http://www-304.ibm.com/certify/members/profiles.nsf/home?OpenForm&profile=filhon&tab=transcript

To see the entire tutorial, visit the link below:

DB2 10.1 fundamentals certification exam 610 prep, Part 1: Planning
http://www.ibm.com/developerworks/data/tutorials/db2-cert6101/index.html

Webinar: Why IBM DB2 Database Software Leaves the Competition in the Dust

We all know that IBM DB2 database software is an industry leader when it comes to performance, scale, and reliability. But how does it compare to the competition, specifically Oracle and SAP HANA?

IBM’s Chris Eaton joined IDUG’s DB2 Tech Talk to give an update on IBM DB2 and show how DB2 goes above and beyond our competitors to provide reliable functionality while saving businesses money.

During the presentation, Chris walked the audience through DB2’s latest release, DB2 10.5 “Cancun Release” and the four innovations that make BLU Acceleration  different from our competitors: Next Generation In-Memory, the ability to analyze compressed data, CPU Acceleration, and data skipping.

You can watch the entire presentation on the IDUG website by clicking here, and also review the Tweets from the event by logging on to the IBM DB2 Storify page here.

Still have additional questions? Feel free to leave them in the comment box below and we’ll get the answers to you shortly.

About Chris Eaton: Chris is a Worldwide Technical Sales Specialist for DB2 at IBM, primarily focused on planning and strategy for DB2 on Linux, UNIX and Windows. Chris has been working with DB2 on the LUW platforms for over 21 years. From customer support to development manager to Externals Architect to Product Manager for DB2, Chris has spent his career listening to customers and working to make DB2 a better product. Chris is also the author of The High Availability Guide for DB2, DB2 9 New Features, Break Free with DB2 9.7: A Tour of Cost-Slashing New Features, and Understanding Big Data: Analytics for Enterprise Class Hadoop and Streaming Data. Follow Chris on his blog here.

Multiple aggregations and the SQL table function: A case study

The DB2 social media team caught up with Nattavut Sutyanyong and Kanishka Mandal, who are part of the DB2 development team, to talk about their article “Multiple aggregations and the SQL table function: A case study”. Here is what they had to say:

1) Why should someone read this article?

This article is useful to DBAs or SQL writers who design large DB2 queries that join several tables to create daily, weekly or  monthly reports for their business  analysis.  These kinds of queries are often used by data warehouse customers.

2) What problem does it solve?

This article helps in an environment where reports are generated with the same tables accessed multiple times.  So  it mainly focuses on data fetch redundancy and table data encapsulation.

3) What products are discussed in the paper?

The article focuses on the DB2 for LUW product, but it is also applicable to warehousing queries, and the idea behind the case study can be applied in any DBMS that supports SQL. The optimization and encapsulation techniques discussed in this paper are performed at the SQL language level: they are not specific to DB2 for LUW.

4) Do have any special tips?

The example shown in the article creates a common table that is used as a parent table; other tables are then joined to it, avoiding redundant fetches of the same data.
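The article’s examples are written in DB2 SQL; as a self-contained sketch of the same pattern, the snippet below uses a WITH (common table expression) clause in SQLite to compute a shared intermediate result once and run multiple aggregations against it, instead of re-scanning the base table for each one. The table, columns and data are invented for illustration.

```python
# Sketch of the common-table pattern: one scan feeds several aggregations.
# SQLite is used only to keep the demo self-contained; the idea is the same
# in DB2 or any SQL DBMS.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES ('east', 100), ('east', 50), ('west', 200);
""")

row = conn.execute("""
    WITH per_region AS (            -- base table scanned once, reused twice
        SELECT region, SUM(amount) AS total, COUNT(*) AS n
        FROM sales GROUP BY region
    )
    SELECT MAX(total), SUM(n) FROM per_region
""").fetchone()
print(row)  # (200.0, 3)
```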

Read the full article here.

About the authors:

Nattavut Sutyanyong

Nattavut has been working on SQL rewrite and optimization in DB2 LUW for 14 years. He has helped numerous IBM customers and partners tune their SQL and improve the performance of their systems.

Nattavut Sutyanyong
DB2 Development – Query Compiler
Current project: UDX solution in dynamite
IBM Toronto Lab

Kanishka has been working as a DB2 LUW worldwide advanced support analyst for 8 years. His current expertise is in high availability in DB2 and pureScale, and in enhancing and benchmarking customer environments.

Kanishka Mandal
DB2 LUW Advanced support
IBM ISL, Pune

Nothing Endures But Change – Face it With Confidence.

By Radha Gowda
Technical Marketing, IBM Analytics

When faced with change, do you share Dilbert’s frustration (take a look at this Dilbert comic and you’ll see what we mean)?  Wait… don’t be fecklessly hopeless yet!  We understand that keeping up with competition and customer expectations in this constantly changing global economy requires you to continuously enhance products and services.   While change can bring in a wealth of new business opportunities, we also realize implementing these changes may cause a lot of grief, including production delays and deployment day disasters.

To put this in perspective, according to a survey from the Ponemon Institute that is sponsored by Emerson Network Power, the average cost of data center downtime across industries is $7,908 per minute (Survey-Infographic).

From a data management perspective, we have a proposal for managing that change: IBM InfoSphere Optim Workload Replay. This tool lets you capture an actual production workload, including workload concurrency, the order of SQL execution, all the input variables, and the workload characteristics needed to later replay it. It even records how long each statement ran and the SQL codes that resulted. You can then replay the captured workload in your pre-production environment and record the outcome.

This comprehensive set of inputs and outputs for both the original and the replayed runs lets you compare and verify whether you are getting the same performance in your pre-production environment as you did in production. You can capture millions of SQL statements that run over a period of time in production and analyze how well they fare when replayed in pre-production.
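As a loose illustration of that comparison step (not Workload Replay’s actual interface), the sketch below takes per-statement timings captured in production and recorded again on replay, and flags the statements that slowed down beyond a tolerance. Statement names, timings and the threshold are all invented.

```python
# Hypothetical sketch of capture-vs-replay comparison: flag statements
# whose replay time regressed beyond a tolerance factor.

def flag_regressions(captured, replayed, tolerance=1.2):
    """Return statement names whose replay time exceeds capture time * tolerance."""
    return sorted(
        stmt for stmt, prod_ms in captured.items()
        if replayed.get(stmt, 0.0) > prod_ms * tolerance
    )

captured = {"Q1": 120.0, "Q2": 45.0, "Q3": 300.0}   # production timings (ms)
replayed = {"Q1": 118.0, "Q2": 90.0, "Q3": 310.0}   # pre-production timings (ms)
print(flag_regressions(captured, replayed))  # ['Q2']
```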

Some of the use cases where you may benefit from Optim Workload Replay are performance and stress testing, database upgrades/migration, on-going database maintenance, capacity planning, introducing new applications, platform consolidation, and periodic disaster recovery validation.

We invite you to check out the IBM InfoSphere Optim Workload Replay page and browse through the solution brief, white paper and more.

Change can be scary, but you now have a reason to smile.

I’m Your Big Data Problem: Integrate THIS!

by Rachel Bland
Senior Product Manager, IBM Business Analytics Growth Initiatives

Let’s take a look at my profile. Gen X-er. Young kid, busy job, work-life integrated. I will not answer your online surveys, I won’t even answer the phone unless I know who it is because if you really knew me you’d send a text. I don’t listen to voice mail – too disruptive.

I dare you to find me, understand me, and approach me. The only information you are going to get is spread haphazardly all over the web. My likes and dislikes are out there if you’re looking.

Where is it all?  Amazon has a slice, a big slice of very relevant purchasing info, so does Zappos. UPS and Fedex visit my house every day. Fitbit and MyFitnessPal are my most frequently updated apps. Yahoo has my personal email, Gmail is my dumping ground for spammers. Facebook is my personal life, LinkedIn and Twitter have my professional life. Comcast provides my phone and internet service but keeps calling me looking for someone else.

Sure, I’m difficult, but I’m still a pretty attractive, albeit elusive, consumer. I have a good income, a house that needs some work, a penchant for retail therapy, and a pretty decent appetite for convenience products and services. If you’ve got something I want, deliver free, and take returns by mail, there’s a good chance I’ll give you a chance. If only you could read my mind and figure out what I’ll buy next!

This is a small example of the tremendous amounts of customer data available to businesses. How you get that data and what you do with it is what will separate the haves from the have-nots. To truly know your customers and your market segment, you have to do the work.

The reality is that the possibilities to tap into new markets, identify innovative efficiencies, and simply run better and smoother are there; what’s also very real is the perception that the technical challenge is insurmountable. Well, not so much. We’ve learned a lot in the past few years, as the wealth of information from analysts and vendors demonstrates. The IBM Institute for Business Value has blueprints to help you identify the opportunities for value, and industry experts like Tony Curcio and Ray Wang from Constellation Research can provide you with best practices for the steps along the way.


Spend an hour with us on September 11th and hear what experts Ray Wang and Tony Curcio have to say:

Successful Big Data Projects require Big Data Integration, since most Hadoop initiatives involve collecting, moving, transforming, cleansing, integrating, exploring, and analyzing volumes of disparate data.

Register for this webinar to learn about:

  • The current state of the Big Data market
  • Customer success stories with Big Data and Big Data Integration
  • The 5 best practices when it comes to Big Data Integration