
Friday, December 16, 2016

New Mod Pack and Fix Pack for DB2 V11 available now

Over the past years I have blogged about new fix packs and features in DB2. This is the first time I have to mention "Mod Packs", because for DB2 V11 a new Mod Pack and Fix Pack have been released. DB2 now distinguishes between fixes to existing features (Fix Packs) and new features delivered in interim releases (Mod Packs). The terminology of version, release, modification, and fix pack is explained in this support document.

If you are one of the administrators or developers impacted by system and code freezes over the last weeks of the year, then the good news is that you can use the time to explore some great enhancements to DB2. Check out the summary page in the DB2 Knowledge Center for an overview. Here are my favorites:
  • DB2 now supports PKCS #11 keystores, i.e., Hardware Security Modules (HSMs) can be used with DB2, extending the choice to local keystores, centrally managed keystores, and hardware modules.
  • Lots of improvements to DB2 BLU. If you are on Linux OS on IBM z Systems ("z/Linux") then you will be happy about DB2 BLU performance improvements on z13. There are improvements for other processors, too.
  • The Workload Manager (WLM) and related monitoring have been enhanced, giving deeper insight and more control of long running complex queries. There are also new features related to CPU management.
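Setting up DB2 native encryption with an HSM follows the same database manager configuration pattern as the existing local keystore support. A minimal sketch (the path to the PKCS #11 configuration file is a made-up placeholder, not taken from the product documentation):

```
-- point DB2 native encryption at a PKCS #11 keystore
-- (the .cfg file describes the HSM library and token; values are illustrative)
UPDATE DBM CFG USING KEYSTORE_TYPE PKCS11
UPDATE DBM CFG USING KEYSTORE_LOCATION /home/db2inst1/hsm/pkcs11.cfg
```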


Sound interesting? The updated version of DB2 can be downloaded from the usual support page offering all the Fix Packs (and now Mod Packs) for the recent versions of DB2.

Thursday, June 16, 2016

Now available: DB2 Version 11 for Linux, UNIX, and Windows

The new version 11.1 of DB2 for Linux, UNIX, and Windows (DB2 LUW) is now available. Enjoy many product improvements for analytic and OLTP scenarios. Here is how to get started:

With that, let's get started and have a safe and successful journey with the newest DB2 version, DB2 11.1.

P.S.: Don't forget to try out IBM dashDB, the DB2-based cloud offering.

Wednesday, May 6, 2015

Using dashDB or DB2 with Apache OpenOffice (or LibreOffice)

Yesterday evening, I had some time on my own and didn't want to read. Why not surf the Internet? That's why there are services like YouTube or Bluemix where you can spend hours without too much thinking...  :) I ended up fiddling with dashDB on Bluemix and building reports with my local OpenOffice. Here are the details of my evening activity.

Run query against dashDB in Excel
When I played with dashDB, a fully managed data warehouse service, I came across the "Run Query in Excel" button. That button shows up in the web-based query interface where you can compose SQL queries and execute them against the dashDB database. I got curious because my Linux machine only has Apache OpenOffice installed. ODC (Office Data Connection) files are not supported (yet) by OpenOffice and LibreOffice, but the programs offer the component "Base" as database (front-end) similar to Microsoft Access. So why not try to hook up the cloud-based dashDB and my local OpenOffice? This can be done using a JDBC connection.
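To connect Base via JDBC, the standard IBM Data Server (JCC) driver settings apply; the host name below is a placeholder for the value shown in the dashDB connection section:

```
Driver class: com.ibm.db2.jcc.DB2Driver
JDBC URL:     jdbc:db2://<dashdb-host>:50000/BLUDB
```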

Friday, December 12, 2014

New fixpack for DB2 10.5 brings in-memory analytics to Windows and zLinux

The new DB2 10.5 Fixpack 5 became available today. A high-level overview of new features and enhancements can be found in the fixpack summary in the DB2 Knowledge Center. The list of all available DB2 fixpacks is available in the IBM Support Portal for DB2. There you will also find the links to download this new fixpack and a list of fixed bugs.

After this introduction I would like to point out two product enhancements that are included in this fixpack:

As you may know, "BLU Acceleration" is the technology codename for highly optimized in-memory analytics that is deeply integrated into the supported platforms. It is not just another column store: it optimizes the data flow from disk to the CPU registers to efficiently use the available processing power and memory resources. DB2 also exploits special CPU instruction sets, e.g., on the POWER platform, for faster data processing. With Fixpack 5 this technology is now available on Microsoft Windows and for Linux on zSeries.

Another feature enhancement is the new ability to specify which network interface cards (NICs) DB2 should use, if you have multiple. A new file nicbinding.cfg can be used to set up the bindings. If you had to deal with db2nodes.cfg before, then the syntax will look familiar.
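I won't reproduce the exact file format here, but as a purely illustrative sketch (hostnames and interface names are invented), the idea is db2nodes.cfg-like lines that bind a member on a host to a specific network interface:

```
# member  host     netname  (illustrative values only)
0         host01   eth1
1         host02   eth1
```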

That's all for my quick summary. Enjoy the weekend AND DB2.

Wednesday, December 3, 2014

Introduction and resources for migrating from Oracle to DB2

Well, a boring headline for an interesting topic. Originally I had planned to title today's blog entry "Forget about Black Friday and Cyber Monday - Save big by moving to DB2". But then writing this entry has dragged on for some days...
Database Conversion Workbench

In the past weeks I have been asked several times how to migrate off Oracle and on to DB2. Let me give you a quick overview of the technical side; for the financial part you have to ask an IBM Business Partner or an IBM seller. Although you can move to DB2 from database systems like Microsoft SQL Server, MySQL, and others, I will focus on Oracle because of the compatibility features built into DB2.

A move off Oracle could involve an SAP system (or another vendor application) or custom applications ("non-SAP"). For SAP environments, and for applications from several other software vendors, there is a (kind of) standardized process to migrate a system. The reason is that there are database-specific definitions and feature exploitations. A great example is how SAP makes use of the DB2-only BLU Acceleration to boost performance for analytic environments. Many software vendors provide tools for database migration and related processes or services.

For custom scenarios where the application code is available, a manual migration applies. The traditional barrier to a migration, the (more or less) incompatibility of the products, has been torn down by adding compatibility features to DB2. Some of those features are ready to use by any user, some require specific preparation of DB2 because they may impact the traditional handling/"look and feel". The DB2 Knowledge Center has a good introduction and overview of that topic: "DB2 Compatibility Features". If you are coming to DB2 with a background in Oracle, then use the Terminology Mapping to discover how products, features, and commands are named in the world of DB2.
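One example of such preparation is the DB2_COMPATIBILITY_VECTOR registry variable; it has to be set before the database is created. A sketch (the database name is a placeholder):

```
# enable the Oracle compatibility features, then create the database
db2set DB2_COMPATIBILITY_VECTOR=ORA
db2stop
db2start
db2 "CREATE DATABASE mydb PAGESIZE 32 K"
```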

From release to release there have been several enhancements to the SQL compatibility with database vendors such as Oracle. An overview by release can be found in the related "What's New" section of each of the recent DB2 releases:
I have to point out that the features mentioned in the linked documents are only related to the SQL language, but that there have been several other features dedicated to making a transition from Oracle to DB2 as smooth as possible. Some of them are covered in the section "Application development enhancements":

If you prefer a book over the DB2 Knowledge Center, then I recommend the IBM Redbook "Oracle to DB2 Conversion Guide: Compatibility Made Easy". It gives an overview of DB2, the tools needed for a migration in a non-SAP environment, and the conversion approach. In the appendix you will also find a nice terminology mapping, i.e., explaining how Oracle commands and features are named in the world of DB2.

A key tool for custom migrations is the Database Conversion Workbench (DCW). It is a plugin for IBM Data Studio, a free tool for database administration, design, and SQL development. The DCW allows you to analyze a database schema with respect to DB2 compatibility. The result is a valuable foundation for estimating the overall conversion effort. Once that is done (or not needed), the Database Conversion Workbench helps in the process of moving the database schema, database-side SQL packages, and thereafter the data from another database system to DB2. DCW also includes a package visualizer that helps to understand package dependencies, which simplifies the code conversion. See this page for an overview and more resources around the Database Conversion Workbench.

An important DB2 feature related to the compatibility is the ability to run PL/SQL code. Several administrative PL/SQL packages ship with DB2 which can be found in the "built-in modules" section. Moreover, there are also some PL/SQL packages that can be used "as is" and are available from IBM developerWorks in the database conversion community: See here for the add-on PL/SQL packages.
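As a quick, illustrative taste (not from the original post) of running PL/SQL in DB2, here is a minimal anonymous block using the built-in DBMS_OUTPUT module; it assumes an Oracle-compatible database and a statement terminator other than the semicolon (here: @):

```
SET SERVEROUTPUT ON@

BEGIN
  DBMS_OUTPUT.PUT_LINE('Hello from PL/SQL in DB2');
END@
```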
That's all for now with my overview of resources for the Oracle to DB2 migration. I hope that it provides a good introduction into that (sometimes confusing) topic.

BTW: I have been covering many migration-related topics in my blog. Serge Rielau and now Rick Swagerman have provided many SQL tips for DB2 in their blog.

Monday, November 17, 2014

A quick look at dashDB and a happy SQuirreL

dashDB slogan on its website
This morning I took some time to take a look at dashDB, a new IBM DWaaS (Data Warehouse as a Service) offering. When you go to the dashDB website, you are offered two choices: Use the dashDB service available on IBM Bluemix or use a Cloudant account to add a warehouse to your JSON database. Let me give you a brief overview of what you can do with dashDB and how I connected a local (open source) SQuirreL SQL client to my new dashDB database.


Cloudant Warehousing (dashDB)
dashDB is a cloud-based analytics database ("analytics in a dash") with roots in Netezza and DB2 with BLU Acceleration. Data is stored in table (rows and columns) format. It is ready to connect all kinds of analytic tools, local or cloud-based, and is already set up for geo-spatial data analysis (instructions on how to use the ESRI ArcGIS Desktop are provided). Best of all, your regular SQL database/analytic tools continue to work, see below for details.

dashDB: schema discovery
I started my journey by logging into my existing Cloudant account. There, on the dashboard menu, is a new item "Warehousing". When clicking the "New Warehouse" button, you can select the Cloudant databases that you want to import into the warehouse. Because multiple databases can be associated with a Cloudant account or a Bluemix Cloudant service, this step lets you pick the data of choice. After the source data is chosen, the dashDB database is created and so-called schema discovery turns the JSON documents into rows of tables. Thereafter, the data is ready to have analytics applied. That is the time to launch the dashDB control center, another so-called "dashboard".

The welcome screen shows some of the analytic options, e.g., the database is ready to be used with either Cognos, SPSS, InfoSphere DataStage, R scripts, or all of them and more:
Analytis for dashDB: Cognos, SPSS, DataStage, R
SQuirrel SQL client - dashDB connected
Because some time ago I already tested and blogged about a predecessor of dashDB (see here: how to set it up and how to use R), I was more interested in trying out a JDBC-based client with my new cloud-based data warehouse. Included as part of the dashboard are several sections that help you with the application setup. So it was easy for me to obtain the JDBC URL and configure it, along with the listed userid/password, in my local SQuirreL SQL client (it will work in IBM Data Studio and the Optim tools, too). As you can see from the screenshot, the database connection from my laptop to the cloud-based dashDB succeeded. Ready for some SQL.

My lessons learned from testing database queries on the converted data (JSON to relational) will be part of another blog entry. Stay tuned...

Tuesday, September 2, 2014

New DB2 Cancun Release (Version 10.5 Fixpack 4) offers many enhancements

Fixpack 4 for DB2 10.5 has been available since the end of last week. Because it has an unusually long list of product enhancements and new features, the fixpack even has the codename or nickname "Cancun Release". For those of you not too familiar with North American vacation culture, CancĂșn is a favorite vacation/tourist destination in Mexico, located at the Caribbean Sea. So "Cancun Release" may suggest relaxation, recreation, and a dream come true because of the ease of use, simplification, and major performance enhancements for the DB2 in-memory database feature (BLU Acceleration), the broadened pureScale support, and other nice-to-haves.

A good start for approaching the new DB2 Cancun release is the fixpack summary in the Knowledge Center. It lists new features by category, my personal highlights are:
  • For the in-memory database support (referred to as "column-organized tables" and known as "BLU Acceleration"), some bigger items include so-called shadow tables to improve analytic queries in an OLTP environment, the lifting of several DDL restrictions, and a major performance improvement from adding CHAR and VARCHAR columns to the synopsis table. An in-memory database can now also be made highly available with the HADR feature.
  • DB2 pureScale clusters can be deployed in virtualized environments (VMware ESXi, KVM) and on low-cost configurations without the RDMA requirement, and geographically dispersed clusters (two data centers) can be implemented on AIX, Red Hat, and SuSE with just RoCE as a requirement.
  • As part of the SQL compatibility DB2 now supports string length definitions by characters, not just by bytes as before.
  • Installation of DB2 in so-called thin server instances.
  • A SECADM can enforce encryption of backups.
  • db2audit can be used to transfer audit records to syslog for simpler analysis with, e.g., Splunk.
  • db2look has been improved to generate the CREATE DATABASE statement and export the configuration (see my earlier blog article on that db2look improvement in DB2 10.1).
  • Official support for POWER8.
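The character-based string lengths from the compatibility bullet above are expressed through so-called string units in the DDL; a sketch (table and column names are made up):

```
-- column length counted in Unicode characters (code points) instead of bytes
CREATE TABLE customers (
  name VARCHAR(100 CODEUNITS32)
)
```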
I plan to blog about some of the new functionality over the next weeks. Until then you can take a look at the new items yourself. Fixpacks can be downloaded from this IBM support website. If you have an IBM Bluemix account or plan to create one, you can use the improved DB2 as part of the Bluemix Analytics Warehouse service. Check out my earlier post about how to set it up and connect to it using a local DB2CLP.

Last but not least: What is your favorite vacation destination? Suggest new codenames as a comment and don't forget the new DB2 features you want to see...

Monday, September 1, 2014

What a plot: DB2, R, and Bluemix help with vacation weather

Last week I reported on how I set up an in-memory DB2 database on IBM Bluemix and loaded some historic weather data. Over the last couple of days I used some spare time to play with the cloud-based analytic capabilities that are provided as part of the Softlayer/Bluemix/DB2 combination. Most of the time went into learning (some basics of) R, an environment for statistical computing and graphics. As an example I wanted to find out what temperatures to expect for a possible September vacation on the German island of Norderney.

[Update 2014-11-04: The Analytics Warehouse service on Bluemix is now called dashDB]

For my small exercise I used data provided by the German Meteorological Service "Deutscher Wetterdienst". It allows you to freely download and use (under some conditions) data from several decades of weather observation. I uploaded the data to DB2/Bluemix as described in my previous post.
Bluemix: Change DB2 column name and type
While playing with the data I noticed that the column names required escaping of quotes and the observation dates were stored as integer values (yyyymmdd). In a second upload I simplified the column names and adapted the column data type using the DB2 load wizard (see picture). Thereafter I was set for my experiments with R.
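An alternative to re-uploading would have been to convert the yyyymmdd integers on the fly in SQL; a sketch, assuming the observation date sits in an integer column mdatum of my weather table:

```
-- turn an integer like 20140901 into a proper DATE value
SELECT DATE(TO_DATE(CHAR(mdatum), 'YYYYMMDD')) AS obs_date
FROM blu01023.klima
```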

The DB2 Cloud environment provides several examples for programming in R, a special function library "bluR" to easily connect R with DB2-based data, and it features RStudio to develop, test, and execute R code. Within RStudio it is possible to run several demos to learn more about analytics, graphing, and data processing. There is a demo for the DB2 in-memory database API for R as well. You can invoke it using the "demo(blur)" command:

DB2 API demo for R in RStudio
The demo shows how to connect to DB2, execute a query and use the fetched data for analytic processing in R. Last week I already tweeted about how I tested escaping of quote characters (use C style, not SQL style):



The data set which I uploaded to DB2 has daily minimum and maximum temperatures (and lots of other meteorological data) for about 70 years. I used a SQL query and then the ggplot2 library to create a graphic. It shows the band for the minimum temperatures for each September day as well as the band for the maximum daily temperatures.
DB2 with R: Historic September temperatures
The code for this graphic is pretty simple (and I started last week looking at R and DB2) and available from my Github account:
########### R script to analyze historic weather data for min/max values
## Written by Henrik Loeser
## Connection handle con to BLU for Cloud data warehouse is provided already
## For plotting, we are using ggplot2 package
##
library(ggplot2)
library(bluR)

## initialize DB2 connection and environment
con <- bluConnect("BLUDB","","")
bluAnalyticsInit(con)

## query DB2 weather data and fetch min/max values of min/max values
## (lower/upper boundary each)
query<-paste('select max(lufttemperatur_maximum) as maxmax,min(lufttemperatur_minimum) as minmin,min(lufttemperatur_maximum) as minmax,max(lufttemperatur_minimum) as maxmin,tag from (select lufttemperatur_maximum, lufttemperatur_minimum, day(mdatum) as tag from blu01023.klima where month(mdatum)=9) group by tag order by tag asc')
df <- bluQuery(query,as.is=F)

## Some plotting needs to be done
jpeg(type='cairo',"tempe.jpg",width=800,height=600)
ggplot(df, aes(x = TAG))+ylab("Temperature")+xlab("Day")+
   geom_ribbon(aes(ymin = MINMIN, ymax=MAXMIN), fill='blue')+
   geom_ribbon(aes(ymin = MAXMAX, ymax=MINMAX), fill='red')+
   geom_ribbon(aes(ymin = MAXMIN, ymax=MINMAX), fill='white')+
   geom_line(aes(y = MINMIN), colour = 'black') +
   geom_line(aes(y = MAXMIN), colour = 'black') +
   geom_line(aes(y = MINMAX), colour = 'black') +
   geom_line(aes(y = MAXMAX), colour = 'black')

sink('/dev/null')

bluClose(con)
## connection is closed, we are done


Pretty cool (my opinion)! I am already turning into a data scientist. And you can test this yourself on IBM Bluemix with the Analytics Warehouse service (DB2 in-memory database feature).



Monday, August 25, 2014

Setting up and using a DB2 in-memory database on IBM Bluemix

[Update 2014-11-04: The Analytics Warehouse service on Bluemix is now called dashDB.]
Last Friday I was on the way back from some customer visits. While traveling on a German high-speed train I used the Wifi service, connected to IBM Bluemix and created a DB2 in-memory database. Let me show you how I set it up, what you can do with it, and how I am connecting to the cloud-based database from my laptop.


Unbound DB2 service on Bluemix
The first thing to know is that on Bluemix the DB2 in-memory database service is called IBM Analytics Warehouse. To create a database, you select "Add service" and leave it unbound if you want, i.e., it is not directly associated with any Bluemix application. That is ok because at this time we are only interested in the database. Once the service is added and the database itself created, you can launch the administration console.

The console supports several administration and development tasks, as shown in the picture: loading data, developing analytic scripts in R, executing queries, linking the data with Microsoft Excel for processing in a spreadsheet, and a section to connect external tools or applications to the database.
Administration/development task in DB2 BLU console on Bluemix
One of the offered tasks is very interesting and I tweeted about it on Friday, too:



You can set up replication from a Cloudant JSON database to DB2, so that the data stream is directly fed in for in-memory analyses. I didn't test it so far, but plan to do so with one of my other Bluemix projects.

A task that I used is to (up)load data. For this I took some historic weather data (planning ahead for a vacation location), let the load wizard extract the metadata to create a suitable table, and ran some queries.

Uploading data to DB2 on Bluemix

Specify new DB2 table and column names

For executing (simple) selects there is a "Run Query" dialogue. It allows you to choose a table and columns and then generates a basic query skeleton. I looked into whether a specific German island had warm nights, i.e., a daily minimum temperature of over 20 degrees Celsius. Only 14 days out of several decades and thousands of data points qualified.
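The generated skeleton can then be edited into the actual question; my warm-nights query looked roughly like this (column and table names as in the weather data I loaded):

```
-- count days where the daily minimum stayed above 20 degrees Celsius
SELECT COUNT(*) AS warm_nights
FROM blu01023.klima
WHERE lufttemperatur_minimum > 20
```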

Last but not least, I connected my local DB2 installation and tools to the Bluemix/Softlayer-based instance. The "CATALOG TCPIP NODE" command is needed to make the remote server and communication port known. Then the database is added. If you already have a database with the same name cataloged on the local system, you will get an error message as shown below. You can work around it by specifying an alias: instead of calling the database BLUDB, I used BLUDB2. The final step was to connect to DB2 with BLU Acceleration in the cloud. And surprise, it uses a fixpack version that officially is not available for download yet...

DB:  => catalog tcpip node bluemix remote 50.97.xx.xxx server 50000
DB20000I  The CATALOG TCPIP NODE command completed successfully.
DB21056W  Directory changes may not be effective until the directory cache is
refreshed.
DB:  => catalog db bludb at node bluemix
SQL1005N  The database alias "bludb" already exists in either the local
database directory or system database directory.
DB:  => catalog db bludb as bludb2 at node bluemix
DB20000I  The CATALOG DATABASE command completed successfully.
DB21056W  Directory changes may not be effective until the directory cache is
refreshed.
DB:  => connect to bludb2 user blu01xxx
Enter current password for blu01xxx:

   Database Connection Information

 Database server        = DB2/LINUXX8664 10.5.4
 SQL authorization ID   = BLU01xxx
 Local database alias   = BLUDB2

I plan to develop a simple application using the DB2 in-memory database (BLU Acceleration / Analytics Warehouse) and then write about it. Until then, read more about IBM Bluemix in my other related blog entries.

Wednesday, August 13, 2014

Using some Workload Management for free in non-Advanced Editions of DB2

One of the new features of DB2 10.5 is BLU Acceleration. It introduces a couple of default Workload Management objects that are intended to control heavy queries running against column-organized tables. The objects are automatically created with every database, independent of the product edition. They are only enabled when DB2_WORKLOAD has been set to ANALYTICS before creating the database, i.e., a database for in-memory analytics is set up. But what is available for the regular guy like myself? What can be used for free and as a foundation for some monitoring and understanding the system workload? Let's take a look.


Typically I use a DB2 Developer Edition which includes all features, including WLM. So I removed the db2de license and organized (being an IBMer has some benefits!) a Workgroup Server Edition (db2wse) license which I added to the system using db2licm. I also turned on hard license enforcement, so that any attempt to use an unlicensed feature is directly blocked. Here is what db2licm returned thereafter:

mymachine> db2licm -l

Product name:                     "DB2 Workgroup Server Edition"
License type:                     "Authorized User Single Install"
Expiry date:                      "Permanent"
Product identifier:               "db2wse"
Version information:              "10.5"
Max amount of memory (GB):        "128"
Enforcement policy:               "Hard Stop"
Number of licensed authorized users: "25"

With that in place I created a new database named WLMTEST and connected to it. My first test was to create a workload object which should not be possible given my DB2 edition:

DB: WLMTEST => create workload freeride applname('xploit')
DB21034E  The command was processed as an SQL statement because it was not a
valid Command Line Processor command.  During SQL processing it returned:
SQL8029N  A valid license key was not found for the requested functionality.
Reference numbers: "".


Ok, this looks right. I don't have a license to use DB2 WLM (Workload Manager). My next query was intended to check which service classes are present in my system.

DB: WLMTEST => select varchar(serviceclassname,30), varchar(parentserviceclassname,30), enabled from syscat.serviceclasses

1                              2                              ENABLED
------------------------------ ------------------------------ -------
SYSDEFAULTSUBCLASS             SYSDEFAULTSYSTEMCLASS          Y     
SYSDEFAULTSUBCLASS             SYSDEFAULTMAINTENANCECLASS     Y     
SYSDEFAULTSUBCLASS             SYSDEFAULTUSERCLASS            Y     
SYSDEFAULTMANAGEDSUBCLASS      SYSDEFAULTUSERCLASS            Y     
SYSDEFAULTSYSTEMCLASS          -                              Y     
SYSDEFAULTMAINTENANCECLASS     -                              Y     
SYSDEFAULTUSERCLASS            -                              Y     

  7 record(s) selected.


The DB2 Knowledge Center has an overview of related default WLM objects and which parts can be modified with DBADM or WLMADM authority. Having the names of the system objects I tried my luck altering a work class set to reduce the cost barrier for the managed heavy queries (SYSMANAGEDQUERIES):

DB: WLMTEST => alter work class set sysdefaultuserwcs alter work class SYSMANAGEDQUERIES for timeroncost from 1000   
DB20000I  The SQL command completed successfully.


The threshold SYSDEFAULTCONCURRENT defines how many of those queries can run concurrently in the system. Why not change that threshold definition?

DB: WLMTEST => alter threshold SYSDEFAULTCONCURRENT when sqlrowsreturned > 20 stop execution
DB21034E  The command was processed as an SQL statement because it was not a
valid Command Line Processor command.  During SQL processing it returned:
SQL4721N  The threshold "SYSDEFAULTCONCURRENT" cannot be created or altered 
(reason code = "7").  SQLSTATE=5U037


Well, it seems that you cannot modify the entire threshold to your liking. However, following the documentation on what can be done, I successfully reduced the number of parallel activities.

DB: WLMTEST => alter threshold SYSDEFAULTCONCURRENT when CONCURRENTDBCOORDACTIVITIES > 3 stop execution
DB20000I  The SQL command completed successfully.


To test the impact of my changes, I opened 4 different shells, connected to DB2 in each window, and more or less simultaneously executed the following query:

select * from syscat.tables,syscat.columns

Note that I only tried to execute it in all four windows: it actually ran in just three of them. Why? Because the threshold kicked in for this heavy query and stopped the execution for the 4th session ("concurrentdbcoordactivities > 3 stop execution"). So some basic workload management seems to work even without a license.

Can I change the threshold to force the application off, i.e., to not allow running the query?

DB: WLMTEST => alter threshold SYSDEFAULTCONCURRENT when CONCURRENTDBCOORDACTIVITIES > 2 force application
DB21034E  The command was processed as an SQL statement because it was not a
valid Command Line Processor command.  During SQL processing it returned:
SQL4721N  The threshold "SYSDEFAULTCONCURRENT" cannot be created or altered 
(reason code = "13").  SQLSTATE=5U037

No, changing the entire definition of the threshold is not possible, but at least parts of it can be modified. You can then use the adapted default WLM objects to better understand what work is running on your system, e.g., testing what would fall into the category of "heavy queries". As a last step, I used a monitoring function to return the CPU time spent by service subclass. Most was in the managed subclass into which my queries from above were mapped:

DB: WLMTEST => SELECT varchar(service_superclass_name,30) as service_superclass, varchar(service_subclass_name,30) as service_subclass, sum(total_cpu_time) as total_cpu, sum(app_rqsts_completed_total) as total_rqsts FROM TABLE(MON_GET_SERVICE_SUBCLASS('','',-2)) AS t GROUP BY service_superclass_name, service_subclass_name ORDER BY total_cpu desc

SERVICE_SUPERCLASS             SERVICE_SUBCLASS               TOTAL_CPU            TOTAL_RQSTS        
------------------------------ ------------------------------ -------------------- --------------------
SYSDEFAULTUSERCLASS            SYSDEFAULTSUBCLASS                          1207794                  552
SYSDEFAULTUSERCLASS            SYSDEFAULTMANAGEDSUBCLASS                    852547                    0
SYSDEFAULTMAINTENANCECLASS     SYSDEFAULTSUBCLASS                           466436                 1374
SYSDEFAULTSYSTEMCLASS          SYSDEFAULTSUBCLASS                                0                    0

  4 record(s) selected.


With that I leave more testing to you. Happy monitoring!

BTW: The same tests can also be done on the SQL DB service on IBM Bluemix. That service is a DB2 Enterprise Server Edition.

Wednesday, June 18, 2014

Video: Introduction to IBM Bluemix User Interface

Currently, I am taking a look at IBM's new Platform-as-a-Service (PaaS) offering code-named "Bluemix". Right now it is in open beta and I signed up on its official website http://bluemix.net. Bluemix is based on the open source Cloud Foundry. But how do you get started and get a good introduction to what is offered?

I read the article "What is IBM Bluemix?" on IBM developerWorks. It gives some background information and details that Bluemix offers Development Frameworks (develop in, e.g., Java, Ruby, or Node.js), Application Services (DB2, MongoDB, MySQL, and others), and Clouds, i.e., you can deploy your applications (apps) to public, private, or other clouds.

Next, I watched the following video which gives a nice overview about the different elements of the Bluemix user interface, including the dashboard.




Equipped with that background information and knowing about the UI, the next stop is the Bluemix website offering articles and sample code. One of the examples is on how to build a simple business intelligence (BI) service using Ruby and based on DB2 with BLU Acceleration, code samples included.

If you haven't tried it yet, you can still sign up for the free beta here: http://bluemix.net. Enjoy!

Tuesday, June 17, 2014

DB2 Screenshot Quiz: Where is this taken from?

I am using different DB2-related services, such as the new Knowledge Center for DB2, BLU for Cloud (DB2 with BLU Acceleration in the Cloud), IBM Bluemix, and of course a local DB2 installation. Where did I find the following graphic? It is part of one of the above mentioned services...

Let me know by comment or direct email.

Wednesday, May 28, 2014

With some magic through the cloud(s)

A couple of years back I was on a trip to Spain. While taxiing to the runway in Frankfurt, our aircraft was diverted to a parking position to get a technical problem fixed. So I grabbed my book, Harry Potter and the Deathly Hallows, and started to read from where I had left off the day before. Horcrux by horcrux gets discovered and destroyed, and the final duel is about to start when I notice that everybody around me is leaving the aircraft. I had to reconfirm with a look outside, but to my surprise we had already arrived in Madrid. Without noticing I had been in and out of the clouds, with some "magic" taken from Germany to Spain.

A similar experience is possible today when using IT services. Usually you don't notice what is going on in the background. Web pages and their components, data and scripts, all could come from a "local" server or from "the cloud". In my recent post I showed how to sign up for DB2 with BLU Acceleration in the cloud and how to catalog a cloud-hosted database locally. Once the database is known locally, you can use the DB2 CLP, db2batch, and your favorite scripts to work with the database, even though it is located "somewhere".

But why would I use an analytics database in the cloud instead of locally? In his post "Cloud is the New Normal", Adam Ronthal answers this and also shares which other services were run locally before, like email or VoIP servers. If you are interested but have questions about data security in the cloud, Walid Rjaibi's series on "Data Security in the Cloud" (the link is to part 6, with links to the other parts provided on that page) is a good read.

That's enough for today. Looking up, I notice that most of my train ride is done. Time for the hotel and some sleep...

Wednesday, May 21, 2014

Clouds ahead: Playing with DB2 and Cognos (FREE)

By now everybody should have heard about IBM BLU Acceleration for Cloud. The tagline is "Data Warehousing and Analytics for Everyone" which caught my eye. So this morning I wanted to find out how easy it is to get started. My conclusion: Almost too easy for an IBM product... ;-)

First you have to visit the official BLU for Cloud website at http://bluforcloud.com/. There you click on the button "Try BLU Acceleration now" and you are taken to an overview of currently four different usage plans. The plan I chose is the free trial plan, which is hosted on SoftLayer, but there are also metered plans available on SoftLayer or Amazon Web Services (AWS), and a managed service on IBM Bluemix.

After signing up by providing my Google ID (or alternatively name and email address), I was provided with my new account information within seconds and the system stated that everything was ready to go:
BLU Cloud account created successfully
After clicking on the "Start BLU Acceleration" link as shown in the screenshot above, the web-based management console came up. It allows you to work with DB2 database objects, query and analyze the data, and run reports against the two sample databases. Of course you can upload your own data and try some of the analytic tools on it. My interest was Cognos, so I tried the drill-down reports:
Cognos drill down in BLU Cloud
In the graphical report based on the sample database you can click on regions, product categories, etc., and then continue to subcategories and subregions ("drill down"). Of course you could also try out Cognos or the Industry Models with your own data.

Next, I explored how well the Cloud offering integrates with my local tools. I opened a shell on my Linux-based ThinkPad and launched the DB2 command line processor. First I added the Cloud-based DB2 server to my local directory using the CATALOG TCPIP NODE command. Then I added the remote database using CATALOG DATABASE. Finally, I connected to the database, providing the username and, when prompted, the password. Yeah, connected! That was easy!

Catalog the Cloud-based DB2 server and database on local machine
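The steps above can be sketched with the DB2 command line processor. The host name, port, node name, database name, and user below are placeholders, not the actual BLU for Cloud values; substitute the details from your own account page:

```shell
# Make the remote DB2 server known in the local node directory
# (host and port are placeholders for your BLU for Cloud instance)
db2 "CATALOG TCPIP NODE blunode REMOTE bluhost.example.com SERVER 50000"

# Register the remote database under the newly cataloged node
db2 "CATALOG DATABASE bludb AT NODE blunode"

# End the CLP back-end process so the directory changes are picked up
db2 terminate

# Connect with the account's user name; DB2 prompts for the password
db2 "CONNECT TO bludb USER myuser"
```

Once cataloged this way, the cloud database behaves like any locally cataloged database for the CLP, db2batch, and scripts.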
From start to finish it took me about 5 minutes for the signup process, logging into the Cloud service, and adding the remote DB2 server to my local system. Have you tried it? It is free and fun. And I can tell my boss that I know my way around the Cloud.

Monday, May 5, 2014

New DB2 BLU video - Analytics on IBM POWER: the game is changing

Do you like playing "Connect Four" (4 gewinnt, Four in a Row, Fire på stribe, Connecta 4, ...)? Well, there is a new video talking about DB2 with BLU Acceleration, SAP BW, and DB2 on POWER. And it has an ending that is typical for IBM :) The customer wins. See for yourself...


Friday, April 11, 2014

DB2 Quiz: Find the website for this screenshot

Today's DB2 quiz is not that technical, but it requires that you are up-to-date on IBM's offerings for DB2. What is the context for this screenshot? On which website did I take it?


Probably easy to solve for you guys. Enjoy the weekend.

Monday, March 10, 2014

Free trainings on using DB2 with BLU Acceleration

Many of you know that IBM offers free training sessions on Information Management products, so-called bootcamps, to business partners. Two new classes have been added that cover DB2 with BLU Acceleration exclusively:


The new technology is also covered in some existing bootcamps. The regular "DB2 for Linux, UNIX, Windows Bootcamp" includes an overview. The former DB2 Performance Bootcamp is now called "Performance Monitoring and Tuning for DB2 10.5 with BLU Acceleration for Linux, UNIX, and Windows" and discusses best practices for monitoring and tuning BLU.

Business Partners can sign up using the registration links on the individual bootcamp pages. Free product certifications are usually offered at the end of a training session.

Friday, February 21, 2014

Today I did NOT work. I worked, had fun, and learned a lot

My plans for today are to have a lot of fun and to learn. And it is real work, really. So what is my schedule for today?

Part of the agenda is watching The DB2Night Show with the current DB2's GOT TALENT contest. Experienced DB2 users share their secrets, special tips and tricks, and lessons learned. So it is an hour well spent: it helps build additional skills and will make your boss happy. And it is fun.

Then I plan to download the poster for DB2 10.5. Once it is printed and put up in my department, it is easy to explain that the BLU Acceleration technology is deeply integrated into DB2.

Next on my list is to make sure that my membership in the new German DB2 User Group (DeDUG: Deutsche DB2 User Group) has been confirmed. Registration is free. More news about the DeDUG is here.

Last, I wanted to check that the backups are available, just in case. The weekend is coming up and I plan to have fun - without any work-related stuff. In the video below, Rodney sings about why we should care about backups and why they are not a waste of money and time. He was a bootcamp participant this week, and backups were a topic.

Have fun, learn, and enjoy the weekend after finishing off work.


Tuesday, February 18, 2014

DB2 10.5 Fix Pack 3 is available

As can be seen on the "Download Fix Pack by version" page for DB2, Fix Pack 3 for DB2 10.5 has arrived. It is an interesting fix pack in the sense that there are no documentation updates at all, something that has not happened for a long time with an early fix pack. Thus, it is really a fix pack, and it has an important TechNote attached for customers with column-organized tables.

May all your transactions commit...


Saturday, February 8, 2014

Family life and DB2 BLU

Imagine that you had to search for a cooking pot within your house. Where would you start and search first? Most people would focus on the kitchen. Where would you look for some toothpaste? Most probably in the bathroom and maybe in the room where you just put the bag from your shopping trip.

Using context information speeds up the search: you consider only a few places and avoid searching the entire house. This is data skipping in everyday life. DB2 with BLU Acceleration uses a synopsis table to provide that context information. By avoiding work, fewer resources are needed, less data needs to be processed, and you get the result much faster.
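You can actually see these synopsis tables in the catalog. A small sketch, assuming a DB2 10.5 database that already contains at least one column-organized table (DB2 creates and maintains the synopsis tables automatically, one per column-organized table, in the SYSIBM schema with names starting with SYN):

```shell
# List the system-maintained synopsis tables for column-organized tables;
# TABLEORG = 'C' marks column-organized tables in the catalog
db2 "SELECT tabname, tableorg FROM syscat.tables \
     WHERE tabschema = 'SYSIBM' AND tabname LIKE 'SYN%'"
```

Each synopsis table stores minimum and maximum values for ranges of rows of its user table, so a query predicate that falls outside a range lets DB2 skip that whole range without reading it.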

Now imagine that the cabinets are labeled and the kids have cleaned up their room, with clothes nicely folded and small items sorted into plastic containers. In DB2 BLU this would be called "scan-friendly". Some people use "space bags", plastic wraps that can be vacuum-sealed to reduce the storage size of clothes, pillows, etc. Because you can still see what is inside and handle it like everything else, it is "actionable compression" - the same as in DB2 BLU, which can operate on compressed data.

Now if only I could come up with an analogy for how DB2 BLU does the dishes - something I have to do now. Household chores. Enjoy the weekend!