
Wednesday, August 7, 2013

IOQWT - A typical IBM acronym that works for DB2 BLU

IOQWT is a typical acronym, coined to cope with a lengthy product name: IBM InfoSphere Optim Query Workload Tuner for DB2 for Linux, UNIX, and Windows. It is part of the InfoSphere Optim suite of tools for data lifecycle management and helps to tune single queries or entire workloads. Licenses for the tool are included in the Advanced Editions and the Developer Edition of DB2, which is how I got my hands on it. The reason I wanted to try out IOQWT is that its so-called Workload Table Organization Advisor (WTOA...?) can predict whether it makes sense to convert tables from row organization to a column-organized layout, i.e., whether DB2 with BLU Acceleration would be a big benefit.

My journey began with starting up Data Studio with IOQWT integration. In the so-called Data Source Explorer I had to activate my database for tuning. After the license had been applied and additional internal-use tables had been created, I could start tuning, i.e., invoke the tuning advisors. The first step was to tell IOQWT which statements to analyze. It allows you to paste a single query directly as text; however, it only recommends table organizations for a workload, a set of queries. Since my test database uses a TPC-H schema, I chose to import a file with three TPC-H queries.
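
For a flavor of such a workload, here is one of the standard TPC-H queries (Q6, shown purely as an illustration; it is not necessarily one of the three I imported):

   -- TPC-H Q6: revenue change from eliminating certain discounts
   SELECT SUM(l_extendedprice * l_discount) AS revenue
   FROM lineitem
   WHERE l_shipdate >= DATE('1994-01-01')
     AND l_shipdate < DATE('1994-01-01') + 1 YEAR
     AND l_discount BETWEEN 0.05 AND 0.07
     AND l_quantity < 24;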

After saving the three queries to a workload, IOQWT was ready for action. As the next step I could select which types of advisors it should run and which kind of tuning suggestions I was interested in. After marking Table Organization as the kind of advice I wanted, I proceeded to the next step: waiting for results. :)

As you can see below, IOQWT analyzed six tables from my small TPC-H database. If I converted the tables to column organization (BLU Acceleration), it predicted a performance improvement of 98.79%. The biggest gain would be for the first query, which right now has the highest cost and would have the lowest cost after the conversion. So DB2 with BLU Acceleration seems to make sense for my workload, and by analyzing it in IOQWT I got that information without converting my database.
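
Should you follow such advice, new tables can be created column-organized directly, and existing ones can be converted with the db2convert utility that ships with DB2 10.5. A minimal sketch (the TPCH database and schema names are made up for illustration):

   -- create a table in column-organized layout right away
   CREATE TABLE test_blu (c1 INT, c2 VARCHAR(20)) ORGANIZE BY COLUMN;

   -- convert an existing row-organized table from the command line
   $ db2convert -d TPCH -z TPCH -t LINEITEM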

[Screenshot: Results from Optim Query Workload Tuner suggesting DB2 BLU]

That's it for today, see older posts on DB2 with BLU Acceleration.

Wednesday, May 15, 2013

Optim Query Capture and Replay for system tests

One of the frequent questions during bootcamps is what tools are available for performing tests, especially for driving workloads against a DB2 database. In an older article I pointed you to the Workload Multiuser Driver (WMD) and to the workload driver included with TPoX, both free, open source, and from IBM-initiated projects. Today, I want to point you to a tool that is a spin-off of InfoSphere Guardium, the so-called InfoSphere Optim Query Capture and Replay (OQCR).

What the tool does is simple to describe. It captures (or logs) the workload, i.e., the traffic to one database, and is then able to replay it against a different system by sending the queries it grabbed to that database. That way it is possible to test systems under realistic "stress" before putting them into production. Optim Query Capture and Replay is also able to manage workloads: statements or transactions can be removed from a workload or copied into others, users and schemas can be mapped, the speed at which workloads or queries are replayed can be changed, and much more.

The best way to learn more about Optim Query Capture and Replay is to either start with the OQCR Information Center or to glance over the documents available at the so-called information roadmap. The tool can also be used together with the Optim Test Data Management solution, which allows cloning production databases for tests, including applying data privacy rules (masking data).

Friday, November 16, 2012

Where to find information about granted privileges in DB2

Yesterday, I received a question about where to find specific metadata in DB2. The customer wanted to find out which privileges had been granted within a database, and they were aware that db2look can produce this list. But where does this information come from?

Let's start with a glimpse at db2look. It is the "DB2 statistics and DDL extraction tool" and can be used to produce the DDL statements for the objects inside a database. There are also options specific to handling authorizations, i.e., the privileges (see the -x, -xdep, -xd, and -xddep options). All the statements that db2look produces are based on special data stored in the database, the so-called metadata. The tables that hold the metadata are called the System Catalog in DB2 and the Data Dictionary in Oracle. The system catalog can then be queried using regular SELECT statements because the information is provided in tables (users can stay within the relational model).
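
As a small sketch, assuming a database named TPCH, the following invocation extracts the DDL including the GRANT statements for the granted privileges (-e produces the object DDL, -x adds the authorization DDL, -o writes it all to a file):

   $ db2look -d TPCH -e -x -o tpch_ddl.sql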

DB2 offers the metadata in two different sets of views. The views in the SYSCAT schema basically contain all the metadata. The views in the SYSSTAT schema contain a subset of the data and have (some) updatable columns, so that object statistics can be changed (for good or for worse...). Both sets are built on top of the internal catalog tables, which are managed in the SYSIBM schema. It is advisable to only use the SYSCAT and SYSSTAT views because the structure of the internal tables can change without warning, whereas the external catalog views are kept intact.
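
To illustrate the updatable statistics, here is a minimal sketch (the TPCH names are made up; fiddling with statistics like this is for experiments, not for production):

   -- look at the current table statistics
   SELECT CARD, NPAGES FROM SYSSTAT.TABLES
   WHERE TABSCHEMA = 'TPCH' AND TABNAME = 'LINEITEM';

   -- pretend the table holds 6 million rows and watch how access plans change
   UPDATE SYSSTAT.TABLES SET CARD = 6000000
   WHERE TABSCHEMA = 'TPCH' AND TABNAME = 'LINEITEM';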

Now, where can the information about granted privileges be found? The DB2 Information Center has a so-called "Road map to catalog views", a listing of all the metadata offered in the SYSCAT and SYSSTAT views. All the views whose names end in *AUTH carry authorization information, i.e., data about privileges. To give some examples: in SYSCAT.INDEXAUTH you can find out who holds the CONTROL privilege on an index, SYSCAT.ROLEAUTH is useful to see who has ADMIN authority for granting a specific role, and finally, as a more complex example, SYSCAT.TABAUTH holds all the table privileges like insert, update, delete, select, alter, control, and more.
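
A minimal sketch of such a query (the TPCH names are, again, just examples):

   -- who can do what on table TPCH.LINEITEM?
   SELECT GRANTEE, GRANTEETYPE, SELECTAUTH, INSERTAUTH,
          UPDATEAUTH, DELETEAUTH, CONTROLAUTH
   FROM SYSCAT.TABAUTH
   WHERE TABSCHEMA = 'TPCH' AND TABNAME = 'LINEITEM';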

Administration tools for DB2 access that information, procedures and scripts can select data from these views, and so, of course, can applications and users. Which of these views are made available to PUBLIC is up to the administrators. Remember, it's an honor, not a privilege...

Comments or questions?

Wednesday, March 28, 2012

Today is Document Freedom Day

Both for IBM business and privately I receive many documents. In most cases I can open and process (read, print, or edit) them, but a good chunk either does not display correctly or does not open at all. This used to happen on my Windows machine because of incompatible versions of Microsoft Office; these days it happens because of proprietary document formats that are not fully supported on my Linux machine. A lot of these problems could be avoided by making people more aware of the issues and pointing them to solutions that are portable and work across the different systems (Windows PC, Apple iOS, Linux, UNIX machines, iPad, Android devices, etc.).

So I was happy this morning when I heard about the Document Freedom Day that is celebrated today. Its purpose is to raise awareness about open standards. It is about making sure you can still access and open existing documents some years from now, and about exchanging information so that everyone can process it.

BTW: Under the InfoSphere Optim brand, IBM offers solutions for Application Retirement, so that in some years you can still access and find old, offline data.

Wednesday, November 17, 2010

One editor, many tools: The SQL and XQuery editor has been enhanced

Earlier this month a new article, "Creating scripts more efficiently in the SQL and XQuery editor", was published on developerWorks. Some of us are switching from the deprecated DB2 Control Center to the Eclipse-based tooling. It is good to know that the same SQL editor is part of Data Studio, InfoSphere Data Architect, and several Optim tools.
The article describes how many new editor features can be used to efficiently develop SQL or XQuery scripts. It is also a good introduction to how to use the statement editor. What is often done when editing XML-related queries, namely changing the statement terminator, is described as well.
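
Why change the terminator? The declarations in an XQuery prolog end with a semicolon, which collides with the default statement terminator. A minimal sketch against the DB2 SAMPLE database, assuming @ has been set as the terminator:

   -- the semicolon ends the namespace declaration, not the statement,
   -- so the statement itself is terminated by @
   XQUERY declare default element namespace "http://posample.org";
     for $c in db2-fn:xmlcolumn('CUSTOMER.INFO')/customerinfo
     return $c/name@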

Tuesday, July 14, 2009

Job vs. Life: The importance of tools and ROI

When you have a new home or a new garden - or both - one of your spare time jobs is to install things, assemble furniture, remove weeds, plant and water, etc. Usually you have the choice of really performing a task manually, e.g., using a regular screwdriver, or trying to simplify and speed up what you want to accomplish, e.g., using a power screwdriver. For most tasks you have the choice of using tools that are mostly already present or investing in new power tools or similar gear. The questions are always: how much time and effort can I save, how often will I use the power tool again (and sometimes even: if I buy the tool, will my significant other actually do the job instead of me)?

Often, the decision is to buy a tool and to be happy about the simplifications and the saved time. The actual problem is to select the right brand and version of the tool.

When you are running database applications the situation is similar. There are many tasks that can be performed by a skilled person without tools, or with just whatever the core software (OS, DBMS, JDK, etc.) offers. However, an additional tool can often help to save time and improve the quality of what needs to be done. The same questions as above for the house and garden need to be answered, and thereafter it can become even more confusing when shopping for the right tool [did you happen to select a power screwdriver that was renamed/rebranded a couple of times?]. IBM offers a wealth of tools, sometimes even two for one problem, ranging from application development through data life cycle management and data encryption to performance monitoring.

Going back to house and garden: I usually buy tools from the same brand/company because they have the same "look and feel"; once I have dealt with one tool, it's quicker to learn how to handle the next. BTW: That's the same principle behind the Optim suite of tools for Integrated Data Management.