Showing posts with label fun.

Tuesday, February 20, 2024

Spreadsheets: How to excel with Db2 data

Generated chart in Excel file
Recently, I had to produce a spreadsheet from database data. One naive way is to export the data to a CSV file, then import the data from that file into the spreadsheet. Another option is to quickly script a small Python program that fetches the data and directly generates a Microsoft Excel file. As a bonus, I even added a line chart (as shown).
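
In case you want to try something similar, here is a minimal sketch of that second approach, assuming the openpyxl package and the ibm_db driver; the connection string, schema, and SALES table are made up:

    import ibm_db_dbi                     # pip install ibm_db
    from openpyxl import Workbook
    from openpyxl.chart import LineChart, Reference

    # Hypothetical connection string and table - adjust to your environment
    conn = ibm_db_dbi.connect("DATABASE=bludb;HOSTNAME=myhost;PORT=32733;"
                              "SECURITY=SSL;UID=myuser;PWD=mypassword", "", "")
    cur = conn.cursor()
    cur.execute("SELECT sales_month, revenue FROM myschema.sales ORDER BY sales_month")

    wb = Workbook()
    ws = wb.active
    ws.append(["Month", "Revenue"])       # header row
    for row in cur.fetchall():
        ws.append(list(row))              # one spreadsheet row per result row

    # Line chart over the revenue column, months as category axis
    chart = LineChart()
    chart.title = "Revenue by month"
    data = Reference(ws, min_col=2, min_row=1, max_row=ws.max_row)
    cats = Reference(ws, min_col=1, min_row=2, max_row=ws.max_row)
    chart.add_data(data, titles_from_data=True)
    chart.set_categories(cats)
    ws.add_chart(chart, "D2")

    wb.save("db2_report.xlsx")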

Wednesday, November 22, 2023

Unicode string length, code points, and Db2

Byte length of (Unicode) strings
After my recent blog post "🎃 Unicode characters and Db2 🕸️ 🏚️", I had some follow-up discussions. One was around how to determine the Unicode UTF-8 byte length of strings in a non-Unicode Db2 database. Some proposed solutions required exporting the data to analyze it externally or implementing dedicated functions or procedures. I insisted that there is an SQL-based solution. Here is my proposal.
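
This is not the SQL-based solution itself, but a quick client-side illustration in Python of why character count and UTF-8 byte length differ:

    # Sample strings with single-byte, multi-byte, and supplementary-plane characters
    samples = ["hello", "Grüße", "€", "🎃"]
    for s in samples:
        print(f"{s!r}: {len(s)} characters, {len(s.encode('utf-8'))} UTF-8 bytes")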

Monday, October 30, 2023

🎃 Unicode characters and Db2 🕸️ 🏚️

A smiley query in Db2
Recently, I had a discussion about Unicode characters and Db2. Since Db2 LUW version 9.5, new databases default to the Unicode code page. Instead of having the entire database in the Unicode code page, you can specify a CCSID (coded character set identifier) for either individual columns or the whole table when creating the table (isn't that 😱?). Our discussion was not around emojis (😀) and Halloween (🎃), but "business" - the Euro sign (€). How can you insert and retrieve Unicode characters when you know their code points? Let's take a look and have some fun...
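
The post works at the SQL level; as a rough Python-side sketch of the same idea, assuming the ibm_db driver and a hypothetical NOTES table (connection details are placeholders):

    import ibm_db_dbi  # pip install ibm_db

    # Hypothetical connection string and table - adjust to your environment
    conn = ibm_db_dbi.connect("DATABASE=bludb;HOSTNAME=myhost;PORT=32733;"
                              "SECURITY=SSL;UID=myuser;PWD=mypassword", "", "")
    cur = conn.cursor()

    euro = chr(0x20AC)       # build the character from its code point U+20AC
    pumpkin = chr(0x1F383)   # a code point outside the Basic Multilingual Plane

    cur.execute("INSERT INTO myschema.notes(txt) VALUES(?)", (euro + " " + pumpkin,))
    conn.commit()

    cur.execute("SELECT txt FROM myschema.notes")
    for (txt,) in cur.fetchall():
        # print each character together with its code point
        print([(c, hex(ord(c))) for c in txt])

Whether the pumpkin survives the round trip depends on the code page of the column, which is part of the fun discussed in the post.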

Tuesday, January 10, 2023

Generate PowerPoint slides from your Db2 data with Python

Generated PowerPoint slide with Db2 data
How did you spend the holidays? I had some days off and then used the last day of vacation for some fun programming. Last year, I wanted to test the python-pptx package, but couldn't find the time. So I used my "fun day" to code up some PowerPoint slides. To make it more interesting, I added some charts based on data retrieved from Db2 in my IBM Cloud account. Here is a quick recipe.
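
A minimal sketch of that recipe with python-pptx, using made-up numbers instead of a real Db2 result set:

    from pptx import Presentation
    from pptx.chart.data import CategoryChartData
    from pptx.enum.chart import XL_CHART_TYPE
    from pptx.util import Inches

    # Pretend these values came from a Db2 query
    months = ["Jan", "Feb", "Mar", "Apr"]
    visits = (120, 180, 150, 210)

    prs = Presentation()
    slide = prs.slides.add_slide(prs.slide_layouts[5])   # "Title Only" layout
    slide.shapes.title.text = "Monthly visits (from Db2)"

    chart_data = CategoryChartData()
    chart_data.categories = months
    chart_data.add_series("Visits", visits)

    # Place a line chart on the slide: position and size in inches
    slide.shapes.add_chart(XL_CHART_TYPE.LINE, Inches(1), Inches(1.5),
                           Inches(8), Inches(5), chart_data)

    prs.save("db2_report.pptx")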

Friday, November 25, 2022

Finally together: Db2 and Zeppelin

United: Db2 and Zeppelin
If you have followed my blog, you may have noticed that I wrote about Db2 and about Zeppelins in the past - but not together. Today, I am going to discuss how I configured a JDBC interpreter in an Apache Zeppelin notebook to connect to a Db2 on Cloud database. So, finally, within a single blog post, I can talk about both of them. Let's get started.
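
Roughly, the interpreter settings look like the sketch below; host, port, and credentials are placeholders from the Db2 on Cloud service credentials, and the driver artifact version is whatever matches your setup:

    # JDBC interpreter properties (values are placeholders)
    default.driver     com.ibm.db2.jcc.DB2Driver
    default.url        jdbc:db2://<host>:<port>/bludb:sslConnection=true;
    default.user       <username>
    default.password   <password>

    # Interpreter dependency (artifact), so Zeppelin can load the Db2 JDBC driver
    com.ibm.db2:jcc:<version>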

Monday, November 7, 2022

IDUG 2022 EMEA conference is over - keep it going

"IDUG 2023 EMEA will be in"
The IDUG 2022 EMEA Db2 Tech Conference is over. What a week it was! Hard work, not much sleep, technical education, in-person networking, fun, and much more. After getting home, I needed two days - and especially the nights - to recharge, and then the last week to reflect. Here are my random thoughts...

Wednesday, March 16, 2022

From Bluemix to IBM Cloud, from Cloud Foundry to Code Engine

"Bring Your Own Community"
About seven years ago, I started to work with, then blog about, Bluemix and Cloud Foundry. Not my first, but one of my first posts is titled "Some fun with Bluemix, Cloud Foundry, Python, JSON and the Weather". Reading that article again makes me nostalgic and brings back memories of how I learned to deploy my apps to Cloud Foundry. And how I had fun with new cloud technologies.

Wednesday, November 24, 2021

Rate-limit Kafka event generation with kcat and bash

Traffic for event streams
Recently, I worked with IBM Cloud Event Streams, which is a message bus built with Apache Kafka. I was looking for a simple command-line tool to test my Event Streams instance and to stream access logs into it. That's when I ran into kcat (formerly known as kafkacat). It is a generic command-line Kafka producer and consumer and easy to install - just use a Docker image. All worked well; I could even read a file of historic Apache access logs and, line by line, send them over. But I still faced the issue of controlling how much to send - how to throttle it. I solved it using a bash script.
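
The actual solution in the post is a bash script; as a rough equivalent, here is a Python sketch that pipes a log file into kcat and sleeps between messages (broker, topic, and rate are placeholders, and an Event Streams instance additionally needs SASL/SSL options):

    import subprocess, time

    RATE = 10                   # messages per second (hypothetical value)
    BROKER = "<broker>:9093"    # placeholder
    TOPIC = "weblogs"

    # kcat in producer mode (-P) reads one message per line from stdin
    kcat = subprocess.Popen(["kcat", "-P", "-b", BROKER, "-t", TOPIC],
                            stdin=subprocess.PIPE, text=True)

    with open("access.log") as logfile:
        for line in logfile:
            kcat.stdin.write(line)      # send one historic log record
            kcat.stdin.flush()
            time.sleep(1.0 / RATE)      # crude throttle between messages

    kcat.stdin.close()
    kcat.wait()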

Thursday, November 18, 2021

On serverless data scraping, cloud object storage, MinIO and rclone

Building a data lake the serverless way
This fall, I started another side project. It involves mobility data and its analysis and visualization. As a first step, and still ongoing, I am building up a data lake (and maybe integrating it into a data fabric). For that, I regularly download data from various sources, then upload it to a Cloud Object Storage/S3 bucket. Once in cloud storage, I process it further, often using tools like the MinIO client and rclone. Let me give you a quick overview and links to some more background reading...

Friday, September 3, 2021

Serverless Twitter Bot using IBM Cloud

A Twitterbot at work
Some months ago, I discussed cron-like scheduling on IBM Cloud. One of my personal use cases is to send out tweets. In this post, I am going to look into the details of how I implemented a Twitter bot, deployed it to IBM Cloud Code Engine, and how it is managed.

Tuesday, June 15, 2021

Quickly deploy the serverless cloud mailer using Terraform

In the era of instant messaging, we all still receive emails. They are used for status updates, security alerts, or just for proposing really great offers. Recently, I blogged about how to have the IBM Cloud Security Advisor send out alerts using your SMTP-based email delivery service. Later, I made the solution's core, a serverless action, available as the separate project "cloudmailer" on GitHub and blogged about it: A Serverless Function for Sending Emails on IBM Cloud. Continuing this side project, I have now added Terraform support. Thus, using "terraform apply", you can now automatically deploy everything, including the SMTP configuration. See the instructions in the code repository for details.

Thursday, May 13, 2021

Wireshark with Lua on RHEL / CentOS

Wireshark with Lua-based dissector

What do you do on a rainy public holiday with COVID19 restrictions in place? Finally get Wireshark to work with Lua support to have custom dissectors. Dissectors are useful to turn binary garbage into readable TCP or UDP packet content. Lua is a scripting language and a supported way of adding dissectors to Wireshark. Unfortunately, the install package for Red Hat Enterprise Linux does not include Lua support. Simply compiling Wireshark on my RHEL 8.3 does not work either, because my scripts require Lua version 5.2, and RHEL offers either version 5.3 or 5.1, both of which are incompatible (long story). So, let's get going.

Monday, December 28, 2020

OBS on Linux: Green screen and virtual camera for video conferencing

OBS Studio: My monkey enjoys the beach
Similar to many of you, part of my work and hobbies consists of video conferencing. For some time now, I have been using OBS Studio (Open Broadcaster Software) to create a virtual camera on my Linux system. Recently, I had to upgrade my kernel. It required recompiling some files and reminded me that I wanted to blog about it. As usual, this is how I remember all the interesting stuff. So what is needed to create a virtual camera with OBS Studio, and can you use a green screen for some beach feeling, as shown?

Wednesday, September 30, 2020

Use alfaview on rpm-based Linux (Fedora, Red Hat, CentOS)


Recently, I tried to prepare for an alfaview session. alfaview is a video conferencing system used by the university where I teach data security. Only earlier this year did alfaview introduce Linux support, and only for Debian-based systems. My system is rpm-based (Red Hat Enterprise Linux / Fedora / CentOS), so what to do? A tool like alien did not work for me. Here is what I did to make alfaview run on my rpm-based Linux system.

Friday, February 28, 2020

Swashbooking for crowd-sourced book reviews and fun

Books for review
Usually, I don't go to book clubs or write book reviews. But yesterday evening was different with my first swashbooking session (German: Buchstrudeln). It is fast-paced book skimming and crowd-sourced book reviewing combined. And a lot of fun. So what is it, and what did we really do? Read on...

Friday, October 18, 2019

My passwordless app on IBM Cloud thanks to FIDO2

Passwordless login for cloud app
In my recent post, I discussed how I could use a FIDO2 dongle as a second factor for an app on IBM Cloud. Today, I want to give you an update because I managed to go passwordless. With the latest October update, Cloud Identity started to offer passwordless login with either FIDO2 or a QR code (using the IBM Verify app). I put that to a quick test for my secure file storage app. Here is what I did to go passwordless.

Friday, May 3, 2019

Your chatbot with Watson Discovery News

Some months back, I introduced you to a barebones news chatbot. Today, with the updated tutorial to build a database-driven chatbot in place, I want to show you how to easily combine Watson Assistant with Watson Discovery. Watson Assistant already provides steps to deploy an integrated search skill which is based on Watson Discovery. My approach is similar to the database integration: Deploy a cloud function and invoke it from the dialog.

Friday, February 8, 2019

Startup lessons from a Fuckup Night

Last Wednesday, I attended the Fuckup Night Friedrichshafen Vol. II. If you don't know, Fuckup Nights is a global movement and event series dedicated to professional failures. That is, usually founders of failed startups tell their stories. Typically, it is a mix of funny adventures into the world of business, some sad parts, and, most importantly, some lessons learned. So what were the lessons I took away? Read on...

Monday, October 22, 2018

Automated reports with IBM Cloud Functions, Db2 and Slack

GitHub Traffic Analytics
One of my (many) favorite IBM Cloud solution tutorials is about combining serverless and Cloud Foundry for data retrieval and analytics. I blogged about it and described how an automated IBM Cloud Functions action retrieves GitHub statistics and stores them in Db2. Using an embedded Cognos dashboard and regular JavaScript / HTML tables, the solution offers GitHub Traffic Analytics. I extended that solution with automatic weekly reports that are posted to Slack.
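
The real reports are produced by an IBM Cloud Functions action; as a simplified sketch, posting a text report to a Slack incoming webhook could look like this (the webhook URL, repository names, and numbers are made up):

    import requests  # pip install requests

    # Placeholder webhook URL - in the real solution this would come from
    # the action's parameters/secrets
    SLACK_WEBHOOK = "https://hooks.slack.com/services/T000/B000/XXXX"

    def post_report(lines):
        """Send a simple text report to a Slack channel via an incoming webhook."""
        text = "*Weekly GitHub traffic report*\n" + "\n".join(lines)
        resp = requests.post(SLACK_WEBHOOK, json={"text": text}, timeout=10)
        resp.raise_for_status()

    post_report(["repo-a: 123 views / 45 uniques", "repo-b: 67 views / 12 uniques"])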

Saturday, October 6, 2018

Impressions from Zeppelin flight

Zeppelin flight
Recently, I had the opportunity to fly on a Zeppelin NT, the kind of Zeppelin I had blogged about before. The 6+ hour flight was a once-in-a-lifetime opportunity because it could not be booked. It took us from Bonn Hangelar (EDKB) to Friedrichshafen (EDNY). Our journey started with a detour over Cologne, then following the Rhine up to Karlsruhe, taking a turn to Stuttgart and from there down south to Lake Constance (see the rough route we took on the right).