Using Node to Connect to an IBM i
This article is Part 2 in a 2-Part Series.
I’m back, this time with a spin on the base application I established in the last post. This flavor of things will utilize JDBC to connect to a DB2 table on an IBM i. It should work with anything the jt400.jar can connect to, and should you swap out lib/jt400.jar for any other JDBC jar (PostgreSQL, Oracle, MySQL, etc.), the only changes you would need to make are to ensure your SQL queries/statements are valid for the data source.
As I break down the specifics for each implementation, you will find that I’m focusing on changing only a couple of files: the js files for the config module and for the util module carry the biggest changes, with mere implementation concerns in the available routes.
I recently stumbled across a long-forgotten Facebook message (in the category of things Facebook thought I didn’t want to see) regarding a comment I had made on a developerWorks article. That comment stemmed from a micro-service I had written to stand up a Node instance so I could consume data from our IBM i (iSeries/AS400/a Power system by any other name) in a more RESTful/REST-like JSON API capacity.
The funny thing about the developerWorks article is that it was eventually refactored/updated to get around a dependency on a data server driver, which apparently is freely available for DB2 on platforms other than the IBM i (the strange things I had to learn at the time). In the end, I switched to using a jdbc package from npm, specifically the one titled ‘jdbc’ (at version 0.0.15), which subsequently underwent significant changes in its API format; as a result, I’m going to show a version with updated specifics using the ‘jdbc-pro’ package from npm.
The unique dependencies here are:
- jt400.jar (my .gitignore is set to ignore .jar files, so you’ll need to download and include your own copy of it; I’m parking mine at the path of lib/jt400.jar)
- the npm package jdbc-pro (which makes its main jdbc module accessible via a require('jdbc') statement); install via npm i -S jdbc-pro
Part of why you’ll need to download and include your own copy is that, while I know the project to be open-sourced by IBM under the IBM Public License 1.0, I’m just not familiar with that license. For those looking for one, the tldr legal page for the IBM PL does a good job summarizing the conditions of the license.
Specifics With IBM i and JT400
The core connection component here is the jt400.jar, allowing for a JDBC connection. When I first implemented my initial version, I tried using the jt400 package, which I ended up dropping in favor of the jdbc package due to some issues and preferences. The jdbc package I used has since moved on to a newer, breaking major API change, which I haven’t used, so the following will make use of the jdbc-pro package from npm, which looks to be the one I would choose should I start again today (and is more consistent with my previous implementation).
Data Connection Config
For the connection config, we’ll want to establish the url to the IBM i, the path to lib/jt400.jar, the driver name (provided), and a valid user name and password. This is exported as an object, making it directly usable in the data handling (util) module once it’s been pulled in via a require statement. You can see that I’m again using a number of environment variables to assign things like the url property, user name, and password. My example has fail-over values, but it’s best to keep those separate from the code base.
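As a rough sketch of what that config module might look like (the property names, host name, and fail-over values here are my assumptions, not necessarily the exact shape jdbc-pro expects):

```javascript
// config.js -- a hypothetical sketch; property names and fail-over
// values are assumptions, not the verified jdbc-pro config shape.
var path = require('path');

module.exports = {
  // JDBC url to the IBM i; the host name is a placeholder
  url: process.env.DB_URL || 'jdbc:as400://my.ibmi.host',
  // the driver class provided by jt400.jar
  drivername: 'com.ibm.as400.access.AS400JDBCDriver',
  // path to the locally parked jar
  libpath: path.join(__dirname, 'lib', 'jt400.jar'),
  // credentials from environment variables, with fail-over values
  user: process.env.DB_USER || 'someuser',
  password: process.env.DB_PASS || 'somepassword'
};
```

The point is less the exact keys than the shape: one plain object, environment-driven, required wherever a connection needs to be opened.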
Next, I defined a common connection-closing function, then two functions for the actual SQL statement handling; that’s because this package has two separate methods: one for queries (e.g. SELECT * FROM ...) and one for updates (INSERT, UPDATE, or DELETE operations), per the npm package documentation.
I wrapped up those functions using a common function for both initializing a connection and for closing the JDBC connection. You’ll note that I’m approaching callback hell, but I’ve deliberately avoided it as much as possible by separating out my functions to invoke the passed callback functions and handle any potential errors generated. This can seem tedious, but when it comes to event-driven operations, it’s best to write code that matches their async nature.
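A minimal sketch of that shape, with the jdbc-pro-specific calls stubbed behind small functions (the method names executeQuery/executeUpdate and the function names here are my assumptions, purely illustrative of the callback-separation pattern):

```javascript
// util.js sketch -- illustrates separating each step into a small function
// that invokes the passed callback and surfaces errors; the connection
// methods (executeQuery, etc.) are assumed, not the verified jdbc-pro API.
function withConnection(openConn, closeConn, work, callback) {
  openConn(function (err, conn) {
    if (err) { return callback(err); }   // surface connection errors immediately
    work(conn, function (err, result) {
      closeConn(conn, function () {      // common close, on success or failure
        callback(err, result);           // hand back either error or data
      });
    });
  });
}

// for SELECT statements
function query(openConn, closeConn, sql, callback) {
  withConnection(openConn, closeConn, function (conn, done) {
    conn.executeQuery(sql, done);        // assumed query method
  }, callback);
}

// for INSERT/UPDATE/DELETE statements
function update(openConn, closeConn, sql, callback) {
  withConnection(openConn, closeConn, function (conn, done) {
    conn.executeUpdate(sql, done);       // assumed update method
  }, callback);
}

module.exports = { query: query, update: update };
```

Each small function does one thing and then calls the next callback, so errors surface at exactly one place per step instead of being buried in one deeply nested pyramid.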
Lastly, I export those two primary functions as methods on a JS object, to be called via a require statement in my routes.
Now that my connections are configured and my data handling is provisioned, all I need to do is invoke it in my various routes. As you can see from my data util module, the exposed query method is simple enough to use:
- require the module
- call the query method
- pass in the SQL query and
- a callback function, which has two parameters: error and data
This way, if the error parameter is null, the error block will not execute, and vice versa. At this point, I hope the up-front modularizing is starting to show off its utility, as the implementation is about as simple as you can get.
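To make that concrete, here is a hedged sketch of such a route handler; makeListHandler is a hypothetical name, db stands in for the data util module, and the Express-style res object is assumed:

```javascript
// Sketch of a route handler built on the util module's query method.
// makeListHandler, the SQL, and the util module shape are hypothetical.
function makeListHandler(db, sql) {
  return function (req, res) {
    db.query(sql, function (err, data) {
      if (err) {                          // error block: runs only when err is non-null
        return res.status(500).json({ error: String(err) });
      }
      res.json(data);                     // success: hand the rows back as JSON
    });
  };
}
// In an Express app, this might be mounted as:
//   router.get('/items', makeListHandler(db, 'SELECT * FROM MYLIB.ITEMS'));
```

Because exactly one of error or data is non-null, the handler is a single if/else with no further branching.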
The other available routes all get updated as well, but they’re virtually identical, save for the specifics of the SQL query.
You can find my source code for this version of the project, in full, in the same GitHub repository as last time, just in the
As you can see, once we’ve configured and provisioned our connection, we can use it pretty easily. While wrapping up a bunch of queries to an RDBMS via JDBC can seem a bit silly, I have to say that the performance benchmarks I did at the time were quite impressive and the micro-service had the great benefit of being easily maintained outside my main production application, either by myself or another developer (who helped in setting up the SQL queries, due to their more intimate knowledge of the DB2 tables).
You can probably guess what’s coming up in the next post in this series. Once we connect, in a well-structured way, to any data source, any data source becomes swappable by configuration. So tune in next time for an early, expanded example of the use and implementation of the domino-nsf package for access to a Notes/Domino file.
Apr 20, 2016