
Build System for XPages and OSGi plugins

We only recently set up a Build System for XPages / NSF / OSGi plugins. Here is a little summary of how it started, roughly how it is set up, and a few random notes that might help others hoping to do a similar thing. It's not perfect, but at least you know you are not alone if you are trying to do something like this! Any questions or suggested improvements, please comment!

How did it start? 

Continuous Integration / Deployment has long seemed out of reach for Domino developers.
I saw a post on OpenNTF last year asking if anyone was interested in a project for an XPages build ecosystem. I thought that would be great! I think I posted an "I'm interested" comment and then forgot about it.
Then I noticed a tweet from Martin Jinoch

I thought that Martin was playing around with something of his own creation! At the time I did not realise it was an IBM 9.0.1 feature. I then saw this from Martin Pradny...
And then this from Egor Margineanu...
Then I was going through some slides from Connect 14 and found a session that outlined this new feature. It was only then that I fully understood it was a proper IBM feature for 9.0.1!

At this point I was very interested, but all of this still seemed like a far-off dream to me. Our team's workload has increased due to a developer leaving, we are months behind on a project, and I am scrambling for any spare time I can get, with 2 small children keeping me busy.

Also, we are developing our own OSGi plugins. We have been building and deploying these manually, since we didn't yet know how to build them automatically. So there would be no point investigating continuous integration without being able to automatically build the OSGi plugins. I'd had a little look into Maven, but I wasn't sure how it would fit with the way we currently develop, or if it would play nicely with the Domino Debug Plugin (from the XPages SDK) which we rely on heavily for developing plugins. (If interested in this, see also Paul Withers' post regarding the jvm for ExtLib development.)

Then, I was looking into modifying the Drag n Drop sidebar to suit a project I'm working on. I watched Ryan Baxter's series on plugin development, in which he mentioned PDE Build for automatic building of plugins. So I investigated PDE Build and found that I could successfully build our OSGi plugins using headless Eclipse! This meant that I now had most of the pieces of the puzzle; one important piece missing was Time!!!

I still needed to successfully build an NSF. I managed to do a quick test of a simple On-Disk Project to NSF build, which was successful! I fired off a couple of questions to the IBM email addresses at the end of the slide deck. Jonathan Roche answered them for me within the same day and included a link to the developer wiki for Headless Designer.

The Business Need

The key stakeholder of our current project had been making regular visits to check on our progress. Unfortunately he often shows up without warning, when our own development system is usually in a broken state. We have a user acceptance testing environment, but deploying to it is a manual process. We are already short of time, so we rarely update it unless we are actually deploying to production, so the testing environment is pretty useless I guess.

Well, his frustration at not being able to test the system was reaching its boiling point, so we agreed it would be worthwhile to put the project aside temporarily to investigate continuous deployment of our latest stable branch to a testing environment. With this agreement, the final piece of the puzzle, Time, was found! A few days and late nights later, we have a working Continuous Integration system for XPages and OSGi plugins.

The System as it currently stands

Please note: setting up this build system was not trivial, and this post is not a 'How-To' post, but more of an 'It's possible if you want to' post. The system is working well for us and for the most part is ticking along; however, if the Domino Designer NSF build breaks, it sometimes needs a manual kick along.

So, as an overview of how our system currently works:
Developers develop OSGi plugins and NSFs on their own machines with a local Domino server. Developers use their own feature branch (we are using git-flow) of the Git repository.
When code is ready for testing, developers merge into 'develop' and push to GitHub.
Jenkins monitors GitHub for changes to the develop branch; when it sees changes, it runs a build.
When we set up the Jenkins job, we gave it a 'Build Script': a list of steps that we want it to do. We have used Apache Ant to define our steps. There are other options, but we are using Ant.
If the build script runs successfully, the result is deployed to our testing environment.
For production releases (master branch), there is no automatic deployment, just the build step, which produces either an NSF or some plugins. These are still deployed manually; there is no reason they couldn't be automatically deployed too, but this is just how we have it set up at the moment.
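To give a feel for how the Ant 'Build Script' hangs together, here is a minimal, hypothetical build.xml skeleton. The target names and properties are illustrative, not our exact script:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical skeleton of a Jenkins-driven Ant build script -->
<project name="xpages-build" default="buildNSF" basedir=".">

    <!-- Jenkins sets BUILD_NUMBER and WORKSPACE for every build -->
    <property environment="env"/>
    <property name="build.number" value="${env.BUILD_NUMBER}"/>

    <target name="buildNSF">
        <!-- copy the On-Disk Project, run Headless Designer, etc. -->
        <echo message="Building NSF for build ${build.number}"/>
    </target>

    <target name="buildPlugins">
        <!-- invoke headless Eclipse / PDE Build -->
        <echo message="Building plugins for build ${build.number}"/>
    </target>

    <target name="deployToTest" depends="buildNSF">
        <!-- design-replace the testing database -->
        <echo message="Deploying build ${build.number} to testing"/>
    </target>
</project>
```

Jenkins then just invokes the appropriate target for the job at hand.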

Here is the software used in the build system:
  • Github for hosting code repositories
  • Build server (is also our Testing Environment Domino Server)
    • Windows 2003 Server
    • Domino 9.0.1
    • Notes 9.0.1
    • Designer 9.0.1 
    • Eclipse Kepler (for building OSGi plugins), but technically you could use Designer as well
    • Git 1.9.0 for Windows
    • Jenkins 1.5
    • Java JDK 7u51
    • Apache Ant 1.9.3
    • Powershell 1.0 to run Egor Margineanu's Powershell script 
I also added the Notes Program directory to the PATH environment variable so that you can run 'designer.exe' or 'designer' from any directory (for launching Headless Designer).
We already had a Windows 2003 Server set up with a Domino instance for our Testing Environment, so we decided to use this as the build server.

    Building and Deploying

    I will describe Building and Deploying separately.
    A Build task is purely concerned with taking some Source Code (Java code, or an NSF On-Disk Project) and turning it into an artifact which can be deployed.
    A Deployment task is concerned with taking an artifact that has been successfully built and putting it into use somewhere, be it a testing or live environment. At present we are only automatically deploying to our testing environment.

    General overview of build process

    Here is a rough description of what happens in the build process. You create a Job in Jenkins which defines:
    • Instructions on when to run the build: automatically when there are new changes? manually? periodically?
    • Where to get the latest source code from
    • What steps to take to build: copy files? compile something?
    • What to do at the end: archive the results? notify somebody?
    Each Job has a 'workspace' directory of its own where most of the operations take place. The source code is checked out to here, and at the end of the build you archive whatever artifacts you built (e.g. an NSF file or plugins) for safekeeping.
    Each time you run a build you get a sequential reference number for that attempted build.

    Starting a Build

    You can start a build job manually, you can schedule a build job, you can have a build job start when another finishes, etc. You can also trigger a build with a REST call.

    With the GitHub plugin, you can have your Jenkins server check GitHub for changes by polling it periodically; however, you can also configure it so GitHub will notify your Jenkins server when there is new code to build. This is what we have done; it requires your build server to be externally accessible though.
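    As an aside, triggering a build via REST is just an HTTP request to the job's build URL. A hypothetical Ant target might look like this; the server name, job name and token are placeholders, and newer Jenkins versions may require a POST instead of a GET:

```xml
<!-- Hypothetical: trigger a Jenkins build via its remote access API.
     Server, job name and token are placeholders. -->
<target name="triggerBuild">
    <get src="http://buildserver:8080/job/Discussion/build?token=SECRET"
         dest="trigger-response.txt"
         ignoreerrors="false"/>
</target>
```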

    Overview of Building an NSF

    Building an NSF involves preparing an On-Disk Project and then launching Domino Designer in what is referred to as 'Headless' mode. It is not truly headless, however: headless usually means that there is no GUI, but in this case the Domino Designer GUI is still launched and the window is minimized soon afterwards. Designer then carries on with importing the On-Disk Project, building the NSF, and then shutting down.

    For example, suppose we are building our 'Discussion' NSF with build number 15.
    1. Jenkins fetches the latest version of the repository from GitHub
    2. Jenkins checks out the branch that you are building (e.g. develop)
    3. Jenkins runs our Ant target 'buildNSF', which does the following:
      1. Copies our On-Disk Project to a temporary 'WORKSPACE/odp_15' directory. We rename the folder to include the build number to be doubly sure that Designer will never have seen this project folder before; otherwise it may think it is the same project as previously attempted.
      2. Copies the following files to the On-Disk Project, which set the desired Application Properties for the NSF to be built.
        This allows us to ignore them in the repository, which means each developer can put whatever they want in their database.properties and it won't affect the build.
        1. database.properties
        2. $DBIcon
        3. IconNote
      3. Builds the NSF Discussion_15.nsf into the Notes Data Directory, using Headless Designer, by running the PowerShell script.
      4. Jenkins then moves the successfully built NSF Discussion_15.nsf from the Notes Data Directory to our Jenkins Job workspace.
      5. As a post-build step, Jenkins archives our successfully built Discussion_15.nsf file
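    Sub-steps 1 and 2 above (copying the On-Disk Project into a uniquely named folder and overriding the application properties) could be sketched in Ant roughly like this; the directory names are illustrative, and the exact locations of the property files within your ODP may vary:

```xml
<!-- Hypothetical sketch: copy the On-Disk Project into a uniquely
     named folder so Designer treats it as a brand new project. -->
<target name="prepareODP">
    <property environment="env"/>
    <!-- e.g. WORKSPACE/odp_15 for build number 15 -->
    <property name="odp.dir" value="${env.WORKSPACE}/odp_${env.BUILD_NUMBER}"/>

    <copy todir="${odp.dir}">
        <fileset dir="on-disk-project"/>
    </copy>

    <!-- overwrite the properties that control the built NSF;
         '$$' is how Ant escapes a literal '$' in the file name -->
    <copy todir="${odp.dir}" overwrite="true">
        <fileset dir="build-overrides">
            <include name="**/database.properties"/>
            <include name="**/$$DBIcon"/>
            <include name="**/IconNote"/>
        </fileset>
    </copy>
</target>
```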

    Overview of Building an OSGi plugin

    To perform an OSGi plugin build we use headless Eclipse, and this time it truly is headless: no GUI involved, command line only. It uses PDE Build, which is the exact same code that Eclipse uses when you build your plugins within Eclipse.
    As an overview, you provide PDE Build with a 'feature' to build; the feature can point to one or more plugin projects. You also specify a build.properties file to PDE Build, which defines the target Java Runtime Environment, the target OSGi platform (e.g. Domino plugins plus any plugins your project depends on), and compiler flags such as whether to include debugging information.
    1. Jenkins fetches the latest version of the repository from GitHub
    2. Jenkins checks out the branch that you are building
    3. Jenkins runs our Ant target 'buildPlugins', which does the following:
      1. Creates a subdirectory 'buildDirectory' in the Jenkins job workspace, with subfolders for features and plugins. This is the directory PDE Build will use as its working directory.
      2. Copies the plugins to be built to the 'buildDirectory\plugins' directory
      3. Copies the feature which defines the plugins to be built to the 'buildDirectory\features' directory
      4. Runs the PDE Build script to build the OSGi plugins (results in a zip file)
      5. Unzips the plugins and features to the Jenkins Job Workspace
      6. Copies the plugins and features from the Jenkins Job Workspace to a permanent Eclipse update site folder
      7. Generates a new site.xml in the update site directory
      8. As a post-build step, archives the zip file of the built plugins as build artifacts
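    Step 4, launching headless Eclipse, boils down to running the Equinox launcher with the antRunner application pointed at PDE Build's own build.xml. A hypothetical sketch (the Eclipse path, launcher jar version and PDE Build version are placeholders for whatever your installation contains):

```xml
<!-- Hypothetical sketch of launching headless Eclipse / PDE Build.
     Paths and plugin versions are placeholders; check your Eclipse
     plugins directory for the exact jar names. -->
<target name="runPDEBuild">
    <property name="eclipse.home" value="C:\Eclipse"/>
    <property name="pde.build.scripts"
              value="${eclipse.home}\plugins\org.eclipse.pde.build_3.8.100\scripts"/>

    <java jar="${eclipse.home}\plugins\org.eclipse.equinox.launcher_1.3.0.jar"
          fork="true" failonerror="true">
        <arg value="-application"/>
        <arg value="org.eclipse.ant.core.antRunner"/>
        <arg value="-buildfile"/>
        <arg value="${pde.build.scripts}\build.xml"/>
        <!-- directory containing our build.properties for PDE Build -->
        <arg value="-Dbuilder=${basedir}\pde-builder"/>
    </java>
</target>
```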

    Overview of Deploying an NSF to Testing Server

    There is more than one way to do this, I suppose! It all depends on what your template inheritance structure might be.

    At the moment it is very simple for us. It is simply a design replace, and no template inheritance needs to be set. For example, if we are updating the testing database TestingDiscussion.nsf:
    1. Copy the built NSF, e.g. Discussion_15.nsf, from where Jenkins archived it to a folder in the Domino data directory where we keep templates, e.g. 'C:\Domino\Data\TestTemplates\Discussion_15.nsf'
    2. Run nconvert.exe -d TestingDiscussion.nsf * TestTemplates\Discussion_15.nsf
      to update our Testing version of Discussion to the latest version
    This is not a very flexible solution; it kind of depends on the Domino Server being on the same machine as the build server.
    More complicated versions would involve moving the NSF to another machine, setting template inheritance, replicating, running design refresh, etc.
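    Wrapped up as an Ant target, the two steps above might look roughly like this; all paths are illustrative and assume the Domino Server lives on the build machine:

```xml
<!-- Hypothetical sketch of the NSF deployment step: copy the built
     template into the Domino data directory and run the convert task.
     All paths are illustrative. -->
<target name="deployNSF">
    <property name="domino.data" value="C:\Domino\Data"/>
    <property name="nsf.name" value="Discussion_15.nsf"/>

    <copy file="archive\${nsf.name}"
          todir="${domino.data}\TestTemplates"/>

    <!-- design-replace the testing database from the new template -->
    <exec executable="C:\Domino\nconvert.exe"
          dir="${domino.data}" failonerror="true">
        <arg value="-d"/>
        <arg value="TestingDiscussion.nsf"/>
        <arg value="*"/>
        <arg value="TestTemplates\${nsf.name}"/>
    </exec>
</target>
```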
    I have written a small helper, deployHelper.exe, with the Notes C API, which does template name setting and can send console commands to servers. I have recently discovered Java Native Access, so I plan to re-write this helper exe in Java; it could even be made into a Jenkins plugin, I guess :)
    In any case, I think deploying the NSF could be a whole set of blog posts; it is 11pm here so I will leave it for another day :)

    Overview of Deploying Plugins to Testing Server

    To deploy the plugins to the testing server, Jenkins currently simply moves the plugins and features into the <DominoDataDir>\domino\workspace\applications\eclipse directory.
    It then uses my deployHelper.exe, which sends a 'restart task http' console command to the server.
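    As an Ant sketch, that deployment step might look like this; the paths are illustrative, and the deployHelper command line is an assumption about my own helper's interface rather than a documented tool:

```xml
<!-- Hypothetical sketch of the plugin deployment step. Paths and the
     deployHelper command line are assumptions, not a documented API. -->
<target name="deployPlugins">
    <property name="eclipse.dir"
              value="C:\Domino\Data\domino\workspace\applications\eclipse"/>

    <copy todir="${eclipse.dir}\plugins">
        <fileset dir="updatesite\plugins"/>
    </copy>
    <copy todir="${eclipse.dir}\features">
        <fileset dir="updatesite\features"/>
    </copy>

    <!-- send 'restart task http' to the server via the helper -->
    <exec executable="deployHelper.exe" failonerror="true">
        <arg value="restart task http"/>
    </exec>
</target>
```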

    For deploying plugins to production, we still do this manually. I simply have a mapped drive letter (U:) to the updatesites folder on the build server (this is where Jenkins dumps the successfully built plugins). I then open the production UpdateSite.nsf and import from the local update site, U:\develop.

    For other options / ideas for automatic deployment, check out the Open Eclipse Update Site project on OpenNTF. Karsten Lehmann modified the normal Update Site so it too can import plugins headlessly. It does it via an agent. You could then use my deployHelper to restart the server!


    So you can see there is quite a lot involved in setting up a build server! This post really is just an overview. I am happy to do some more detailed posts on specific parts, so please post a comment / ask a question.
    It is quite a bit of effort, and yet another learning curve.
    The benefit of doing it is that you know the build process will be the same every time, and it is as simple as pushing a branch to GitHub.
    Returning to our business stakeholder mentioned earlier, now instead of saying
    "Why is it always broken?"
    he says
    "Did you push the latest changes?"
    and we are a little bit less stressed :)

    p.s. I originally drafted this post a few months ago; here are some random notes I wrote. I thought I would leave them in!

    Ant Notes

    I hadn't really used Ant before, but I really, really like it now.
    If you haven't used it before: basically, you define a set of 'targets' which define tasks such as moving files, running programs, or running Java. A target is a bit like a 'method', really.
    I used an Ant script to define all the steps (except for fetching the source code, which Jenkins already does) in building plugins/NSFs.

    For example, here is my 'target' which performs the build of the NSF. You can see it calls the powershell executable, passing in arguments for the execution policy.

    The great part about Ant is you can test and run all the build steps on your own computer.
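    The original embedded snippet aside, a target like the one described (calling powershell with an execution-policy argument to drive Headless Designer) might look roughly like this; the script name, argument order and paths are illustrative:

```xml
<!-- Hypothetical reconstruction of the buildNSF target: it launches
     PowerShell, which in turn drives Headless Designer.
     Script name and arguments are illustrative. -->
<target name="buildNSF">
    <property environment="env"/>
    <exec executable="powershell.exe" failonerror="true">
        <arg value="-ExecutionPolicy"/>
        <arg value="Bypass"/>
        <arg value="-File"/>
        <arg value="headlessDesignerBuild.ps1"/>
        <!-- On-Disk Project folder and target NSF name -->
        <arg value="${env.WORKSPACE}\odp_${env.BUILD_NUMBER}"/>
        <arg value="Discussion_${env.BUILD_NUMBER}.nsf"/>
    </exec>
</target>
```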

    Headless Designer Notes

    • Make sure the default user is not prompted for a password
    • Make sure the 'Binary DXL' setting for source control is the same as whatever your developers use
    • I think we set Designer to only automatically import from the On-Disk Project, and not auto-export to disk
    • Turn off replicate on startup or any other startup tasks that will slow down Designer

    Jenkins Notes

    I had never actually used Jenkins before, but found it very easy to install and use. If you are struggling to find documentation on some part of Jenkins, look up Hudson, as they are almost the same system. E.g. the Hudson book seemed pretty comprehensive, and even though it wasn't an exact match for Jenkins, it was enough information to set me in a general direction.

    I used the 'manage plugins' section to download the Git Plugin. I did notice a couple of 'failed to install' messages on the dependent credentials plugins, but after a restart the Git Plugin worked fine.

    Don't install as a Windows Service

    At first we had Jenkins installed as a Windows service. This is its default and recommended setting.
    This was working fine for building OSGi plugins; however, it was a problem for NSF building, as Headless Designer is not really headless: it still needs to launch the GUI (it just minimizes it after launch). If you run Jenkins as a service, it will fail to launch Designer properly. I think some others had success running as a service and ticking a setting about interacting with the desktop, but I still had trouble.

    As such, we have to start Jenkins from the console. We do this by navigating to the Jenkins directory (C:\Jenkins in my case) and then executing
    java -jar jenkins.war

    Jenkins Home

    When we originally started it as a service, it used our Jenkins install directory, C:\Jenkins, as its home directory for all configuration etc. We had set up a few jobs, installed some plugins and done some server configuration.
    However, when we started it from the console, it used ~/.jenkins as the home directory for configuration, resulting in a brand new installation. To override this, we simply set a JENKINS_HOME environment variable to C:\Jenkins; Jenkins then returned to using this as its home directory.

    Jenkins Plugins

    Here are all the installed Jenkins plugins on our Jenkins server. Some of them may have been installed as a dependency, some of them may have been automatically installed; I can't remember! If you want to know more about one, ask me.
    • Ant Plugin 1.2  - runs ant build scripts
    • Copy Artifact Plugin 1.30 - allows you to copy built artifacts from one job to another
    • Credentials Plugin 1.10 - I think the GitHub plugin uses it to store your username / password
    • CVS Plugin 2.11
    • Email-ext plugin - more options for sending notification emails
    • External Monitor Job Type Plugin 1.2 - I think it was installed as a dependency of another plugin
    • Git Client Plugin 1.6.4 - dependency for Git Plugin
    • Git Plugin 2.0.4 - uses Git!
    • GitHub API Plugin 1.44 - 
    • Github OAuth Plugin 0.14
    • Hudson PowerShell plugin 1.2 - not sure if I actually needed it, as I call powershell from within Ant, but at one stage I could use this plugin to do it
    • Javadoc plugin 1.1 - generates javadoc for you
    • LDAP Plugin 1.8
    • Mailer 1.8
    • Matrix Authorization Strategy Plugin 1.1 - permissions for jenkins users and jobs etc
    • Maven Project Plugin 2.1 - we don't use (yet ?)
    • OWASP Markup Formatter Plugin 1.0 -  ??
    • PAM Authentication Plugin 1.1
    • promoted builds plugin 2.17 - allows you to take a successful build and 'promote' it for deployment; we don't do this yet but it looks like it could be a good strategy, slightly different to the git-flow system
    • SCM API Plugin 0.2 
    • SSH Credentials Plugin 1.6.1
    • SSH Slaves plugin 1.6
    • Subversion plugin 2.2
    • Token Macro plugin 1.10
    • Translation Assistance Plugin 1.11
    • Windows Slaves Plugin 1.0

    Aug 08, 2014
