Folder Sync v10 #DOMINO10 #DOMINO2025

Next up in “cool admin things coming your way in v10” - folder syncing. By selecting a folder on a cluster instance you can tell the server to keep that folder in sync across the entire cluster. The folder can contain not only database files (NSFs and NTFs) but also NLOs.

Well that’s just dumb, Gab... NLOs are encrypted with the server ID so they can’t be synced across clustermates. But a-ha! HCL are way ahead of you. The NLO sync has the source server decrypt the NLO before syncing it to the destination, which re-encrypts it before saving.

So no more making sure databases are replicated to every instance in a cluster. No more creating mass replicas when adding a new server to the cluster or building a new one, and no more worrying about missing NLOs if you copy over a DAOS-enabled database without its associated NLO files.

Genius.

File Repair v10 #Domino10 #Domino2025

If you follow this blog you know that v10 of Domino, Sametime, Verse on Premises, Traveler, etc. are all due out this year, and I want to do some - very short - blog pieces talking about new features and what my use case would be for them.

So let’s start with FILE REPAIR (or whatever it’s going to be called)

The File Repair feature in Domino v10 is designed to auto-repair any corrupted databases in a cluster. Should Domino detect corruption in any of its clustered databases, it automatically removes the corrupted instance and pulls a new one from a good cluster mate. Best of all, this happens super fast, doesn’t use regular replication to repopulate, doesn’t require downtime, and the cluster manager is fully aware of the database’s availability throughout.

I can think of plenty of instances where I have had a corrupted database that I can’t replace or fix without server downtime.  No more, and another good reason to cluster your servers.

 

Definitely different - a few days looking into the future with HCL (and IBM)

If this blog is tl;dr then here’s your takeaway:

I can’t thank everyone at HCL enough for throwing open the doors and leaving them open. Together we will continue to innovate great things for customers.

Last week Tim and I were invited to the 1st CWP Factory Tour held by HCL at their offices in Chelmsford. “CWP” stands for “Collaboration Workflow Platform” and includes not only the products HCL took over from IBM late last year, such as Domino, Traveler, Verse on Premises and Sametime, but also new products that HCL are developing as extensions of those. The ones I can talk a little bit about, such as HCL Nomad (Notes for iPad) and HCL Places (a new client running against Domino 10 and providing integrated collaborative services such as chat, AV, web and Notes applications), will leapfrog Domino far over its competitors.

I want to start by thanking HCL for inviting us inside to see their process. We met and made our voices heard with more than 30 developers and executives, all of whom wanted to know “do you like this?” and “what are we missing?”. I came away from the two days with a to-do list of my own, at the request of various people, to send in more details of problems or requirements I had mentioned while there. John Paganetti, who is also a customer advocate at HCL, hosted the impromptu “ask the developers” session (we had so many questions that they added one to the agenda on day 2). We were told to get to know the teams and reach out to them directly with our feedback and questions. If you don’t have a route to provide feedback and want one then please reach out.

Back in February I attended a Domino Jam hosted by Andrew Manby (@andrewmanby) from IBM in London. These were held all over the world, and attendees were pushed to brainstorm around features that were missing or needed. That feedback was used to set priorities for v10, and many of the features requested at my session and others have appeared in the current beta and are committed to a v10 release. At the end of the second day of the factory tour we again had a Domino Jam hosted by Andrew Manby, but this time for Domino 11 features - wheeeeeeee! With the Jams, the Destination Domino blog and the #domino2025 hashtag activity, IBM are really getting behind the products in a way they haven’t in several years. I want to recognise the hard work being done by Andrew, by Uffe Sorensen, and by Mat Newman, amongst others, to make this IBM/HCL relationship work.

So what was the factory tour? It was a two-day conference held at HCL’s (still being built) offices. I am pleased to say it was put together very informally: we were split into groups of about 10 (hi Daniel, Francie, Julian, Richard, Paul, Nathan, Devin, Fabrice!) and one by one the development teams came and took our feedback on the work they are doing. We worked with the Verse (on premises) team, the TCO team (looking at the Domino and Sametime servers), the Notes client team, the Nomad team and the Application Development team. It was intense in a good way, with so much information being shared with us and questions being asked of us. It was also good to be told that the majority of what we saw and discussed could be shared publicly.

A few highlights (out of many) from the two days that were new to me:

  • The new database repair and folder sync features in Domino 10 (shame on me for not remembering what they are called). The database repair feature will detect when a database is corrupted and replace it, whilst the server is running, with a new instance from a working cluster mate (another good reason to cluster). The folder sync feature will keep any Domino database files or NLOs in any listed folders in sync. This stuff is so cool and exactly what Domino clustering needed, so we asked them to extend the sync feature to include any files in the HTML directory, such as HTML, CSS and CGI scripts, and they are considering that (v10 is on a tight delivery timeline right now, so no guarantees of anything).
  • Some very candid discussions (I think repeated multiple times by everyone there) about getting rid of WebSphere for Sametime in the future and how to better provide Sametime services purely under Domino.
  • HCL Places looking much evolved even in the few weeks since it was first shown at Engage - this is going to be a game changer client when it comes out.
  • The Domino General Query Facility (DGQF), available in Domino 10, is the biggest investment in Notes/Domino code in 10 years. It is a query language accessible outside Domino that doesn’t require any knowledge of Domino design by a developer. Using DGQF you can rapidly query collections of documents selected by any criteria, not necessarily views or forms. A regular web developer would be able to build a Node application, for instance, using back-end Domino data without ever having to learn the structure of the Domino database or touch Domino Designer. Here’s a sneaky picture I took of the positioning for DGQF. John Curtis, the lead designer behind DGQF, has been very responsive on Twitter to questions about how it will work (@john_d_curtis).
  • A lot of Nomad- and Node-related stuff which is still under NDA, but you’ll hear more about them at Collabsphere in Ann Arbor. HCL will be out in force, as will IBM, speaking, showing and listening, so if you can, you need to get yourself there. Turn out and turn up - there’s still time to get your voice heard.

 

Improving Node.js with Express

By Tim Davis - Director of Development

In my previous post I talked about what Node.js is and described how to create a very simple Node web server. In this post I would like to build on that and look at how to flesh out our server into something more substantial and how to use add-on modules.

To do this we will use the Express module as an example. This is a middleware module that provides a huge variety of pre-built web server functions, and is used on most Node web servers. It is the ‘E’ in the MEAN/MERN stacks.

Why should we use Express when we already have our web server working? For three main reasons. The first is so you don’t have to write all the web stuff yourself in raw Node.js. There is nothing stopping you doing this if you need something very specific, but Express provides it all out of the box. One particularly useful feature is being able to easily set up ‘routes’. Routes are mappings from readable URL paths to the more complicated things happening in the background, rather like the Web Site Rules on your Domino server. Express also provides lots of other useful functions for handling requests and responses and all that.

The second reason is that it is one of the most popular Node modules ever. I don’t mean you should therefore use it because everyone else does, but its popularity means that it is a de facto standard and many other modules are designed to integrate with it. This leads us nicely back around to the Node integration with Domino 10, which is based on the Loopback adaptor module. Loopback is built to work with Express and is maintained by StrongLoop, who are an IBM company, and now StrongLoop are looking after Express as well. Everything fits together.

The third and final reason is a selfish one for you as a developer. If you can build your Node server with Express, then you are halfway towards the classic full JavaScript stack, and it is a small step from there to creating sites with all the froody new client-side frameworks such as Angular and React. Also, you will be able to use Domino 10 as your back-end full-stack datastore and build DEAN/NERD apps.

So, in this post I will take you through how to turn our simple local web server into a proper Node.js app, capable of running stand-alone (e.g. maybe one day in a docker container), and then modify the code to use the Express module. This can form the basis of almost any web server project in the future.

First of all we should take a minute or two to set up our project more fully. We do this by running a few simple commands with the npm package manager that we installed alongside Node last time.

We first need to create one special file to sit alongside our server.js, namely ‘package.json’. This is a text file which contains various configuration settings for our app, and because we want to use an add-on module we especially need its ‘dependencies’ section.

What is great is we don’t have to create this ourselves. It will be automatically created by npm. In our project folder, we type the following in a terminal or command window:

npm init

This prompts you for the details of your app, such as name, version, description, etc. You can type stuff in or just press enter to accept the defaults for now. When this is done we will have our package.json created for us. It isn’t very interesting to look at yet.

We don’t have to edit this file ourselves; it is updated automatically by npm when we install things.

First, let’s install a local version of Node.js into our project folder. We installed Node globally last time from the download, but a local version will keep everything contained within our project folder. It makes our project more portable, and we can control versions, etc.

We install Node into our project folder like this:

npm install node

The progress is displayed as npm does its job. We may get some warnings, but we don’t need to worry about these for now.

If we look in our project folder we will see a new folder has been created, ‘node_modules’. This has our Node install in it. Also, if we look inside our package.json file we will see that it has been updated. There is a new “dependencies” section which lists our new “node” install, and a “start” script which is used to start our server with the command “node server.js”. You may remember this command from last time; it is how we started our simple Node server.
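At this stage our package.json might look something like the sketch below (the name and description are whatever you entered at the npm init prompts, and the version numbers are just placeholders - yours will differ):

```json
{
  "name": "my-node-server",
  "version": "1.0.0",
  "description": "A simple Node web server",
  "main": "server.js",
  "scripts": {
    "start": "node server.js"
  },
  "dependencies": {
    "node": "^10.5.0"
  }
}
```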

We can now start our server using this package.json. We will do this using npm, like this:

npm start

This command runs the “start” script in our package.json, which we saw runs the “node server.js” command which we typed manually last time, and our server starts up just like before, listening away. You can imagine how using a package.json file gives us much more control over how our Node app runs.

Next we want to add the Express module. You can probably already guess what to type.

npm install express

When this is finished and we look inside our package.json, we have a new dependency listed: “express”. We also have many more folders in our node_modules subfolder. Express has a whole load of other modules that it uses and they have all been installed automatically for us by npm.

Now we have everything we need to start using Express functions in our server.js, so let’s look at what code we need.

First we ‘require’ Express. We don’t need to require HTTP any more, because Express will handle all this for us. So we can change our require line to this:

const express = require('express')

Next thing to do is to create an Express ‘app’, which will handle all our web stuff. This is done with this line:

const app = express()

Our simple web server currently sends back a Hello World message when someone visits. Let’s modify our code to use Express instead of the native Node HTTP module we used last time.

This is how Express sends back a Hello World message:

app.get('/', (req, res) => { 
   res.send('Hello World!') 
} )

Hopefully you can see what this is doing; it looks very similar to the http.createServer method we used previously.

The ‘app.get’ means it will listen for regular browser GET requests. If we were sending in form data, we would probably want to instead listen for a POST request with ‘app.post’.

The ‘/’ is the route path pattern that it is listening for, in this case just the root of the server. This path pattern matching is where the power of Express comes in. We can have multiple ‘app.get’ commands matching different paths to map to different things, and we can use wildcards and other clever features to both match and get information out of the incoming URLs. These are the ‘routes’ I mentioned earlier, sort of the equivalent of Domino Web Site Rules. They make it easy to keep the various, often complex, functions of a web site separate and understandable. I will talk more about what we can do with routes in a future blog.
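To make the idea concrete, here is a toy sketch of how path pattern matching like this could work under the hood. This is not Express’s real implementation (which is far more capable), just a plain JavaScript illustration of matching a URL path against a pattern where ‘:name’ segments become named parameters, much like req.params in Express:

```javascript
// Toy sketch of route matching - NOT how Express really does it,
// just an illustration of the concept. Each ':name' segment in the
// pattern captures the matching part of the path as a parameter.
function matchRoute(pattern, path) {
  const patternParts = pattern.split('/').filter(Boolean);
  const pathParts = path.split('/').filter(Boolean);
  if (patternParts.length !== pathParts.length) return null; // no match
  const params = {};
  for (let i = 0; i < patternParts.length; i++) {
    if (patternParts[i].startsWith(':')) {
      // parameter segment: capture its value
      params[patternParts[i].slice(1)] = pathParts[i];
    } else if (patternParts[i] !== pathParts[i]) {
      return null; // literal segment doesn't match
    }
  }
  return params;
}

// '/orders/:id' matches '/orders/42' and captures { id: '42' },
// but does not match '/customers/42'.
console.log(matchRoute('/orders/:id', '/orders/42'));
console.log(matchRoute('/orders/:id', '/customers/42'));
```

In real Express you would simply write app.get('/orders/:id', …) and read req.params.id inside the handler.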

So our app will listen for a browser request hitting the root of the server, e.g. http://127.0.0.1:3000, which we used last time. The rest of the command is telling the app what to do with it. It is a function (using the arrow ‘=>’ notation) and it takes the request (‘req’) and the response (‘res’) as arguments. We are simply going to send back our Hello World message in the response.

So we now have our simple route set up. The last thing we need to do, same as last time, is to tell our app to start listening:

app.listen(port, hostname, () => { 
   console.log(`Server running at http://${hostname}:${port}/`); 
});

You may notice that this is exactly the same code as last time, except we tell the ‘app’ to listen instead of the ‘server’. This helps illustrate how well Express is built on Node and how integrated it all is.

Our new updated server.js should look like this:

const express = require('express');
const hostname = '127.0.0.1';
const port = 3000;
const app = express();
app.get('/', (req,res)=> {
   res.send("Hello World!")
});
app.listen(port, hostname, () => {
   console.log(`Server running at http://${hostname}:${port}/`);
});

This is one less line than before. If we start the server again by typing ‘npm start’ and then browse to http://127.0.0.1:3000, we get our Hello World message!

Now, this is all well and good, but aren’t we just at the same place as when we started? Our Node server is still just saying Hello World, what was the point of all this?

Well, what we have now, that we did not have before, is the basis of a framework for building proper, sophisticated web applications. We can use npm to manage the app and its add-ons and dependencies, the app is self-contained so we can move it around or containerise it, and we have a fully-featured middleware (i.e. Express) framework ready to handle all our web requests.

Using this basic structure, we can build all sorts of things. We would certainly start by adding the upcoming Domino 10 connector to access our Domino data in new ways, and then we could add Angular or React (or your favourite JS client platform) to build a cool modern web UI, or we could make it the server side of a mobile app. If your CIO is talking about microservices, then we can use it to microserve Domino data. If your CIO is talking about REST then we can provide higher-level business logic than the low-level Domino REST API.
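As a sketch of what that higher-level business logic could mean, a route might aggregate raw back-end documents into a summary instead of exposing them one-for-one. The getOrders function below is purely a hypothetical stand-in (the Domino 10 connector API isn’t public yet, so the real data call will look different), but the shape of the idea holds:

```javascript
// Hypothetical stand-in for a back-end data call - the real Domino 10
// connector API may look nothing like this.
function getOrders() {
  return [
    { customer: 'Acme', total: 120 },
    { customer: 'Acme', total: 80 },
    { customer: 'Globex', total: 200 }
  ];
}

// Higher-level business logic: rather than returning raw documents,
// summarise order totals per customer - the kind of result an Express
// route could serve as JSON via res.json(orderSummary(getOrders())).
function orderSummary(orders) {
  const summary = {};
  for (const order of orders) {
    summary[order.customer] = (summary[order.customer] || 0) + order.total;
  }
  return summary;
}

console.log(orderSummary(getOrders())); // { Acme: 200, Globex: 200 }
```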

In my next blog I plan to talk about more things we can do with Node, such as displaying web pages, about how it handles data (i.e. JSON), and about how writing an app is both similar and different to writing an app in Domino.