Domino 11 Jam Coming To London

The Domino jams continue, now on to Domino 11 and with a date of January 15th in London. No venue announced yet, but I'd be very surprised if it's not IBM South Bank.

I attended a couple of jams last year and can confirm that many of the comments made and items requested ended up in the v10 products, and several have already been prioritised for v11. If you are interested in the future of the collaboration products, and especially Domino, then you will want to contribute ideas to the jam, so email Brendan McGuire (MCGUIREB@uk.ibm.com) and ask to attend.

We all hope to be there investing in the future of products we believe in. Hope to see you there as well.

If you are interested in locations other than London, check out this URL, where there are already locations and some dates announced.

#dominoforever

HCL Launch New Collaboration Site & Client Advocacy Program

Today HCL went live with their own site for their collaboration products at https://www.cwpcollaboration.com. It's Domino-based and we even have new forums you can sign up for (and the sign-up process is easy).

The big news for me is the launch of their Client Advocacy Program, which you can read about and sign up to on the site. The Client Advocacy Program connects customers directly with a technical point of contact in development; it's free and open for registration now. You can read more in their FAQ here, but for those of you who are tl;dr, here's a taster.

Why is HCL Client Advocacy participation beneficial?

A Client Advocate provides the participant:

  • opportunity to discuss successes, challenges, and pain points of the customer’s deployment and product usage
  • a collaborative channel to the Offering Management, Support and Development Teams
  • proactive communications on product news, updates, and related events/workshops
  • more frequent touch points on roadmaps and opportunity to provide input on priorities
  • facilitation of lab services or support team engagements as appropriate

You can request to sign up here.

I think we can all agree that even in these early days HCL are showing customer-focused intent and following up quickly with real actions to reach out and encourage us to talk to them directly. I know this is just the beginning, the foot is down hard on the accelerator pedal, and I'd recommend you follow HCL_CollabDev on Twitter as well as the new Collaboration site. And feed back. They want to hear what you think and what you want. If you feel something is missing or you have an idea, feed back.

Above all, don't paint HCL with the IBM brush; this is a new company with new ideas and their own way of doing things. Exciting times.

Domino – Exchange On Premises Migration Pt1: Migration Tools

It's been an interesting few months intermittently working on a project to move Notes and Domino users onto on-premises Exchange 2013 and Outlook 2013. I'm going to do a follow-up blog talking about Outlook and Exchange behaviour compared to Notes and Domino, but let's start at the beginning, with planning a migration.

The first thing to know is that if your company uses Domino for mail, Exchange on premises is a step down. I'm sorry but it is, and I say this as someone with a lot of experience of both environments (albeit a LOT more in Domino). At the very least you need to allow for the administrative overhead to be larger and to encompass more of your environment. Domino is just Domino on a variety of platforms; Exchange is Active Directory and DNS and networking and a lot more besides. In fact Microsoft seem to be focusing on making the on-premises solution ever more restrictive and difficult to manage (better hope you enjoy PowerShell) to encourage you to move to O365.

To give you an example, during the migration we had an issue where mail would suddenly stop sending outbound. The logs gave no clue. I spent two days on it finding nothing and eventually decided to pay Microsoft to troubleshoot with me to find out what I'd done wrong. Five hours of joint working later we found it. It wasn't Exchange or any box I had worked on; one of the Domain Controllers didn't have the Kerberos Key Distribution Center service running, and that was what was breaking outbound mail. Starting that service on that box fixed everything. Three days wasted, but at least it wasn't what I did 🙂

MIGRATION TOOLS

First of all we need a migration tool, unless you're one of the increasingly large number of companies who just decide to start clean. That is especially true when moving to O365, because there often isn't the option or the capability to upload terabytes or even gigabytes of existing mail to the cloud. Having tested five different tools for this project, here were my biggest problems:

  1. A tool that was overly complex to install and outdated (requiring a Windows 7 OS), whose supplier wanted several thousand dollars to train me on how to install it
  2. Tools that didn’t migrate the data quiitteee right. It looked good at first glance but on digging deeper there were misfiled messages and calendar entries missing
  3. Tools that took an unfeasibly long time (>12hrs per mail file or even days). The answer to that problem was offered as "you are migrating too much, we never do that" or "you need a battalion of workstations to do the migration"
  4. Tools that required me to migrate everything via their cloud service, i.e. send every message through their servers. I mean it works and requires little configuration, but no. Just no.

Whatever tool you decide to use, I would recommend testing fully against one of your largest mail files and calculating what the time taken does to your project plan. For my current smaller project I am using a more interactive tool that installs on a workstation and doesn't require any changes on either the Domino or Exchange end.

You'll notice I'm not naming the tools here. Although there are a couple where the supplier was so arrogant and unhelpful I'd like to name them, there are also several who were incredibly helpful and just not the right fit for this project. Maybe for the next. The right migration tool for you is the one that does the work you need in the time you need and has the right support team behind it to answer finicky questions like "what happened to my meeting on 3rd June 2015, which hasn't migrated?". Test. Test. Test.

Many of the migration tools are very cheap but be careful that some of the cheapest aren’t making their money off consultancy fees if paying them is the only way to make the product work.

QUESTIONS

So our first question is:

“What do you want to migrate?”

Now the answer to this will initially often be "everything", but that means time and cost, and getting Exchange to handle much larger mailboxes than it is happy to do. That 30GB Domino mail file won't be appreciated by Exchange, so the second question is:

"Would you consider having archives for older data and new mailboxes for new?"
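
If the answer is still "everything", bear in mind that default Exchange mailbox quotas probably won't cope and will need raising for migrated users. As a rough sketch in the Exchange Management Shell (the user and the limits below are made up for illustration, not a recommendation):

# Lift the quotas on a single migrated mailbox rather than relying on the database defaults
Set-Mailbox -Identity "gdavis" -UseDatabaseQuotaDefaults $false -IssueWarningQuota 30GB -ProhibitSendQuota 33GB -ProhibitSendReceiveQuota 35GB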

You also need to ask about rooms and resources and shared mailboxes as well as consider how you are going to migrate contacts and if there needs to be a shared address book.  The migration of mail may be the easiest component of what you are planning.

Now we need to talk about coexistence. Unless you plan to cut over during a single period of downtime during which no mail is available, you will need a migration tool that can handle coexistence, with people gradually moving to Exchange and still able to work with those not yet migrated from Domino without any barrier in between. Coexistence is a lot more complex than migration, and the migration tools that offer it require considerably more configuration and management for coexistence than they do for the migration. Consider as well that your coexistence period could be months or even years.

One option, if the company is small enough, is to migrate the data and then plan a cutover period where you do an incremental update.  Updating the data every week incrementally allows you to cutover fairly quickly and also gives a nice clean rollback position.

EXCHANGE CONFIGURATION

The biggest issue in migrating from Domino to Exchange is how long it takes to get the data from point A to point B. I tried a variety of migration tools, and a 7GB mail file took anywhere from 3hrs to 17hrs to complete. Now multiply that up. Ensuring your Domino servers, migration workstations and Exchange servers are located on the same fast network is key.

Make sure your Exchange server is configured not to throttle the migration traffic (it will see that flood of migration data as exactly the kind of thing that needs throttling), so configure a disabled/unlimited throttling policy you can apply during the migration.
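
As a rough sketch of what that looks like in the Exchange Management Shell (the policy name and the migration account below are made up for illustration):

# Create a throttling policy with no limits on the connection types the migration hammers
New-ThrottlingPolicy "MigrationPolicy" -RcaMaxConcurrency Unlimited -EwsMaxConcurrency Unlimited -EwsMaxSubscriptions Unlimited -CpaMaxConcurrency Unlimited
# Apply it only to the account performing the migration
Set-ThrottlingPolicyAssociation -Identity "svc-migration" -ThrottlingPolicy "MigrationPolicy"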

Exchange's malware filter, which is installed by default and only has options for deleting messages or deleting their attachments, is not your friend during a migration. Not only will it delete any migrating Domino mail that it decides could be malware, it also slows the actual migration down to a crawl whilst it does so. You can't delete the filter, but you can temporarily disable it via PowerShell.
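
For reference, one way to do that is to bypass filtering for the duration (the server name here is hypothetical; there is also a Disable-AntimalwareScanning.ps1 script in the Exchange scripts folder if you prefer to disable the agent entirely):

# Temporarily bypass malware filtering on the mailbox server during the migration
Set-MalwareFilteringServer -Identity "EX2013-01" -BypassFiltering $true
# Re-enable it once the migration is complete
Set-MalwareFilteringServer -Identity "EX2013-01" -BypassFiltering $false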

Next up: the challenges of the Outlook/Exchange model for a Notes/Domino person.


Deploying The AppDev Pack – An Admin's Guide

Over here on the blog is Tim's next entry talking about Node development and Domino; this time he explains how to use the early release of the AppDev Pack to access (read and write) Domino data via Node. However, I don't let developers do Domino admin, so this is the bit where I explain how to configure Domino. It's all very easy, and also all still early release, so things may well change for GA.

First you will need to request the early release package, which you can do here. What you'll then get is a series of .tgz files, including one entitled 'domino-appdev-docs-site.tgz' which, once extracted, gives you the index.html with instructions for installing.

You need to bear in mind that, at least initially, this only runs on Linux and Domino 10, and that Domino 10 on 64-bit Linux officially means RHEL 7.4 or higher, or SLES 12. I went with RHEL 7.5.

Next we need to install "Proton" so it can be run as a Domino server task, which just means extracting the file 'proton-addin.tgz' into the /opt/ibm/domino/notes/latest/linux directory. There is also some checking to make sure files are present and some setting of permissions, but I don't want to repeat the install instructions here as I would rather you refer to the latest official version of those. Suffice it to say this is a 5 minute job at most.
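
For reference, the extraction itself is just something along these lines (assuming the .tgz was downloaded to /tmp; the official instructions cover the exact ownership and permission steps):

# extract the Proton add-in into the Domino binary directory
cd /opt/ibm/domino/notes/latest/linux
tar -xzf /tmp/proton-addin.tgz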

Once the files are in place you can start and stop Proton as you would any other Domino task by doing “load Proton”, “tell Proton quit”, etc.

Then there are a few notes.ini settings you can choose to set including:

PROTON_SSL= if you want the traffic between the Proton task and Node server to be encrypted (0/1).

PROTON_LISTEN_PORT= what port you want Proton to listen on and be accessed by Node on (default 3002).

PROTON_LISTEN_ADDRESS= if you want Proton to listen on a specific address on your Domino server, such as 127.0.0.1, which would require Node to be installed locally, or 0.0.0.0, which will listen on any available address.

PROTON_AUTHENTICATION= how Proton handles authentication.  There are currently two options, client_cert or anonymous.  With authentication set to anonymous all requests that come from the Node application are done as an “anonymous” Domino user and your Domino application must allow Anonymous rights in the ACL.

The “client_cert” option requires the Node application to present a client certificate to the Proton task and for the Domino administrator to have already mapped that certificate to a specific person document by importing it.  Note that “client_cert” still means that all activity from that Node application will be done as a single identified user that must be in the ACL but does mean you need not allow anonymous access.  You can also use different identities in different Node applications.

Of course, what we all want is OAuth or an authentication model that allows individual user identities, and that is hopefully why the product is still considered "early release". Both the "anonymous" and "client_cert" models are of limited use in production.

PROTON_KEYFILE= the keyfile to use if you want Proton to communicate using SSL. This isn't related to the Domino keyfile (although it could be), and since this is only for communication between your Node server and your Domino Proton task, and never for client-facing traffic, you could use entirely internally-generated keys since they only need to be shared with the Node server itself.
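
Pulling those together, a minimal set of entries in notes.ini might look something like this (values are purely illustrative, with Node on the same box as Domino and a keyfile name I've made up):

PROTON_LISTEN_PORT=3002
PROTON_LISTEN_ADDRESS=127.0.0.1
PROTON_AUTHENTICATION=client_cert
PROTON_SSL=1
PROTON_KEYFILE=proton_keyfile.kyr

Restart the Proton task (or the server) after changing these so they take effect.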

HCL have kindly provided scripts to generate all the certificates you need for your testing.

Finally we need to create a design catalog for Proton to use.  You can add individual databases to the design catalog and the first one you add actually creates the catalog.  There must be a catalog with at least one database in it for Proton to work at all.

The catalog contains an index of all the design elements in a Domino database, so to add a new database to the catalog you would type:
load updall <database> -e

This isn’t dynamically maintained though, so if you change the design of a database you must update its entry in the catalog if you want to have new design elements added or updated, like this:
load updall <database path> -d

The purpose of the catalog is to speed up DQL’s access to the Domino data.  It’s not required that every database be catalogued but obviously doing so speeds up access and opens up things like view scanning using the <‘View or folder name’>.<Columnname> syntax.
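
As a purely made-up illustration of that view scanning syntax, a DQL query against a catalogued database could look something like:

'Orders by Status'.Status = 'Shipped' and Customer = 'Acme'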


So that’s my very quick admin guide to what I did that enabled Tim to do what he does. It’s very possible (even probable) that this entire blog will be obsolete when the GA release ships but hopefully this and Tim’s blog help you get started with the early release.

Deletion Logs – What’s Coming In V10

So, deletion logs.. currently (without custom code) we cannot tell who deleted a document, what document they deleted, or in which database. With v10, deletion logging is now a standard trigger on the database that creates an entry in a delete.log file in the IBM_TECHNICAL_SUPPORT directory detailing every deletion.

So how does it work?

Deletion logging is enabled via the compact task on an individual database basis. The -dl option is used when compacting a database, along with the fields in that database you want to be part of the log. For example, if I wanted to turn it on for my mail file I might do:

load compact mail\gdavis.nsf -dl on subject,posteddate,sendto,recipient

Every deletion after that point would then be logged as a single CSV entry in delete.log (the example below is one entry, wrapped for readability). Note there are standard values that are always logged in addition to the custom fields I requested:

"20180210T211516,06+01","Mail\gdavis.nsf","80256487:00352154","nserver","CN=Traveler/O=Turtle",
"SOFT","0001","72C0E3F8:44B53FB5DC4EDBF8:A785466D",
"from","""New Relic"" <marketing@newrelic.com>","sendto","gabriella@turtle.com",
"deliveredDate","02/10/2018 21:05:05","posteddate","02/10/2018 16:15:18"

There are several interesting aspects to this approach, but I see it being particularly powerful for audit purposes, as it shows not only the message but the timestamp of the deletion and who did it. Note that the server name in the log entry here tells me my Traveler server did the deletion, so it was done from my phone; if it had been deleted in the Notes client it would have my name there as the person who did the deletion.

The delete.log itself rolls over each time the server is restarted, but obviously, depending on the size of your environment and how widely you deploy deletion logging, that's a CSV file you are going to want to have a strategy for.

7 days and counting