Problems With MongoDB During Sametime Install

In my previous blogs I discussed installing Sametime 10. My installs all ran perfectly except for one point: when I tried to install MongoDB as a service it installed but then wouldn't start, failing with service error 1053, reported as "the service did not respond to the start or control request in a timely fashion".

I thought I had fixed the problem by using another method to install Mongo. From my previous blog:

This is what the documentation said to use (sc is found in c:\windows\system32 if your path can't find it):
sc.exe create MongoDB binPath= "\"C:\Program Files\MongoDB\Server\3.6\bin\mongod.exe\" --service --config=\"C:\Program Files\MongoDb\Server\3.6\mongod.cfg"" DisplayName= "MongoDB" start= "auto"

I ended up removing that service since it wouldn't start (sc delete MongoDB) and adding it with a different syntax, run from the Mongo bin directory itself:

mongod --directoryperdb --dbpath C:\data\mongodb\ --logpath C:\data\mongodb\log\mongo.log --logappend --service --install

It turns out I should have stuck with the documentation and "sc", but the 1053 error was caused by two problems. The first was a misprint in the documentation: as you can see above, there are two unescaped quotes ("") after mongod.cfg where the first needs an escape, so that line should read (with the added escape character):

sc.exe create MongoDB binPath= "\"C:\Program Files\MongoDB\Server\3.6\bin\mongod.exe\" --service --config=\"C:\Program Files\MongoDb\Server\3.6\mongod.cfg\"" DisplayName= "MongoDB" start= "auto"

The second problem was in the mongod.cfg file itself, so when Mongo tried to start and read it, it failed.

I had copied the contents of mongod.cfg from the documentation into a text file, so I didn't consider that it would be an issue, but the pasted content lost the spaces and tabs at the beginning of each line in the documentation, and since the file is YAML that meant my mongod.cfg wouldn't work. This is what it should look like, spaces and tabs included.
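
The original screenshot isn't reproduced here, but as a minimal sketch of the layout, using the paths from the commands above (the replication section and the rs0 name are my assumptions based on the replica set that rs.initiate() implies, so check the product documentation for the exact contents):

# mongod.cfg is YAML, so the leading whitespace below is structural, not cosmetic
systemLog:
    destination: file
    path: c:\data\mongodb\log\mongo.log
    logAppend: true
storage:
    dbPath: c:\data\mongodb
replication:
    replSetName: rs0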

Once that was complete everything worked perfectly, allowing the service to start in the way Sametime wanted and letting me run the command "rs.initiate()" from the Mongo console.
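
For completeness, a short sketch of those final steps, assuming the MongoDB service name from the sc.exe command above: the first command starts the Windows service, the second opens the Mongo shell, and rs.initiate() is typed inside that shell to initialise the replica set.

net start MongoDB
mongo
rs.initiate()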

Thank you to Tony Payne @ HCL for working with me on this last week.

Domino Query Language @ Engage

By Tim Davis, Technical Director

This is my session given at Engage 2019 in Brussels last week.

“In this session, Tim Davis (Technical Director at The Turtle Partnership Ltd) takes you through the new Domino Query Language (DQL), how it works, and how to use it in LotusScript, in Java, and in the new domino-db Node.js module. Introduced in Domino 10, DQL provides a simple, efficient and powerful search facility for accessing Domino documents. Originally only used in the domino-db Node.js module, with 10.0.1 DQL also became available to both LotusScript and Java. This presentation will provide code examples in all three languages, ensuring you will come away with a good understanding of DQL and how to use it in your projects.”
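
The slides have the full examples, but as a flavour of what DQL looks like from LotusScript, here is a minimal sketch using the NotesDominoQuery class added in 10.0.1 (the form and field names are placeholders, not from the session):

Dim session As New NotesSession
Dim db As NotesDatabase
Dim dql As NotesDominoQuery
Dim col As NotesDocumentCollection
Set db = session.CurrentDatabase
' Build a query object against the current database
Set dql = db.CreateDominoQuery()
' Execute a DQL query string and get back a standard document collection
Set col = dql.Execute("Form = 'Order' and Total > 100")
Print "Matching documents: " & col.Count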

Exchange 2019 On Prem Install

In a couple of weeks' time I'll be in Brussels presenting at Engage, and one of my sessions is Face/Off: Domino vs Exchange On Premises (Weds at 8am).  I have an Exchange 2016 install, but since Exchange 2019 shipped last October I wanted to update my install so I could demo the latest version.  In truth very little has changed in Exchange on premises since 2008, but I don't like using an old version in my presentations.  So this is the story of the 4 days it took me to complete the install.

Four. Days.

Day 1: My big mistake.  I decided to uninstall Exchange 2016 instead of upgrading it, because I wanted an entirely clean server to demonstrate.  The uninstall failed halfway through: it wouldn't uninstall, and it was still listed under installed programs.  Several hours of trial and error and internet research confirmed this is a common problem with Exchange uninstalls, and the "fix" is to flatten the machine and start over.  The problem was the Exchange install was on the same box as the Active Directory 2016 Domain Controller, which I really, really didn't want to flatten.

Day 2: Being stubborn.  I'd do just about anything to avoid flattening the entire box and rebuilding, so some more internet research took me to several blogs that talked about manually removing registry entries to clean up the install.  Hundreds of registry entries.  After doing that I still couldn't delete or rename the Exchange folder despite no services being present, so it was into safe mode to do the rename.  That worked, and I started the upgrade to Windows 2019 (the only supported platform for Exchange 2019).  You can now do an in-place Windows upgrade from 2016 to 2019, and that worked, maintaining all my Active Directory settings.

Day 3: Accepting the inevitable.  Off I go with an Exchange 2019 install once more, which started to install then prompted me for the Exchange installer disk.  It wouldn't take the mounted disk I had started the installer from.  After a few hours' research I realised this is a common red herring error that basically means the server can detect some old installation files and won't complete.  At this point there were no services, no directory, nothing listed under installed programs.  Sometimes you have to accept you've strayed too many hacks from your starting point and it's best to start over and do it properly.  Windows 2019 install #2, this time letting it blat the server and rebuilding Active Directory from scratch (luckily it's just my demo machine and I could do that, but good luck if it's your production environment).

At the end of day 3 I had a new Windows 2019 Domain Controller fully patched and I was ready to start my Exchange 2019 install.

Day 4: The long road.  Before Exchange will install, the installer verifies you have all the prerequisites required on the operating system.  There are many, from IIS management tools to .NET 4.7.1 to the basic authentication system.  A scrolling page of missing features is shown with URL links explaining them.  Since 90% of those features were actually Windows features you go to add/remove features to install, I don't know why the Exchange installer doesn't just offer to install them for me, because it took some time to work out where each one was in the multi-level hierarchy of features.  In addition, several of the URLs brought up 404 pages on the Microsoft site referring to Exchange 2003 and that link not being available(!).  Anyway, finally, after a few hours of digging around, downloading libraries, installing features and restarting, it agreed to install Exchange 2019 and I was done.
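
If you hit the same wall, most of the Windows features in that list can be added in one pass from an elevated PowerShell prompt rather than hunting through the add/remove hierarchy.  This is only a sketch with a few illustrative feature names; the authoritative list is in Microsoft's Exchange 2019 prerequisites documentation:

Install-WindowsFeature Web-Server, Web-Mgmt-Console, Web-Basic-Auth, Web-Windows-Auth, NET-Framework-45-Features, RSAT-ADDS -Restart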

If you take one lesson from this it should be that the Microsoft solution to many problems seems to be “flatten and start over”.  For that reason I wouldn’t put Exchange on any machine you wouldn’t be happy to flatten and start over or replace.

Paypal and Direct Debits

What have you spent your morning doing, Gab?  That would be removing over 60 direct debits set up in PayPal since 2008.

Last week Pluralsight, from whom I bought a 1-year license in 2018, went ahead without notice and charged me £199 for "another year" because I hadn't checked the box to disable their auto-renewal, buried under my account details.  Strange that they can email me multiple times a week with marketing promotions and course announcements, but apparently can't email me to tell me they will be charging me for another year on X date and that I need to disable auto-renew if I don't want them to.  Even though I clearly hadn't logged in for months.

Lesson learned.  I cancelled the auto renewal, swallowed the cost and won’t ever use them again.

Today there was a charge from AVG Commerce for £99.99.  I haven't used AVG in years, having switched to BitDefender.  Apparently they just suddenly decided to charge me two years' renewal for a product that had long expired, again because auto-renew was left on the account and because PayPal had them as a direct debit.  The last charge was in July 2016 and this one was April 2019, so it wasn't even a renewal date.  At least AVG (who have multiple complaints of this behaviour on their site) offer a 30-day refund, which I have applied for.  Luckily I could do that without logging in, since their login is now a Salesforce login and no account details I have work.

I often use PayPal to pay for things because I would rather not share my credit card with every site, but of course the downside is that PayPal won't dispute a payment like that, whereas my credit card company would.  So in I go to PayPal to deactivate all the direct debits on my account.

There were over 60.  Many times when I paid for anything with PayPal, even a one-off thing like a game or theatre tickets, it set itself up as a direct debit without telling me.  That means PayPal would have allowed that source to take payment anytime it wanted, without notifying me until it was done.  WordPress and GoDaddy were particularly egregious, with multiple direct debits, one for every payment I ever made, and all had to be deleted.

None of this would be an issue if PayPal notified me when someone applied to withdraw money via direct debit, or had a date limit or expiry on how long the direct debit was valid, or even didn't bury the direct debits far away from my home page.

I recommend that if you use PayPal you go in and deactivate the direct debits you might unwittingly have in place.

Log in to PayPal, choose "Settings" (the cog), then "Payments", then "Manage Pre-Approved Payments", and go ahead and cancel whatever you need.  I went from 72 to 5.

More Apple Cloud Issues & The Solution

If you saw my earlier post on my trouble with my Apple iCloud account, it looked by the end of it as though I had found a resolution (original post here https://turtleblog.info/2019/03/08/the-painful-journey-to-abandoning-icloud/).

In summary it appeared (from my investigation) that my iCloud account somehow got corrupted or modified in such a way that it became unusable and I ended up disabling iCloud storage for documents and desktop amongst other things to fix it.

Except it didn’t entirely fix it.

For the past few weeks I've had endless problems with Finder being slow: a spinning ball and up to 10 seconds to change folders or open a file dialog in any application, and longer for dragging and dropping files.  In addition, Spotlight was broken; it would let me type a few characters, then take around 10 seconds to fill in any suggestions, which I then had to wait another 5 seconds or so to click on.

I tried all the fixes I could find, including rebuilding Spotlight, removing the CloudMetadata.xml files and resetting preferences.  Things were definitely "usable" but not in a way I found acceptable, i.e. how it used to be.  So last week I had a call with an Apple "genius".  Two resets, a boot into safe mode (which took 45 minutes and caused him to exclaim "oh my god"), and finally I caved and reinstalled the OS entirely (which leaves everything else in place).  The guy ended up ghosting me when I pointed out that he wasn't actually identifying the problem, just hitting it with ever bigger hammers in the hope it would go away, and that if this final hammer (reinstalling the OS) didn't work he had better have a plan.

So this week I was back on my own with a Mac that was still slow, convinced that even flattening and rebuilding wouldn't necessarily help if the issue was related to my iCloud account.  Then I read a few community comments that put me on the right path: I removed all internet accounts from under System Preferences.  In my case all I had was my iCloud account.  To remove it, it had to save a copy of all my iCloud data locally and remove all my credit cards.  Then I removed the Cloud metadata (rm ~/Library/Caches/CloudKit/CloudKitMetadata*; killall cloudd) and restarted.

I reattached my iCloud account, configured what I wanted to sync, added my credit cards and went to bed.  This morning everything is working as it should, including Finder and Spotlight, so we're back to the status before the iCloud corruption.  It seems the only fix I needed was to remove the iCloud account from System Preferences entirely and then re-add it.

Language Packs, Verse and A New App Dev Pack, Someone Has Had A Busy Week..

Well a bit more than a week.

This week the G1 language packs for Notes 10.0.1, covering French, German, Japanese, Italian, Brazilian Portuguese, Chinese and Korean, were made available.  If you are now having a bit of déjà vu, that's because these language packs were already released once and very quickly withdrawn when it was discovered there were considerable problems in the way the translations had been done.  To their credit, HCL withdrew the products almost immediately when they were told of the issues and have been working to redo and re-release them all.

So why were the bad versions released at all?  This goes back to the transfer from IBM to HCL.  In the IBM days there was a large team entirely responsible for product translation, but they weren't part of the collaboration development team; they were a general IBM product translation team.  When HCL took over the products they didn't inherit that team, which meant they also didn't inherit any of the knowledge that team had about the quirks and challenges of doing the Notes translations.  HCL went ahead with having the translations done without realising what they needed to verify.  None of that is great, but in my opinion it shows commitment and intent that they withdrew the products almost immediately and then made redoing the translations correctly their highest priority.  They have also committed to releasing the G1 languages on day 1 alongside English in future versions.

So we had a stumble, but one that was publicly acknowledged, explained and fixed quickly.  I can't expect more than that.

Last week saw the release of the new update to the App Dev Pack for Domino, v1.0.1, which includes the Node.js integration features that can now be deployed under Windows as well as Linux.

The new IAM (Identity and Access Management) service provides OAuth-based authentication for applications running outside Domino that need access to Domino resources.  By installing IAM you can authorise it to use LDAP over Domino or Active Directory as its IdP (identity provider) for authenticating users.  There are a few steps in setting up IAM, including setting up secure LDAPS in Domino or Active Directory, so I'll be covering that in more detail in its own blog post.

More on the App Dev Pack update here, and on IAM specifically here.

Last week we also got an update to Verse on Premises (v1.0.7), which I have rolled out for a few customers so far.  The deployment, if you already have Verse installed, is very easy (just make sure you back up your Plugins folder before deleting the old files).  Here is a list of new features, including some significant calendar enhancements and work towards providing the Verse UI on mobile browsers where it's not appropriate to use the Verse app.

Lastly, I heard very good things about the Connections workshop (jam) in Switzerland this week, with the product team brainstorming ideas for wanted Connections features.  I will be attending the London workshop next week and look forward to hearing more.

The Painful Journey To Abandoning iCloud

As some of you know I'm very committed to the Mac ecosystem.  I have Mac laptops, an iPad with over 4,000 books, an iPhone (not the latest, because who needs that), a watch, Apple TVs x 4, etc. etc.  I'm also extremely risk averse and cloud wary.  I gave in and let Apple put all my books in the cloud just because iTunes sucks for syncing and cloud syncing worked across all my devices.  However, I also had a lengthy open support call last year with Apple, wanting to know where my books were now stored on my Mac so I could find them and back them up.

“they are all in the cloud”

“yes I get that but they are also on my laptop so where are they”

“no they are only in the cloud”

"well that's not true because here I go, switching off wifi and hey, I can still read my books in iBooks so they are here somewhere"

..>> pause for several weeks whilst this is escalated>>>>

“they are on your Macbook but stored in a way you can’t find them or access them”

(please no advice on this one, I found my own workaround to find them and back up un-DRMed copies)

So… iCloud.  I agreed about 18 months ago to let my Documents and Desktop folders sync to iCloud.  My only reason for that was so that I could get at files if I needed to on my iPad or by logging into any browser, but tbh I rarely used it.  Still, it worked and seemed a decent idea.

Then one Saturday about two weeks ago it all went horribly wrong…

I was sat working when I got an alert saying FaceTime had been added to my watch.  Which was odd.  My watch is 18 months old, it was on my wrist, and nothing had changed.  The watch itself had no alert.  So off I go digging, and under my account and devices I find a list containing my current watch and an old watch I wiped and sold to a friend 18 months ago to give to his wife.  Still odd, but no big deal.  They hadn't done anything, so clearly just an odd gremlin.  Just in case, I removed that old watch from my devices.

Then I got alerts saying my credit cards had been removed from my watch.  Except they hadn’t been removed from the watch on my wrist and the other watch was flattened before I handed it over 18 months ago.

I did some research, found nothing nefarious and let it go.  I did notice I had been logged out of all my Apple accounts on all my devices, and things like Sonos had to be re-authorised.  Weird and annoying, but a side effect of whatever happened, I assume.**

Then a few days later I restarted my laptop.  I probably only restart it every two weeks, so this was the first time since that alert.  The laptop restarted, but Finder and anything that uses Finder, like Spotlight or even Terminal, was entirely unresponsive.  They would briefly work long enough for me to type two characters or click on a folder, then there would be a spinning ball for about 25 seconds before they responded.  That gradually got slower and slower over a few minutes.  So off we go to research, because I now have a broken laptop.

After several hours' research I found this article, which gave a bit of a clue as it pointed to a cloud corruption problem: http://osxdaily.com/2015/04/17/fix-slow-folder-populating-cloudkit-macosx/

Unlike some of the other Finder troubles, the Finder process usually doesn’t eat much CPU or crash repeatedly, it’s just inordinately slow when loading folder views, populating files, and opening folders.

So I followed the instructions and deleted the files they specify, and immediately my laptop was more responsive.  OK… well, that was a scary afternoon, and I'll just go ahead and disable cloud syncing so that never happens again.
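
For reference, the files in question are the CloudKit metadata caches, and the deletion boils down to one command; killing cloudd makes the daemon restart and rebuild its metadata from scratch:

rm ~/Library/Caches/CloudKit/CloudKitMetadata*; killall cloudd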

Did you know Apple doesn't let you do that?  If you disable cloud syncing for Documents and Desktop it actually deletes the contents of those folders, keeping the files in the cloud for 30 days in case you want them back.  So that's dumb.  I decided to move the contents of both folders to temporary folders, disable cloud syncing, then move them back, but my laptop was working and I was busy so I parked that for later.

Later… about a week later, the Finder sluggishness came back, but this time I knew how to fix it.  Once it was fixed, I went ahead and moved the contents of both Documents and Desktop to temporary folders, disabled cloud syncing and moved them back.  My laptop immediately started working, Finder was faster than it had been for a very long time, and I've had no more problems.
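
As a rough sketch of that shuffle from Terminal (the temporary folder names are just my own, and the disabling step itself happens in the iCloud Drive options under System Preferences):

mkdir ~/DocumentsTemp ~/DesktopTemp
mv ~/Documents/* ~/DocumentsTemp/
mv ~/Desktop/* ~/DesktopTemp/
# now untick Desktop & Documents Folders in iCloud Drive options, then:
mv ~/DocumentsTemp/* ~/Documents/
mv ~/DesktopTemp/* ~/Desktop/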

Now I wonder if that first alert about the non-existent "watch" was a precursor to some cloud corruption on my account.  That cloud corruption caused all the authentication for my account to be lost, and also corrupted the authentication for my cloud data, which only tried to reconnect when I signed back into the OS.

** For anyone wondering if I asked an Apple "genius" about this: yes I did.  No, they had no clue what I was talking about, since most of them are "iPhone experts" in store now, and the one who called me back seemed to think I made it up.

Lesson learned. Apple iCloud for all but my books is now disabled.

Think-Ing From Far Away Pt1 - Community

Today’s podcast of Think-ing from far away features guests Libby Ingrasia from IBM, Rob Novak from Snapps, John Paganetti from HCL and Femke Goedhart from panagenda alongside Julian Robichaux, Theo Heselmans and myself.  We discuss the ICS Community, what community events are happening at Think and how to play along from home.

Monday’s podcast is here http://www.nsftools.com/tffa/TFFA_1.mp3

In the podcast we mention hashtags, blogs and Twitter accounts that those of us who aren't at Think should keep an eye on this week, and I wanted to summarise some of those here.

Watch the sessions' live stream, including the Chairman's address on Tuesday, here: https://www.ibm.com/events/think/watch/

Tomorrow our podcast will be talking about some specific sessions we hope to hear news from this week and the blogs of the people giving them.

Twitter

@IBM

@IBMSocialBiz

@IBMChampions

#Think2019

#IBMThink2019

@HCL_CollabDev

@IBMLive

@planetlotus 

IBM Champions - All (Twitter list)

Blogs

PlanetLotus http://planetlotus.org

IBM Collaboration Solutions Blog https://www.ibm.com/blogs/collaboration-solutions/

HCL Collaboration Workflow Platforms https://www.cwpcollaboration.com/blogs

Aha! Domino Ideas Lab https://domino.ideas.aha.io

Aha! Connections Ideas Lab https://connections.ideas.aha.io

Collaboration Today https://collaborationtoday.info

Other In Person Events Already Announced For 2019

https://engage.ug

https://collabsphere.org

https://admincamp.de

https://dnug.de

https://isbg.no

https://socialconnections.info

Domino - Exchange On Premises Migration Pt2: Wrestling the Outlook Client

In part 1 of my blog about Exchange on-premises migration from Domino, I talked about the challenges of working with Exchange for someone who is used to Domino.  If only that were all of it.  Now I want to talk about the issues around Outlook and the other Exchange client options that require those of us used to Domino to change our thinking.

In Domino we are used to a mail file being on the server, and regardless of whether we use Notes or a browser to access it, the data is the same.  Unless we are using a local replica, but the use of that is very clear when we are in the database, as it visibly shows "on Local" vs the server name.

We can also easily swap between the local and server replicas and even have both open at the same time.

In Outlook you only have the option to open a mailbox in either online or cached mode.

So let's talk about cached mode, because that's the root of our migration pains.  You must have a mail profile defined in Windows in order to run Outlook.  The default setting for an Outlook profile is cached mode, and that's not very visible to the users.  The screenshot below shows what the status bar displays when you are accessing Outlook in cached mode.

[Screenshot: the Outlook status bar when connected in cached mode]

In cached mode there is a local OST file that syncs with your online Exchange mailbox.  It’s not something you can access or open outside of Outlook.

Outlook will always use cached mode unless you modify the settings of the data file or the account to disable it.
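
You can turn it off per account by unticking "Use Cached Exchange Mode" under Account Settings, or push the change out centrally.  As a sketch only, and assuming the Outlook 2016/2019 (16.0) policy hive, the group policy for this writes a registry value along these lines:

reg add "HKCU\Software\Policies\Microsoft\Office\16.0\Outlook\Cached Mode" /v Enable /t REG_DWORD /d 0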

As you can see if you look at the data file's configuration settings, a cached OST file is not the same as a local replica, and it's not designed to be.  The purpose of the cached mail is to make Outlook more efficient by not having everything accessed on the server.

Why does this matter during a migration?  Most migration tools claim to be able to migrate directly to the server mailboxes, but in practice the speed of that migration is often unworkably slow.  If it can be achieved it's by far the most efficient route, but Exchange has its own default configuration settings that work against you, including throttling of activity and filtering/scanning of messages.  Many, if not most, migration tools do not expect to migrate "all data and attachments", which is what we are often asked to do.  If what we are aiming for is 100% data parity between a Domino mail file and an Exchange mailbox, then migrating that 5GB, 10GB or 30GB volume directly to the server isn't an option.  In addition, if a migration partially runs to the server and then fails, it's almost impossible to backfill the missing data with incremental updates.  I have worked with several migration tools testing this and just didn't have confidence in the data population directly on the server.

In sites where I have done migrations to on-premises servers, I've often found the speed of migrating to the server mailbox on the same network makes that approach impossible, so instead I've migrated to a local OST file.  The difference between migrating a 10GB file to a local OST (about an hour) vs directly to Exchange (about 2.5 days) is painfully obvious.  Putting more resources onto the migration machine didn't significantly reduce the time, and in fact each tool either crashed (running as a Domino task) or crashed (running as a Windows desktop task) when trying to write directly to Exchange.

An hour or two to migrate a Domino mail file to a local workstation OST isn't bad though, right?  That's not bad at all, and if you open Outlook you will see all the messages, folders, calendar entries, etc. displaying.  However, that's because you're looking at cached mode: you're literally looking at the data you just migrated.  Create a profile for the same user on another machine and the mail file will be empty, because at this point there is no data in Exchange, only in the local OST.  Another thing to be aware of is that there is no equivalent of an All Documents view in Outlook, so make sure your migration tool knows how to migrate unfoldered messages and your users know where to find them in their new mailbox.

Now to my next struggle.  Outlook will sync that data to Exchange.  It will take between 1 and 3 days to do so.  I have tried several tools to speed up the syncing and I would advise you not to bother.  The methods they use to populate the Exchange mailbox from a local OST file sidestep much of the standard Outlook sync behaviour, meaning information is often missing or, in one case, calendar invites were sent out for every calendar entry pushed to Exchange.  I tried five of those tools and none worked 100%; the risk of missing data or sending out duplicate calendar entries/emails was too high.  I opted in the end to stick with Outlook syncing.  Unlike Notes replication, I can only sync one OST/Outlook mailbox at a time, so it's slow going unless I have multiple client machines.  What is nice is that I can do incremental updates quickly once the initial multi-GB mailbox has synced to Exchange.

So my wrestling with the Outlook client boils down to:

  • Create mail profiles that use cached mode
  • Migrate to a local OST
  • Use Outlook to sync that to Exchange
  • Pay attention to Outlook limits, like a maximum of 500 folders*
  • Be Patient

*On migrated Domino mailboxes that pushed up against the folder or item limits, we found Outlook would repeatedly run out of system memory when trying to sync.

One good way to test whether the Exchange data matches the Domino data is to use Outlook Web Access, as that accesses data directly on the Exchange server.  Except it's not as faithful to the server data as we are used to seeing with Verse or iNotes.  In fact, OWA too decides to show you through a browser what it thinks you most need to see versus everything that's there.  Often folders will claim to be empty and that there is no data, when in fact the data is there but hasn't been refreshed by Exchange (think Updall).  There are few things more scary in OWA than an empty folder and a link suggesting you refresh from the server.  It just doesn't instill confidence in the user experience.

Finally we have Outlook mobile, or even the native iOS mail application.  That wasn't a separate configuration; unless you configure Exchange otherwise, the default is that mobile access is granted to everyone.  In one instance a couple of weeks ago, mobile access suddenly stopped working for all users who hadn't already set up their devices.  When they tried to log in they got "invalid name or password".  I eventually tracked that down to a Windows update that had changed permissions in Active Directory that Exchange needed set.  You can see reference to the issue here, and slightly differently here, although note it seems to have been an issue since Exchange 2010 and still with Exchange 2016.  I was surprised it was broken by a Windows update, but it was.

I know (and have used) many workarounds for the issues I run into but that’s not for here.  Coming from a Domino and Notes background I believe we’ve been conditioned to think in a certain way about mailfile structure, server performance, local data, and the user experience, and expecting to duplicate that exactly is always going to be troublesome.

#DominoForever
