Language Packs, Verse and A New App Dev Pack, Someone Has Had A Busy Week..

Well a bit more than a week.

This week the G1 language packs for Notes 10.0.1, covering French, German, Japanese, Italian, Brazilian Portuguese, Chinese and Korean, were made available.  If you are having a bit of déjà vu, that's because these language packs were already released once and very quickly withdrawn when it was discovered there were considerable problems with the way the translations had been done.  To their credit, HCL withdrew the products almost immediately when they were told of the issues and have been working to redo and re-release them all.

So why were the bad versions released at all?  This goes back to the transfer from IBM to HCL.  In the IBM days there was a large team entirely responsible for product translation, but they weren't part of the collaboration development team; they were a general IBM product translation team.  When HCL took over the products they didn't inherit that team, which meant they also didn't inherit any of the knowledge that team had about the quirks and challenges of doing the Notes translations.  HCL went ahead with having the translations done without realising the verification hurdles involved.  None of that is great, but in my opinion it shows commitment and intent that they withdrew the products almost immediately and then made redoing the translations correctly their highest priority.  They have also committed to a day 1 release of the G1 languages alongside English in future versions.

So we had a stumble, but one that was publicly acknowledged, explained and fixed quickly.  I can't expect more than that.

Last week saw the release of v1.0.1 of the App Dev Pack for Domino, which includes the Node.js integration features and can now be deployed under Windows as well as Linux.

The new IAM (Identity and Access Management) service provides OAuth authentication for applications running outside Domino that need access to Domino resources.  By installing IAM you can authorise it to use LDAP against Domino or Active Directory as its IdP (identity provider) for authenticating users.  There are a few steps in setting up IAM, including setting up secure LDAPS in Domino or Active Directory, so I'll be covering that in more detail in its own blog post.
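
Since IAM issues standard OAuth2 tokens, the shape of an external application's token request and authenticated call can be sketched as below. The endpoint URL, credentials and scope name are placeholders of my own, not HCL's actual values, and I'm showing the client-credentials grant for brevity; check the App Dev Pack documentation for the real flows.

```python
from urllib.parse import urlencode

# Placeholder endpoint -- the real path comes from your IAM installation.
IAM_TOKEN_URL = "https://iam.example.com/oauth/token"

def build_token_request(client_id, client_secret, scope):
    """Form body for a standard OAuth2 client-credentials token request."""
    return urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": scope,
    })

def bearer_header(access_token):
    """Every subsequent call to a protected Domino resource carries the token."""
    return {"Authorization": f"Bearer {access_token}"}
```

The point is simply that anything outside Domino authenticates to IAM once, then presents the resulting bearer token on each request to a Domino resource.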

More on the App Dev Pack update here, and on IAM specifically here.

Last week we also got an update to Verse on Premises (v1.0.7), which I have rolled out for a few customers so far.  If you already have Verse installed, the deployment is very easy (just make sure you back up your Plugins folder before deleting the old files).  Here is a list of new features, including some significant calendar enhancements and work towards providing the Verse UI on mobile browsers where it's not appropriate to use the Verse app.

Lastly, I heard very good things about the Connections workshop (jam) in Switzerland this week, with the product team working to brainstorm ideas on wanted features for Connections.  I will be attending the London workshop next week and look forward to hearing more.

The Painful Journey To Abandoning iCloud

As some of you know I'm very committed to the Mac ecosystem.  I have Mac laptops, an iPad with over 4000 books, an iPhone (not the latest, because who needs that), a watch, four Apple TVs, etc.  I'm also extremely risk averse and cloud wary.  I gave in and let Apple put all my books in the cloud just because iTunes sucks for syncing and cloud syncing worked across all my devices.  However, I also had a lengthy open support call last year with Apple, wanting to know where my books were now stored on my Mac so I could find them and back them up:

“they are all in the cloud”

“yes I get that but they are also on my laptop so where are they”

“no they are only in the cloud”

“well that’s not true because here I go, switching off wifi and hey I can still read my books in iBooks so they are here somewhere”

>> pause for several weeks whilst this is escalated >>

“they are on your Macbook but stored in a way you can’t find them or access them”

(please no advice on this one, I found my own workaround to find them and backup un-DRM copies)

So… iCloud.  I agreed about 18 months ago to let my Documents and Desktop folders sync to iCloud.  My only reason for that was so that I could get at files if I needed to on my iPad or by logging into any browser, but tbh I rarely used it.  Still, it worked and seemed a decent idea.

Then one Saturday about two weeks ago it all went horribly wrong…

I was sat working when I got an alert saying FaceTime had been added to my watch.  Which was odd.  My watch is 18 months old, was on my wrist and nothing had changed.  The watch itself had no alert.  So off I go digging, and under my account and devices I find a list of my current watch and an old watch I wiped and sold to a friend 18 months ago to give to his wife.  Still odd, but no big deal.  It hadn't done anything, so clearly just an odd gremlin.  Just in case, I removed that old watch from my devices.

Then I got alerts saying my credit cards had been removed from my watch.  Except they hadn’t been removed from the watch on my wrist and the other watch was flattened before I handed it over 18 months ago.

I did some research, found nothing nefarious and let it go.  I did notice I had been logged out of all my Apple accounts on all my devices, and things like Sonos had to be re-authorised.  Weird and annoying, but a side effect of whatever happened, I assume.**

Then a few days later I restarted my laptop.  I probably only restart it every two weeks, so this was the first time since that alert.  The laptop restarted, but Finder and anything that uses Finder, like Spotlight or even Terminal, was entirely unresponsive.  They would briefly work long enough for me to type two characters or click on a folder, then there would be a spinning ball for about 25 seconds before they would respond.  That gradually got slower and slower over a few minutes.  So off we go to research, because I now have a broken laptop.

After several hours' research I found this article, which gave a bit of a clue as it pointed to a cloud corruption problem: http://osxdaily.com/2015/04/17/fix-slow-folder-populating-cloudkit-macosx/

Unlike some of the other Finder troubles, the Finder process usually doesn’t eat much CPU or crash repeatedly, it’s just inordinately slow when loading folder views, populating files, and opening folders.

So I followed the instructions, deleted the files they specify, and immediately my laptop was more responsive.  OK… well, that was a scary afternoon, and I'll just go ahead and disable cloud syncing so that never happens again.

Did you know Apple doesn't let you do that?  If you disable cloud syncing for Documents and Desktop it actually deletes the contents of those folders, keeping the files in the cloud for 30 days in case you want them back.  So that's dumb.  I decided to move the contents of both folders to temporary folders, disable cloud syncing, then move them back, but my laptop was working and I was busy so I parked that for later.

Later… about a week later, the Finder sluggishness came back, but this time I knew how to fix it.  Once it was fixed I went ahead and moved the contents of both Documents and Desktop to temporary folders, disabled cloud syncing and moved them back.  My laptop immediately started working, Finder was faster than it had been for a very long time, and I've had no more problems.

Now I wonder if that first alert about the non-existent “watch” was a precursor to some cloud corruption on my account.  That cloud corruption caused all the authentication for my account to be lost and also corrupted the authentication for my cloud data, which only tried to reconnect when I signed back into the OS.

** For anyone wondering if I asked an Apple “genius” about this: yes I did.  No, they had no clue what I was talking about, since most of them are “iPhone experts” in store now, and the one who called me back seemed to think I made it up.

Lesson learned. Apple iCloud for all but my books is now disabled.

Think-Ing From Far Away Pt1 – Community

Today’s podcast of Think-ing from far away features guests Libby Ingrasia from IBM, Rob Novak from Snapps, John Paganetti from HCL and Femke Goedhart from panagenda alongside Julian Robichaux, Theo Heselmans and myself.  We discuss the ICS Community, what community events are happening at Think and how to play along from home.

Monday’s podcast is here http://www.nsftools.com/tffa/TFFA_1.mp3

In the podcast we mention hashtags, blogs and Twitter accounts that those of us who aren't at Think should keep an eye on this week, and I wanted to summarise some of those here.

Watch the sessions live stream, including the Chairman's address on Tuesday, here: https://www.ibm.com/events/think/watch/

Tomorrow our podcast will be talking about some specific sessions we hope to hear news from this week and the blogs of the people giving them.

Twitter

@IBM

@IBMSocialBiz

@IBMChampions

#Think2019

#IBMThink2019

@HCL_CollabDev

@IBMLive

@planetlotus 

IBM Champions – All (Twitter list)

Blogs

PlanetLotus http://planetlotus.org

IBM Collaboration Solutions Blog https://www.ibm.com/blogs/collaboration-solutions/

HCL Collaboration Workflow Platforms https://www.cwpcollaboration.com/blogs

Aha! Domino Ideas Lab https://domino.ideas.aha.io

Aha! Connections Ideas Lab https://connections.ideas.aha.io

Collaboration Today https://collaborationtoday.info

Other In-Person Events Already Announced For 2019

https://engage.ug

https://collabsphere.org

https://admincamp.de

https://dnug.de

https://isbg.no

https://socialconnections.info

Domino – Exchange On Premises Migration Pt2: Wrestling the Outlook Client

In part 1 of my blog about migrating from Domino to Exchange on premises, I talked about the challenges of working with Exchange for someone who is used to working with Domino.  If only that were all of it, but now I want to talk about the issues around Outlook and the other Exchange client options that require those of us used to working with Domino to change our thinking.

In Domino we are used to the mail file being on the server, and regardless of whether we use Notes or a browser to access it, the data is the same.  Unless we are using a local replica, but that use is very clear when we are in the database, as it visibly shows “on Local” vs the server name.

We can also easily swap between the local and server replicas and even have both open at the same time.

In Outlook you only have the option to open a mailbox in either online or cached mode.

So let's talk about cached mode, because that's the root of our migration pains.  You must have a mail profile defined in Windows in order to run Outlook.  The default setting for an Outlook profile is “cached mode”, and that's not very visible to users.  The screenshot below shows what the status bar displays when you are accessing Outlook in cached mode.

[Screenshot: connectedtoexchange]

In cached mode there is a local OST file that syncs with your online Exchange mailbox.  It’s not something you can access or open outside of Outlook.

[Screenshot: datafiles]

Outlook will always use cached mode unless you modify the settings of the data file or the account to disable it.

[Screenshot: cachedsettings]

As you can see from the configuration settings below, a cached OST file is not the same as a local replica, and it's not designed to be.  The purpose of the cache is to make Outlook more efficient by not having everything accessed on the server.

[Screenshot: cachedoffline]

Why does this matter during a migration?  Most migration tools claim to be able to migrate directly to the server mailboxes, but in practice the speed of that migration is often unworkably slow.  If it can be achieved it's by far the most efficient approach, but Exchange has its own default configuration settings that work against you doing that, including activity throttling and filtering/scanning of messages.  Many, if not most, migration tools do not expect to migrate “all data and attachments”, which is what we are often asked to do.  If what we are aiming for is 100% data parity between a Domino mail file and an Exchange mailbox, then migrating that 5GB, 10GB or 30GB volume directly to the server isn't an option.  In addition, if a migration partially runs to the server and then fails, it's almost impossible to backfill the missing data with incremental updates.  I have worked with several migration tools testing this and just didn't have confidence in the data population directly on the server.

In sites where I have done migrations to on premises servers, I've often found the speed of migrating to the server mailbox, even on the same network, makes direct migration unworkable, so instead I've migrated to a local OST file.  The difference between migrating a 10GB file to a local OST (about an hour) vs directly to Exchange (about 2.5 days) is painfully obvious.  Putting more resources onto the migration machine didn't significantly reduce the time, and in fact each tool either crashed (running as a Domino task) or crashed (running as a Windows desktop task) when trying to write directly to Exchange.

An hour or two to migrate a Domino mail file to a local workstation OST isn't bad though, right?  That's not bad at all, and if you open Outlook you will see all the messages, folders, calendar entries, etc. displaying.  However, that's because you're looking at cached mode: you're literally looking at the data you just migrated.  Create a profile for the same user on another machine and the mail file will be empty, because at this point there is no data in Exchange, only in the local OST.  Another thing to be aware of is that there is no equivalent of an All Documents view in Outlook, so make sure your migration tool knows how to migrate unfoldered messages and your users know where to find them in their new mailbox.
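
Since there's no All Documents equivalent, a useful pre-migration check is to diff the full document set against folder membership so unfoldered messages get an explicit home.  A minimal sketch (the data shapes here are hypothetical, not any real migration tool's API):

```python
def find_unfoldered(all_message_ids, folder_contents):
    """Return IDs of messages that appear in no folder.

    all_message_ids: every message in the source mail file (the Domino
    'All Documents' population).  folder_contents: dict of folder name ->
    list of message IDs.  Anything left over has no folder and needs an
    explicit destination (e.g. the Inbox or an 'Unfiled' folder) after
    migration, or your users will never see it.
    """
    foldered = set()
    for ids in folder_contents.values():
        foldered.update(ids)
    return sorted(set(all_message_ids) - foldered)
```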

Now to my next struggle.  Outlook will sync that data to Exchange, and it will take between one and three days to do so.  I have tried several tools to speed up the syncing and I would advise you not to bother.  The methods they use to populate the Exchange mailbox from a local OST file sidestep much of the standard Outlook sync behaviour, meaning information is often missing or, in one case, calendar invites were sent out for every calendar entry pushed to Exchange.  I tried five of those tools and none worked 100%; the risk of missing data or sending out duplicate calendar entries/emails was too high, so I opted in the end to stick with Outlook syncing.  Unlike Notes replication, I can only sync one OST/Outlook mailbox at a time, so it's slow going unless I have multiple client machines.  What is nice is that I can do incremental updates quickly once the initial multi-GB mailbox has synced to Exchange.
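
Because each Outlook instance can only sync one OST at a time, the only way to parallelise is across machines.  A trivial planning sketch for spreading mailboxes over whatever migration workstations you have (the names are made up):

```python
def assign_mailboxes(mailboxes, workstations):
    """Round-robin mailboxes across migration workstations.

    Each workstation runs one Outlook profile/OST sync at a time, so the
    total wall-clock time is roughly the longest queue, not the sum.
    """
    plan = {w: [] for w in workstations}
    for i, mailbox in enumerate(mailboxes):
        plan[workstations[i % len(workstations)]].append(mailbox)
    return plan
```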

So my wrestling with the Outlook client boils down to:

  • Create mail profiles that use cached mode
  • Migrate to a local OST
  • Use Outlook to sync that to Exchange
  • Pay attention to Outlook limits, like a maximum of 500 folders*
  • Be Patient

*On migrated Domino mailboxes that pushed up against the folder or item limits, we found Outlook would repeatedly run out of system memory when trying to sync.
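
Given how badly over-limit mailboxes behave, it's worth scripting a pre-flight check of the source mail files before migrating anything.  A sketch using the 500-folder limit above; the per-folder item threshold is my own assumption and should be tuned for your environment:

```python
MAX_FOLDERS = 500            # Outlook folder limit noted above
MAX_FOLDER_ITEMS = 100_000   # assumed threshold, not an official limit

def preflight_warnings(mailbox, folder_counts):
    """Flag a mailbox likely to choke Outlook's OST sync.

    folder_counts: dict of folder name -> item count, gathered from the
    source Domino mail file by whatever reporting you have available.
    """
    warnings = []
    if len(folder_counts) > MAX_FOLDERS:
        warnings.append(
            f"{mailbox}: {len(folder_counts)} folders exceeds the "
            f"{MAX_FOLDERS}-folder limit")
    for folder, items in folder_counts.items():
        if items > MAX_FOLDER_ITEMS:
            warnings.append(f"{mailbox}: folder '{folder}' has {items} items")
    return warnings
```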

One good way to test whether the Exchange data matches the Domino data is to use Outlook Web Access, as that accesses data directly on the Exchange server.  Except it's not as faithful to the server data as we are used to seeing with Verse or iNotes.  In fact, OWA too decides to show you through a browser what it thinks you most need to see versus everything that's there.  Often folders will claim to be empty when in fact the data is there but hasn't been refreshed by Exchange (think Updall).  There are few things more scary in OWA than an empty folder and a link suggesting you refresh from the server.  It just doesn't instil confidence in the user experience.
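
A more mechanical check than eyeballing OWA is to export per-folder item counts from both sides and diff them.  A sketch of the comparison itself (how you pull the counts out of Domino and Exchange is left to your tooling):

```python
def compare_folder_counts(domino_counts, exchange_counts):
    """Return folders whose item counts differ between source and target.

    Each argument is a dict of folder name -> item count.  Folders missing
    entirely on the Exchange side are reported with a target count of 0.
    """
    mismatches = {}
    for folder, source_n in domino_counts.items():
        target_n = exchange_counts.get(folder, 0)
        if source_n != target_n:
            mismatches[folder] = (source_n, target_n)
    return mismatches
```

An empty result doesn't prove every item migrated intact, but a non-empty one tells you exactly which folders to re-sync before users ever see them.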

Finally, we have Outlook mobile, or even the native iOS mail application.  That isn't a separate configuration: unless you configure Exchange otherwise, the default is that mobile access is granted to everyone.  In one instance a couple of weeks ago, mobile access suddenly stopped working for all users who hadn't already set up their devices; when they tried to log in they got “invalid name or password”.  I eventually tracked that down to a Windows update that had changed permissions in Active Directory that Exchange needed set.  You can see reference to the issue here, and slightly differently here, although note it seems to have been an issue since Exchange 2010 and still with Exchange 2016.  I was surprised it was broken by a Windows update, but it was.

I know (and have used) many workarounds for the issues I run into, but that's not for here.  Coming from a Domino and Notes background, I believe we've been conditioned to think in a certain way about mail file structure, server performance, local data and the user experience, and expecting to duplicate that exactly is always going to be troublesome.

#DominoForever

Whooomf – All Change. HCL Buys The Shop…

According to this Press Release, as of mid June 2019 HCL take ownership of a bunch of IBM products, including Notes, Domino and Connections on premises.  Since late 2017 there has been a partnership with IBM on some of the products, such as Notes, Domino, Traveler and Sametime*, so this will take IBM out of the picture entirely.  Here are my first “oh hey, it's 4am” thoughts on why that's not entirely surprising or unwelcome news…

HCL are all about leading with on premises, not cloud.  The purchase of Connections is for on premises, and there are thousands of customers who want to stay on premises.  Every other provider is either entirely cloud already or pushing their on premises customers towards it by starving their products of development and support (waves at Microsoft).  *cough*revenue stream*cough*

HCL have shown in 2018 that they can innovate (Domino's TCO offerings, Notes on the iPad, Node integration, etc.), develop quickly and deliver on their promises.  That's been a refreshing change.

They must be pleased with how the partnership products have done to buy them, and more, outright.

When HCL started the partnership with IBM they brought on some of the best of the original IBM Collaboration development team and have continued to recruit at high speed. It was a smart move and one I hope they repeat across not just development but support and marketing too.

HCL already showed with “Places” that they have ideas for how collaboration tools could work (see this concept video https://youtu.be/CJNLmBkyvMo) and that’s good news for Connections customers who gain a large team and become part of a bigger collaboration story in a company that “gets it”.

Throughout 2018 HCL have made efforts to reach out repeatedly to customers and Business Partners, asking for our feedback and finding out what we want. From sponsoring user group events (and turning up in droves) around the world to hosting the factory tour in June at their offices in Chelmsford where we had two days of time with the developers and their upcoming technologies. I believe they have proven they understand what this community is about and how much value comes from listening and – yes – collaborating.

Tonight I am more optimistic for the future of these products, and especially Connections, than I have been in a while.  HCL, in my experience, behave more like a software start-up than anything else: moving fast, changing direction if necessary and always trying to lead by innovating.  I hope the many incredibly smart people at IBM (yes, YOU) who have stood alongside these products for years land at HCL if that's what they want; it would be a huge loss if they don't.

*HCL have confirmed that Sametime is included

Perfect10 – Building A Test Lab

In the 10th edition of my Perfect 10 webcast I explain how and why to build a test lab so you can get to deploying those products you downloaded today.  You did download them, right?

I was asked recently to share the slides for all these Perfect10 presentations and to be honest I hadn’t thought of it but it’s a good idea so I’ll be sharing them all this week.

Next up: Let’s Talk Domino v10 & Admin

11 mins