Monday 7 December 2015

A year in the cloud with IBM (Part 1)

So ... it is very nearly a year since my last post and what a year it has been!

I have been a busy boy!

I and my team of admins have moved the entire European and Asian workforce from our on-prem Domino servers to the IBM Smartcloud servers. We elected to have a hybrid environment, keeping our many and varied apps on a couple of on-prem servers and shifting mail entirely to the IBM cloud servers (what used to be called IBM Smartcloud). We also elected, thanks to IBM UK's account team, to make the majority of our users "Full" users, provisioned with the complete menu of interesting stuff that the cloud offers: Mail (Verse), Meetings, Chat, Connections, Traveler and Archive Essentials.

Now I could say that the migration and provisioning of our users was a smooth and fault-free experience, but I can't. We ran up against some provisioning problems that reduced the migration to a crawl. These have since been addressed, and since early March this year we have had no problems at all.

The on-boarding tools available when we started did not really suit what we needed to do. I would have preferred to start the users with empty mail files in the cloud and leave the old mail files in place as archives, where they could access and manage their old mail as normal, but the user community considered that a "non runner". Neither did we want to migrate nearly a petabyte of old mail to the servers, so we reached a compromise: move 8-12 weeks of mail and calendar data from the live mail file to the cloud, leaving the old mail file as a local replica on the workspace as an archive. (Apart, that is, from the accountants. What is it about accountants that they need every mail they ever received since 1995? *sigh*) So, in the absence of a free tool that would do what I needed (I was on a very tight budget), I wrote a set of agents that would move:
  • Folders
  • Rules
  • Profiles
  • Mail
  • Calendar 
  • Todo
by date to the cloud.
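The agents themselves were plain LotusScript, but the selection logic boils down to something like this sketch (Python here just for readability; the form names and the dictionary "documents" are illustrative stand-ins, not the real Notes API):

```python
from datetime import date, timedelta

# Illustrative stand-ins for Notes documents: the real agents were
# LotusScript walking document collections, not Python dicts.
def select_for_migration(docs, weeks=12, today=date(2015, 4, 1)):
    """Pick the documents that make the trip to the cloud.

    Folders, rules and profiles always move; mail, calendar and todo
    documents move only if they fall inside the 8-12 week window.
    """
    cutoff = today - timedelta(weeks=weeks)
    always = {"Folder", "Rule", "Profile"}
    dated = {"Memo", "Appointment", "Task"}
    picked = []
    for doc in docs:
        if doc["form"] in always:
            picked.append(doc)
        elif doc["form"] in dated and doc["date"] >= cutoff:
            picked.append(doc)
    return picked
```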

This worked really quite well apart from a few gotchas, the main one being Google meeting invites: not all of them, just the ones that have "Never" as the "repeat ends" attribute. This, I discovered, creates a 10-year repeating Notes calendar entry if the user accepts it, so that daily conference call had thousands of dates in the calendar doc. That needed some serious tweaking!
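The tweak amounted to capping the repeat instances at a sane horizon before the entry went to the cloud. A minimal sketch of the idea (the one-year cap and the function name are my illustration, not the exact fix we shipped):

```python
from datetime import date, timedelta

def cap_repeat_dates(start, repeat_dates, max_days=365):
    """Trim a 'repeats forever' invite down to a sane horizon.

    An accepted Google invite with 'Never' as its repeat end gave us a
    10-year repeating entry; everything past start + max_days is dropped.
    """
    horizon = start + timedelta(days=max_days)
    return [d for d in repeat_dates if d <= horizon]
```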

We had a milestone date of April 1st to get the Asian and European workforce migrated, and with the help of our long-suffering on-boarding team and the local support folks in IBM Dublin (who can now swear almost as well as me) we managed to get the last planned user migrated on the 4th of April, which all in all was excellent. The problems we did have were 99% invisible to the users; all they saw was my team coming around warning them they would be moved sometime in the next 24 hours, and they were.

Having moved the users' mail to the cloud, we started consolidating data onto what will become our on-prem app servers. Most of these had been doubling as mail servers, and suddenly, with no mail running, they started to perform much better.

The old Quickr server was a bit of a problem. The Quickr environment was very stable and just sat in the corner and ran year after year, every now and then needing more disk space and a fixpack. Once again we had a "what to do with the data?" question. Quite a few of the places were there purely for historical purposes, so they were put on the "whenever" low-priority list. We focused on the places in fairly constant use and created a Connections community for each place.

Quickr files were dead easy:
1. Set up a Quickr Connector on my PC to the place and copied the files to a local directory
2. Piped a DIR to a text file
3. Set up a wee PHP server on my PC using XAMPP
4. Using PHP, read the file from step 2 and checked for duplicate file names (the cloud doesn't like duplicates)
5. Used the POST /files/{auth}/cmis/repository/{repositoryId}/folderc/snx:files API to upload the file
6. Then used the API to TAG the file with the old folder structure name
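Steps 4 and 5 looked roughly like this (in Python rather than the PHP I actually used; `unique_name`, `upload_url` and the base URL shape are my illustration, and auth is left out entirely):

```python
import os
import posixpath

def unique_name(name, seen):
    """Return a file name that hasn't been used yet.

    The cloud rejects duplicate file names, so clashes get a numeric
    suffix: report.pdf, report_2.pdf, report_3.pdf, ...
    """
    if name not in seen:
        seen.add(name)
        return name
    stem, ext = os.path.splitext(name)
    n = 2
    while f"{stem}_{n}{ext}" in seen:
        n += 1
    new = f"{stem}_{n}{ext}"
    seen.add(new)
    return new

def upload_url(base, repository_id):
    # The CMIS folder-children collection from step 5
    return posixpath.join(base, "cmis/repository", repository_id,
                          "folderc/snx:files")
```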

Job Done. 
Quickr docs were more problematic: they required a placebot to dump the docs to text files, which were then uploaded using the Wiki POST API.
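The Wiki POST API takes an Atom entry, so each dumped text file becomes a payload along these lines (a minimal sketch; the feed URL and auth are omitted, and the exact category term is from memory, so check the API docs before leaning on it):

```python
from xml.sax.saxutils import escape

def wiki_page_entry(title, body_text):
    """Build a minimal Atom entry for posting one dumped Quickr doc
    as a wiki page; title and body are escaped for XML."""
    return (
        '<entry xmlns="http://www.w3.org/2005/Atom">'
        f'<title type="text">{escape(title)}</title>'
        '<category scheme="http://www.ibm.com/xmlns/prod/sn/type" term="page"/>'
        f'<content type="text">{escape(body_text)}</content>'
        '</entry>'
    )
```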

Once a community had been populated, I added the Quickr place managers to the community and showed them how to work it. Once they had cried about the lack of folder nesting and seen how fast tags can be searched for, they sucked up their tears and got on with it, and they have been using their communities in anger for some months.


One thing became clear very quickly: the lack of a mail-in function was a bit of a bollox for the Quickr place managers. I have something in test that uses an on-prem mail-in DB with an agent that detaches any attachments and posts them to a given community as FILES, with the MIME text becoming a BLOG entry carrying a link to the FILE (if any). The blog post is posted as a cloud user called "AVX Auto-Post", the original sender's internet address becomes a mention pre-pended to the body, and the subject (minus the FWD and RE prefixes) becomes the subject of the blog.
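The heart of that agent is MIME surgery: strip the RE/FWD prefixes off the subject, pull out the attachments, and pre-pend the sender as a mention. A sketch of the same idea (the real thing runs Domino-side; the function name here is mine):

```python
import re
from email import message_from_string

# Strip any run of leading RE:/FW:/FWD: prefixes from a subject line
PREFIXES = re.compile(r'^\s*((re|fw|fwd)\s*:\s*)+', re.IGNORECASE)

def split_for_blog(raw_mail):
    """Break an inbound mail into blog subject, body and attachments.

    The sender's address becomes a mention pre-pended to the body;
    attachments come back as (filename, bytes) pairs for the Files API.
    """
    msg = message_from_string(raw_mail)
    subject = PREFIXES.sub('', msg.get('Subject', '')).strip()
    sender = msg.get('From', '')
    body_parts, attachments = [], []
    for part in msg.walk():
        if part.get_filename():
            attachments.append((part.get_filename(),
                                part.get_payload(decode=True)))
        elif part.get_content_type() == 'text/plain':
            body_parts.append(part.get_payload(decode=True)
                              .decode('utf-8', 'replace'))
    body = f"@{sender}\n" + "\n".join(body_parts)
    return subject, body, attachments
```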

We will be using the same process to post updates from the "internet of things-that-go-beep" on the shop floor to communities of engineers and manufacturing managers, so they can be notified promptly about issues and discuss them in the cosy shared confines of a community rather than in 101 emails. We have done a POC and got a rather nice "Wooo! That's good!", which always does the soul good.


The other thing that has been tiring but fun is introducing my users to Connections, but that is enough for now. I'll tell you all about that in the next post.
