I just ran across a term that is new to me, though it seems to have been around for a while; I’m seeing references from 2014. The term “fog” refers to a hybrid of cloud computing and local data center computing.

“Cloud” computing has traditionally referred to moving computing and storage from my local computer to a big company’s data centers; companies like Google and Amazon are known for their cloud services. Cloud works nicely for mobile computing, for small enterprise-level things, and for the “Internet of Things”. It is not so nice for a lot of things, though, especially when ‘cost’ is one of the criteria for determining ‘nice’.

When you can buy a computer and use it for 5-7 years for just 7 months’ fees of renting from a cloud provider, it seems strange that anyone would use the cloud at all. Maybe it’s that you can set it and forget it, letting someone else determine your patching schedule? Maybe it’s that you can spin down systems that are seldom used, for example development servers for a stable app? So, what’s the alternative to the cloud?
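To make that comparison concrete, here is a minimal break-even sketch. The $3,500 purchase price and $500/month rental fee are hypothetical placeholders, not quotes from any provider:

```python
# Rough break-even sketch: buying a server outright vs. renting a
# cloud instance. All prices here are hypothetical placeholders.

def breakeven_months(purchase_price, monthly_cloud_fee):
    """Months of cloud rental that add up to the one-time purchase price."""
    return purchase_price / monthly_cloud_fee

# e.g. a $3,500 server vs. a $500/month cloud instance
months = breakeven_months(3500, 500)
print(months)  # prints 7.0
```

Past the break-even point, every additional month of cloud rental is money you would not have spent on owned hardware (ignoring power, cooling, and admin time, which the next paragraph gets to).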

As mentioned above, you can buy a computer, rent space in a data center, buy bandwidth, and pay for electricity, cooling, etc. For a small shop, that’s expensive. For a large shop, the per-computer expense falls in line with ‘it’s all about quantity’.

There are things that don’t work well in the local data center. Being rapid to market is difficult: it takes weeks to have a computer delivered and racked, but seconds to create the equivalent in the cloud.

Introducing “fog computing”. You need to be fast to market? Use the cloud. You have a system running in the cloud and want to cut back on costs? Order your computer, and when it’s racked, move the system to the local data center. Continuing the example, you want to dynamically launch many small application servers connecting to a large database in the most cost-effective way possible? Maybe put the application servers in the cloud and the database server in the local data center. You’ve just been introduced to fog computing.

It’s funny how catchy names get the attention of decision makers. Many companies have rushed to the cloud just because the CEO read something about it in a magazine. So now reasonableness has a catchy name: if it works best in the cloud, put it in the cloud; if it works best in the local data center, put it in the local data center… and it’s all in the fog! Google it!

Keywords: cloud, fog, data center, hybrid, storage, marketing, ceo

Troy Frericks.
blog 28Sep-2016
Copyright 2015-2016 by Troy Frericks,



I’ve recommended against the use of Transparent Data Encryption (TDE) database container encryption (SQL Server & Oracle) in the past; see my earlier blog posts. TDE flat out provides no protection while the database is running, and we all strive for 24×365 uptime. Given the option, any bad guy is going to steal your data through the front door rather than shut down your database and raise the ‘something is wrong’ alert. Of course, follow normal procedures for encrypting database backups that will be moved off-site, and for destroying or wiping abandoned storage.
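As a sketch of that last point: SQL Server 2014 and later offer native backup encryption, which protects an off-site backup file independently of TDE. The database name, path, and certificate name below are hypothetical; the certificate must already exist in master and its own backup must be kept safe:

```sql
-- Encrypt the backup file itself (not the running database) using a
-- server certificate. 'MyDb', the path, and 'BackupCert' are
-- placeholders for this sketch.
BACKUP DATABASE MyDb
TO DISK = N'X:\offsite\MyDb.bak'
WITH COMPRESSION,
     ENCRYPTION (ALGORITHM = AES_256, SERVER CERTIFICATE = BackupCert);
```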

Adding a database to an existing availability group is another of the hassles that TDE causes for the very little (or no) benefit it provides. Specifically, with SQL Server 2014 and prior, one cannot use the standard SQL Server Management Studio (SSMS) wizard to add a TDE-encrypted database to an AlwaysOn Availability Group (AAG).

In my situation, I was lucky: we chose to encrypt the second database with the same key as the first. The first database was already in the AAG, and the AAG was functioning perfectly. The problem was that the wizard had the second database greyed out… as if it were not a candidate for AAGs.

So, here’s what I did to add the TDE-encrypted database. (Remember to test your interpretation of these instructions on a non-production AAG.)

  1. Ensure that your first database is TDE encrypted and that the AlwaysOn Availability Group is functioning (synchronizing) properly. This means the TDE keys have been placed on all nodes (i.e., replicas/instances) of the AAG.
  2. Ensure that the current ‘primary’ node is the same node that holds the second database. (There is a ‘no data loss’ method of manually failing over an AAG.)
  3. Ensure the second database is TDE encrypted with the same key as the first database. Since the first database is functioning in the AAG, the keys that already exist on all the AAG nodes are the same keys needed for the second database.
  4. On the primary node of the AAG…
    1. Ensure the second database is using the FULL recovery model.
    2. Use the backup wizard to take a full backup of the second database.
    3. Use the backup wizard to take a transaction log (tlog) backup of the second database.
    4. Add the second database to the AAG with the following T-SQL command… ALTER AVAILABILITY GROUP aag_name ADD DATABASE database_name;
  5. On each of the secondary nodes of the AAG…
    1. Use the restore wizard to restore the full backup of the second database. ENSURE ‘NORECOVERY’ is specified.
    2. Use the restore wizard to restore the tlog backup of the second database. ENSURE ‘NORECOVERY’ is specified.
    3. Join the second database to the AAG with the following T-SQL command… ALTER DATABASE database_name SET HADR AVAILABILITY GROUP = aag_name;
  6. Ensure that the secondary nodes are synchronizing (after ‘refreshing’ in SSMS). Test a manual failover.
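The steps above can be sketched as the following T-SQL. The database name MyDb and the backup share path are hypothetical; aag_name comes from the commands above. Adjust names and paths for your environment, and test on a non-production AAG first:

```sql
-- Run on the current PRIMARY replica (steps 4.1 - 4.4):
ALTER DATABASE MyDb SET RECOVERY FULL;                            -- step 4.1

BACKUP DATABASE MyDb TO DISK = N'\\backupserver\share\MyDb.bak';  -- step 4.2
BACKUP LOG      MyDb TO DISK = N'\\backupserver\share\MyDb.trn';  -- step 4.3

ALTER AVAILABILITY GROUP aag_name ADD DATABASE MyDb;              -- step 4.4

-- Run on EACH SECONDARY replica (steps 5.1 - 5.3):
RESTORE DATABASE MyDb FROM DISK = N'\\backupserver\share\MyDb.bak'
    WITH NORECOVERY;                                              -- step 5.1
RESTORE LOG      MyDb FROM DISK = N'\\backupserver\share\MyDb.trn'
    WITH NORECOVERY;                                              -- step 5.2

ALTER DATABASE MyDb SET HADR AVAILABILITY GROUP = aag_name;       -- step 5.3
```

With the same TDE key already present on every replica (step 3), the joined database should begin synchronizing without the wizard.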

It’s far simpler not to have to deal with TDE in an AAG at all… and it’s especially frustrating given that TDE offers no real security. (Yea, the ‘dead horse’ thing!)

Keywords: Always On Availability Group, AlwaysOn Availability Group

Troy Frericks.
blog 01-Jun-2016
Copyright 2015-2016 by Troy Frericks,