This is the second blog post discussing the change from a .local Domain to another namespace in order to better support the Mac clients in our Domain.

Planning for the New Domain

The initial decision to move forward with the Domain change was easy compared to what came next.  The challenge was deciding how and when to do it, given our people resources, major church-wide events, and the launch of our 3rd campus.  In the life of Clear Creek, there are no truly “slow” times in which a project like this would be easy.  We also had to weigh how to get the most out of our volunteer resources and when to ask Solerant (our IT contractor) for help.  This is important because, while I have some technical skills, I’m really more of a project or resource manager.

  • We decided to deploy on October 9 & 10 because we could start after church services on Sunday and several of our volunteer team could also be present to help on Monday because they had Columbus Day off.
  • We also decided that migrating the Microsoft Exchange Server (e-mail) was probably going to be the most challenging part of the migration and pose the most problems.  I thought Solerant would be best suited to do this because they already handle many Microsoft Exchange upgrades each year and could handle the added complications that come with a cross-domain migration of Exchange.
  • I polled the staff to see if making a huge change like this would seriously impair ministry.  I did my best to explain that everything was going to change, but I don’t know if there is really a way to convey just how much.  I tried to think through the worst-case implications in order to best prepare them.

Once I made the case with my boss and laid out the basic plan we moved forward.  The team normally meets on Monday nights, but over the next few weeks we also met an additional night in order to account for everything that would need to be done.  Sunday the 9th would be the day that the most manpower was needed, so I wanted us to be ready to roll since that was going to be a long day.

Thankfully, our infrastructure includes an EqualLogic SAN and a virtual environment running VMware.  Without that environment, it would have cost a whole lot more to change our domain at the size we are now.  I think we had to add five virtual machines to complete the migration.

Creating a New Domain

The first thing we did was create a new Domain.  We “spun up” a new server in our virtual environment and made it the new Domain Controller.  We then needed to create a “trust” between the new domain and the existing domain.  In order to do that, we had to remove the .org forward lookup zone from the existing domain’s DNS and set up forwarders to the new Domain’s server.

Extending the Active Directory Schema

This time around, we decided that the Magic Triangle was perhaps causing too much delay for the Macs to find resources, and that extending the Schema in AD might be a better option.  We used Apple’s white paper to change the 40 attributes necessary to support Macs in the Domain.  To me that sounds like a lot, but I now understand it isn’t much in the big-picture world of a Domain.

Valuable links: Apple’s KB article and a good blog post (note that the Dynamic UID problem no longer applies on a recently patched OS X), plus a guide on how to modify the schema, and the thing we still need to do to apply WGM policies to computer lists.

Mimic Necessary Foundation Services

So now we had a domain consisting of a single machine that could talk to the other domain, and we could find resources between them.  Next we began making duplicates of the resources that we would not migrate:

  • We created a certificate authority with Web certificate component
  • We set up a Network Policy Server (NPS) to prepare the way for moving our existing Wi-Fi system.
  • We cloned the Print Server and added it to the new Domain, which saved a huge amount of time adding drivers and creating shares.
  • Evaluated/Exported all of our Group Policy Objects (GPO) to determine which ones needed to come over, which ones weren’t working, and how we could simplify.
  • Evaluated our Active Directory structure since we had a lot of legacy structure that was not clear, possibly structure from Small Business Server a long time ago.
  • Decided that we would eliminate whatever XP support we could and move the rest of our machines to Windows 7 where possible.  We are currently down to three Windows XP machines, with plans to retire them, and five Vista computers that we will be able to re-image very soon.
  • Created a new Windows 7 VM, and installed Microsoft’s Active Directory Migration Tool (ADMT) on another virtual server joined to the new domain.
  • Decided we would migrate Microsoft Exchange as the last step, since Solerant was unable to do it beforehand and I only had a short window when my volunteers could help.  In hindsight, it would probably have been best to migrate Exchange FIRST, so that we could migrate the users with their Exchange attributes.
  • The week before the migration, we wiped our physical Domain controller and loaded it with Server 2008 R2 to match all of our other servers.
  • The morning of our migration, we demoted another Domain controller in our .local domain so that we could keep the new Domain pointing to similar DNS addresses and avoid problems with other devices.
  • We made sure we had a second virtual Domain controller on our other Virtual Host server so that we would be prepared for any power issues.  We agreed that it would not be a good idea to move a domain controller from one domain to another; in my experience, those things never seem to demote properly.  This added a little extra work re-creating the DHCP server, its options, and its subnets.
  • Joined some test machines as clean, new clients to the new domain, including some Macs: a Lion Mini, a Snow Leopard Mini, and a Lion Server Mini.  We wanted to get things right with Lion from the start.

What would the end user experience during migration?

Since I could not quickly find information online about what a user should expect when all of this occurred, I put myself in their shoes in order to make this go as smoothly as possible for them.  My team was busy working on the major aspects of the migration, but they really aren’t wired to think about how the regular person, who just wants their computer to do its job, might experience EVERYTHING changing.  I started with what I knew: when you add a computer to a domain, there is no user profile.  My assumption became that if you took the same computer from one domain and moved it to another, users would likely get a brand-new profile and not see the same desktop they saw the last time they logged in.

  • For most users, this would mean they would essentially have nothing in “My Documents,” none of their personal settings, and nothing they kept on their desktop.
  • Laptops and Macs were going to need extra attention.  The people who use those computers have very specific needs, so more data lives on their local machines, and we thought it would be important to migrate their user data.  We needed tools or scripts for both platforms; we chose ProfWiz for Windows machines and used these magic commands for Macs.
  • I also knew that because of email, they would essentially be operating as two users: one account for email in the existing domain and another account in the new domain.  I also realized that the regular user was not going to see the distinction.
  • How do I communicate with everyone in a way that doesn’t alienate or frustrate?  I decided to prepare people via staff-wide email and personal conversations to make sure things were clear to key people.  Then I picked a person in each ministry area whom I could update with the latest information should email be down or any major problems arise.  I could call or text them where to find instructions to forward to the people in their area.

How I prepared everyone for the migration

  • I sent out emails to staff asking them to make sure all of their documents were living on our network drives.  We’ve done a pretty good job of having users understand where to store their data.  We chose not to redirect profiles, since many users benefit from putting some disposable data on their local computer.
  • People also forget the small things like how to set a default printer or how to export and import their browsing Favorites.  I sent instructions for both in emails ahead of the migration and made sure key people knew how to do those things or print them out so they could help each other.
  • I also sent out instructions reminding users how to report problems by using Solerant’s ticket system or by leaving me a voicemail message.
  • I made sure that laptop users verified their data was on the network, and for Mac users we made sure they had a full backup in case we hit a problem and could not help them as quickly as we would like.  I’m all for having a backup of your backup.
  • The users also needed to know that they would need to use Webmail (OWA) or email from their cell phones on Monday, until we completed the migration of mail.
  • I made sure I thought through all the “what ifs” of things failing.  My goal?  Have a backup plan to support the end users so they can get their jobs done no matter what.  We can’t afford for them to be unable to work, nor can we afford to lose credibility with the users because the change was so painful.
  • There wasn’t a whole lot I could do on the actual Domain creation and migration, so I made sure my spreadsheet of all the servers and their roles was up to date and questioned how each one of them would be affected by this change.
  • I created a map of where every machine was physically located and cleaned up the current Active Directory to remove any computers and users that were no longer in service.
  • There were a few other obscure things to remind the team of: making sure the SSO for our RemoteApps was accounted for, setting up KMS in the new domain, and making sure any Fully Qualified Domain Name (FQDN) references to the old Domain were accounted for as well.
  • During our tests we noticed that our current Domain policy was actually working: it stripped local administrator rights from local user accounts and made them “guest” users while in the Domain.  To bypass this, I had to make sure my team had good instructions for moving from computer to computer on the day of our migration.
  • I also made our Resources team (DVD/CD production of messages) and our Database Reporting Team aware.  Neither of those teams has a staff person in the office during the week, so I needed them to think about how this would affect them and be willing to test, to avoid technical issues when they meet or when Sunday comes.

Implementation:

How long did it take?

This is rather tricky to answer, but since I had several people each with a little time, I believe it went as well as could be expected, with some parts better than others.  It took about three calendar weeks to prepare, one day to move all of the users and computers, and three days to get email moved.  Again, hindsight says we should have moved email first, though I still cannot guarantee that would have been easier.

Moving users and computers

This was a very long day.  We had a good lunch and started after the last church service.  By this time we had made sure all of our new infrastructure was in place.  I’d already re-imaged some users’ computers during the morning services.  My thought was that since they were going to have to essentially start over anyway, I might as well give them Windows 7 too, so they wouldn’t have to start over twice within a few months.  There were only a handful of machines I was unable to take care of due to the software or hardware they were running.

  • Six of my volunteers (and I) helped on Sunday.  It was a kind of tedious process, but these guys were very good.  We worked from 1:30 p.m. until around 11:00 p.m.
  • To start, I had three people working on servers/user migration and the other three working on clients.  It seemed I was more useful answering questions, but I did get to move a few clients.
  • One of those guys made sure we migrated our NOD32 (antivirus) so that clients would see the new server.
  • We also verified DNS for some other servers that were not in the Domain so they would work well after we were done.
  • We moved Ruckus (our Wifi system) pointers to the new Domain NPS (Network Policy Server).
  • We used the ADMT tool to migrate users, had them keep the same Security Identifier (SID), and migrated the password.  The latter created angst with my security-minded volunteers, but I really needed to protect the end user here to prevent distress.  After they migrated the users, I went through each user account and unchecked the box that requires them to change the password and removed the login script since we now push the drive mappings via Group Policy.  I have to say this GPO is fast and beautiful compared to a login script!
  • We moved some servers and workstations using the ADMT tool, but in hindsight I’m not sure that was a good idea for the phone or Remote Desktop Servers.
  • We used ADMT to migrate the file server.  I believe it was very important to move this server this way because it migrated permissions and security groups as well.  Some of the folders appear to have had a hiccup and did not duplicate properly, but fixing permissions for a few directories was not as bad as having to start over.  We also consolidated some of our drive mappings to make things easier for our Mac users, since they have to browse differently to reach the same shares we make appear magically for the Windows users.
  • Our Church Management system is ACS, and we run it via Microsoft’s RemoteApp (Remote Desktop Services) option from a Server 2008 R2 server.  I created new RemoteApp packages for People Suite and Financials so they could be applied via GPO.
  • Once the servers were done, two of the volunteers started on laptop users, most of whom had left their laptops in their offices (at our request) to allow us to help them before Monday.  We used a tool called ProfWiz to migrate user profiles on Windows machines, and some magic commands on the Macs to move their profiles.  I wish there had been time to do this for every user, but we just had too many computers.
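As a side note on the per-user cleanup mentioned above (unchecking the password-change box and removing the login script), the equivalent directory attribute changes can be sketched like this.  This is a hypothetical illustration, not what we actually ran; in Active Directory, setting pwdLastSet to -1 clears the must-change-password flag, and clearing scriptPath removes the login script:

```python
def post_migration_changes(user):
    """Return the AD attribute updates applied to a migrated user.

    `user` is a dict of current attributes; the return value maps
    attribute names to new values.  pwdLastSet = -1 marks the password
    as current (unchecks "User must change password at next logon");
    clearing scriptPath drops the old login script now that drive
    mappings come from Group Policy.
    """
    changes = {}
    if user.get("pwdLastSet") == 0:      # flag is set -> clear it
        changes["pwdLastSet"] = -1
    if user.get("scriptPath"):           # old login script present
        changes["scriptPath"] = None     # remove it
    return changes

# Example migrated user still carrying the old settings
# (account name is a placeholder):
migrated = {"sAMAccountName": "jdoe", "pwdLastSet": 0,
            "scriptPath": "logon.bat"}
print(post_migration_changes(migrated))
```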

When we finished, we called it a night.  We had to convert some data LUNs overnight so that we could completely move to new Veeam backup software (designed for virtual environments).  We left that running while we slept.

Final Blog Post in this series will cover:

Domain Migration:  Challenges & Successes

One of the reasons I really like my job is that there are such innovative approaches to helping End Users get their work done.  I think that on the technical side we can sometimes struggle to keep solutions simple for the End User.  RemoteApps are a wonderful way to make our database perform better for the User while simplifying management and maintenance for our Database Administrator.

A challenge that was becoming an increasing issue with ACS as a RemoteApp was the confusion it caused staff and volunteers when they moved to a different computer.  The first time you log in to a RemoteApp, you get a screen with a lot of words about trusting the security of the server you are trying to connect to.  After that you get another screen that wants your network credentials, which truly confuses the User!  Then, if they don’t tell it to save the network credentials (which we are okay with on site), they have to do it all again the next time they launch the application.  Paul Salvo, one of my volunteers, was happy to take on resolving this issue.  Vista and Windows 7 were pretty easy to fix, and we could push the changes through Group Policy.  Our XP clients required a registry change, which we pushed using a filter in Group Policy (thanks to the guys in the CITRT IRC chat room who helped me figure out how to push that only to XP machines).  This solution also works if you have Users regularly working on Remote Desktop servers on site.

All Windows Clients GPO:

This policy instructs the clients to trust a list of servers for pass-through authentication.  We applied this to the computers in our Group Policy:

  • Computer Configuration
    Administrative Templates
    System
    Credentials Delegation
    Allow Delegating Default Credentials with NTLM = Enabled
    Set to TERMSRV/<FQDN of server>
    Set to TERMSRV/<server hostname>
  • We added all of the Remote Desktop Servers we wanted to trust with pass-through.  We created entries with the FQDN and entries with the server’s short name; we’ve heard that including both forms is recommended.
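For reference, the GPO above ultimately writes its settings under a CredentialsDelegation policy key on each client.  As a rough sketch (the registry path and value names here are my best understanding, and the server names are placeholders rather than our real servers), a .reg file mirroring those settings could be generated like this:

```python
# Sketch: generate a .reg file mirroring (to the best of my knowledge)
# what the "Allow Delegating Default Credentials with NTLM" GPO writes.
# The supported mechanism is still the GPO itself; this is illustrative.
KEY = (r"HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows"
       r"\CredentialsDelegation")

def delegation_reg(servers):
    """Build .reg text trusting `servers` for pass-through credentials."""
    lines = [
        "Windows Registry Editor Version 5.00",
        "",
        f"[{KEY}]",
        '"AllowDefCredentialsWhenNTLMOnly"=dword:00000001',
        "",
        f"[{KEY}\\AllowDefCredentialsWhenNTLMOnly]",
    ]
    # One numbered string value per trusted server entry; pass both
    # the FQDN and the short name, as recommended above.
    for i, server in enumerate(servers, start=1):
        lines.append(f'"{i}"="TERMSRV/{server}"')
    return "\n".join(lines) + "\n"

# Placeholder names -- substitute your own Remote Desktop Servers:
print(delegation_reg(["acs.example.org", "acs"]))
```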

Windows XP GPO:

  • You really need SP3 and the latest RDC on the client machine.  We used this KB article for that information.

Here is the Registry we imported into Group Policy:

  • HKEY_LOCAL_MACHINE
    Key path: SYSTEM\CurrentControlSet\Control\SecurityProviders
    Name: SecurityProviders
    Value type: REG_SZ
    Value: msapsspc.dll, schannel.dll, digest.dll, msnsspc.dll, credssp.dll

  • HKEY_LOCAL_MACHINE
    Key path: SYSTEM\CurrentControlSet\Control\Lsa
    Name: Security Packages
    Value type: REG_MULTI_SZ
    Values:
    kerberos
    msv1_0
    schannel
    wdigest
    tspkg
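Note that both values above are appends rather than replacements: credssp.dll joins the end of the existing SecurityProviders list, and tspkg joins Security Packages.  A minimal sketch of that merge logic (purely illustrative; the actual change is pushed through Group Policy as described):

```python
def merge_providers(existing, additions):
    """Append new entries to a registry value list without duplicating
    any that are already present (existing order is preserved)."""
    merged = list(existing)
    for item in additions:
        if item not in merged:
            merged.append(item)
    return merged

# SecurityProviders is a comma-separated REG_SZ:
providers = "msapsspc.dll, schannel.dll, digest.dll, msnsspc.dll"
updated = merge_providers([p.strip() for p in providers.split(",")],
                          ["credssp.dll"])
print(", ".join(updated))

# Security Packages is a REG_MULTI_SZ (one entry per line):
packages = ["kerberos", "msv1_0", "schannel", "wdigest"]
print("\n".join(merge_providers(packages, ["tspkg"])))
```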

Then we created a WMI Filter for XP Machines so that the Registry change only applied to XP Machines.  It wasn’t necessary to apply it to Vista or Windows 7, and I’ve learned to be careful when editing the Registry.

  • Namespace:  root\CIMV2
    Query:  Select * from Win32_OperatingSystem where Version = "5.1.2600"
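As a sanity check on that filter, here is a small sketch of which OS versions would pick up the registry change.  The version strings are from memory, so verify them against your own machines; in particular, 64-bit XP reports a 5.2 version and would be skipped by this filter:

```python
# Which OS versions would the WMI filter apply the registry change to?
SAMPLE_VERSIONS = {
    "Windows XP SP3 (32-bit)": "5.1.2600",
    "Windows XP x64 / Server 2003": "5.2.3790",
    "Windows Vista SP2": "6.0.6002",
    "Windows 7 SP1": "6.1.7601",
}

def filter_matches(version):
    # Mirrors: Select * from Win32_OperatingSystem
    #          where Version = "5.1.2600"
    return version == "5.1.2600"

for name, ver in SAMPLE_VERSIONS.items():
    status = "applies" if filter_matches(ver) else "skipped"
    print(f"{name}: {status}")
```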

What this didn’t fix:

Now when users double-click the ACS icon (RemoteApp), they see that it is starting a RemoteApp, and then they get the login screen for ACS.  They are mostly happy with the above changes.

The first time a user launches the RemoteApp, they still get a certificate warning screen.  We think it has to do with the domain not having a Certificate Authority.  We haven’t decided if fixing it is worth all of the extra work; I imagine we will pursue it in the future, but right now I have my team working on other things.  Basically, the User is instructed to check the box and click Connect.


When Windows Server 2008 was released, Microsoft added a great feature to Remote Desktop Services (formerly Terminal Services) that allows you to run an application remotely, either behind an installed shortcut or via web access.  The idea is that the end user just launches the application, which is really a connection back to a Remote Desktop Server.

Here is what made us look into this in the first place:

A couple of years ago we were launching CheckPoint (ACS People Suite’s check-in application) in all of our children’s and students’ areas.  The challenge was that we needed to run ACS over our Wi-Fi network because the machines had to be portable.  ACS runs only as fast as its slowest computer, due to file locking in a flat database, and Wi-Fi isn’t that fast to begin with.  We needed to be able to check in hundreds of people in 15 minutes.  Around that time, I went to a class taught by Ken Hicks at the ACS Convention where he talked about using Terminal Services to run ACS when you have a slow network.  Right then I texted my key volunteers, one replied that he knew how this should be done, and we started working on it that weekend.

We upgraded our database server to Server 2008, published a RemoteApp installer package, installed it on all of the check-in workstations, and it works like a charm!  I won’t post the install details here, because Jason M. Lee and Cisco Ospina have done a great job of that in their blogs, and both are much smarter than I am.  The sweet thing about installing a little MSI package on a user’s computer is that it looks exactly like the application instead of a little RDC icon, and the user doesn’t have to deal with the extra window of a Remote Desktop session (which does confuse some end users)!

Recent Server Upgrade:

Recently, we upgraded our ACS server, migrated it to a VM (Virtual Machine), and it runs even better.  A new challenge hit me as I was thinking about deployment: what do you do with the old install on a workstation when you have a new one to push?

  1. The RemoteApp install has to be done per User.  I had a workaround for XP and Vista, but it really needs to be done per User for it to function properly.
  2. The package is the same name, so it would be hard for the user to know which icon is the old one.

We are currently in the process of pushing the new installer package through Group Policy.  I learned that I can rename the RemoteApp package installer and the icon (once installed).  To make the distinction clear, I included the server hostname in the installer and icon names.  This makes it easier to tell which ACS People Suite icon is the one the user needs to use.

To make those changes,

  1. Go into Server Manager > Roles > RemoteApp Manager.
  2. If you have already added the Application to the manager,
    right-click and select Properties.
  3. The top field is what the Icon will show on the User’s desktop.
  4. The Alias field shows what the name of the .msi file will be.
  5. You can also do this at the time you add the Application with
    the wizard.  There is a Properties button in there to look for.

Publishing QuickBooks

We are growing in such a way that we needed QuickBooks to be more accessible to our staff and volunteers.  Some of the financial team needed to share the application, and multiple people needed to work on the Fixed Assets, which lived on one user’s local machine.  To resolve this, I installed QuickBooks on a Remote Desktop Server.  Now all QuickBooks users have access to the Fixed Assets.  There are still some quirks to work out, because QuickBooks wants more security holes opened than a Remote Desktop Server typically allows.  The downside is that Intuit doesn’t support our version of QuickBooks using this method; I think they support the latest version.