Microsoft

Microsoft Introduces Professional Degree in Data Science

With IT adapting to become more of a business technology function, Microsoft is close to releasing a certification program for Data Science.

According to Microsoft: “Microsoft consulted Data Scientists and the companies that employ them to identify the requisite core skills. We then developed a curriculum to teach these functional and technical skills, combining highly rated online courses with hands-on labs, concluding in a final capstone project. Graduates earn a Microsoft Professional Degree in Data Science—a digitally sharable, résumé-worthy credential.”

It appears this will require 10 courses to be taken, to the tune of about $525 in total. That's not a bad deal considering data science is a heavy focus of most higher-level IT professionals nowadays. Good on Microsoft for getting this program rolling. As of today you can sign up to be notified when it goes public via: https://academy.microsoft.com/en-US/professional-degree/data-science/

 

Integrating PowerBI with Office 365 Monitor


It will then prompt you to log in. You need to use an account that has rights to Reporting within Office 365.  Second to that, you also need to register for the Office 365 Monitoring Service via: https://www.office365mon.com/Signup/Status.

Initially I was unaware of that second step, and as a result I was receiving the following error:

Expiring Links - OneDrive for Business

In a welcome surprise today, I noticed that the "Get a Link" (sharing) functionality within OneDrive for Business has been updated.  Previously there was a drop-down list with View Link or Edit Link, and if you wanted to require a sign-in there was a check box you could enable.

Now those options are

  • View Link - Tenant ID 
  • Edit Link - Tenant ID
  • View Link - No Sign-In Required
  • Edit Link - No Sign-In Required

Get a Link - Updated Choices December 2015

You can still share a link outside your organization, but you must now use the "Invite People" option vice just requiring sign-in.

Another aspect, and this is the one that is long overdue and perfect, is the ability to have the links expire.  This is good for public links where you need to share information, which happens to be what I was doing when I noticed the change.  I needed to share a video of a product issue with the support team.  In this case, I can upload the file and give out a 30-day link that I know will not be available in the future.

Set links to expire - Updated December 2015

The only thing I wish this also applied to is the "Invite People" portion.  There are times when you could be collaborating on a project and want to share the file for a day or a month, and it would be nice for that to expire as well.  Here's hoping that's what's next.

OneDrive For Business - Rename or Hide Icon in Windows Explorer

OneDrive4BusinessRemove.png

When you have OneDrive for Business installed in Windows 10, you get a nice OneDrive for Business icon. For those of us who have remapped (relocated) the Windows Libraries to our end users' OneDrive for Business accounts, this becomes a redundant icon. To remove it you need to open the Registry Editor and navigate to:

[HKEY_CLASSES_ROOT\CLSID\{3BA2E6B1-A6A1-CCF6-942C-D370B14D842B}]

Alter the System.IsPinnedToNameSpaceTree value to 0.

If you just want to rename the icon, change (Default) to the value you wish the icon was named.
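The same two changes can also be scripted. A minimal PowerShell sketch (run elevated), assuming the CLSID shown above is the one present on your machine (it can vary between installs, so verify it in the Registry Editor first; the "OneDrive (Work)" name is just an example):

```powershell
# CLSID taken from the registry path above -- verify it matches your install
$key = 'Registry::HKEY_CLASSES_ROOT\CLSID\{3BA2E6B1-A6A1-CCF6-942C-D370B14D842B}'

# Hide the OneDrive for Business icon from the Explorer navigation pane
Set-ItemProperty -Path $key -Name 'System.IsPinnedToNameSpaceTree' -Value 0

# Or rename the icon instead by changing the default value
Set-ItemProperty -Path $key -Name '(Default)' -Value 'OneDrive (Work)'
```

Restart Explorer (or log off and back on) for the change to take effect.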


Office 365 vs. Intune - An Important Difference

Office365_MDM_Builtin.jpg

This past week I had been attempting to use Office 365's MDM solution so that I had selective wipe available. In the past I have always used a third-party option for MDM, such as AirWatch, Meraki, or MaaS360. That being said, I thought it was time to try the built-in function within Office 365. However, it's important to know there is a huge difference between Office 365 MDM and Intune: the built-in MDM solution in Office 365 currently only supports the Outlook for Android/iOS apps, not the native e-mail client. I naturally assumed otherwise and had trouble finding where this was distinguished. After running into some issues I finally found documentation confirming it, but it wasn't easily available. If you want the native apps to be managed via Office 365, you need to use Intune to do that.

A quick recap: if you want native mobile app support and other device types to be controlled, you need to use Intune. If you just want to control the Outlook apps on the devices, then you can use the Office 365 built-in MDM.

Office 365 - Disable Clutter Feature Globally

O365.png

In a conversation today, I was asked about Clutter, what it was, and why it was there. The short answer is, it is your inbox learning from your patterns which messages you are likely to ignore. To me, this is pretty similar to the Junk E-Mail folder. I know technically they are handled completely differently, however an end user asked this question:

Why do I have to check mail in three spots, Inbox, Junk, and now Clutter?

That being said, it may be useful to turn off the Clutter feature across the board. Anyone who has been in IT for any length of time knows how many times the phrase "Did you check your Junk folder?" is said. Instead of adding "Did you check the Clutter folder?" we could just turn it off.

The hard part is, there is no global function to disable Clutter; it is considered a per-user feature. This is where we need to turn to PowerShell. Keep in mind you need to connect your PowerShell session to Office 365\Exchange Online. Instructions for that can be found here.

For this script, I merely created a variable and added the users, then used that variable with the Set-Clutter cmdlet. I was forced down this path as the Set-Clutter cmdlet doesn't accept piped input.

#########################################
#                                       #
#        Connect to Office 365          #
#                                       #
#########################################
$msolcred = get-credential
connect-msolservice -credential $msolcred
#########################################
#                                       #
#            Exchange Online            #
#                                       #
#########################################
$UserCredential = $msolcred
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/?proxymethod=rps -Credential $UserCredential -Authentication Basic -AllowRedirection
Import-PSSession $Session
#########################################
#                                       #
#        Change Clutter on\off          #
#                                       #
#########################################

# Disable Clutter processing for a mailbox
Set-Clutter -Identity MAILBOXID -Enable $False

# Check that it's done
Get-Clutter -Identity MAILBOXID | Format-List

# Globally
$users = Get-MsolUser
$users | foreach { Set-Clutter -Identity $_.UserPrincipalName -Enable $False}

References:

  • Clutter Notifications in Outlook - Microsoft
  • Connect to Exchange Online Remote PowerShell - Microsoft

Cloud PBX with PSTN for Office 365 - Skype for Business Preview

CloudPBX1.png

Using the Cloud PBX with PSTN as part of the Skype for Business Preview has been great for the first 48 hours of the program.  As many know, this is something I have been wanting for a long time.  That being said, there are definitely some pros and cons in the first wave of the preview, some of which, such as E911, are to be expected, and others, like the Auto Attendant, are disappointing.  Starting with the bad, since I know at least some of it is coming in a future build: voicemail is currently unavailable, which can be frustrating.  One of the suggestions for preview testers is to use this as your daily phone.  As a result I would like to transfer my current PBX desk phone to my Skype number, but with no voicemail that isn't an option unless I want to manage that every time I get up.  I am going to see if I can use simul-ring with the production PBX and get the timing right in order to leverage the existing voicemail while still letting Skype take all the calls.

One of the most used features of a PBX is the IVR, or Auto Attendant.  In this first-wave release we do not have access to any of these functions.  I haven't been able to find anything that says whether this will ever be available, but in the Tech Preview slide show they mention it as not available in this release, which leads me to believe it will be in later releases.  That function being missing basically only allows you to test basic call quality and functions.  Personally this is one of the features I wanted to look at from an evaluation perspective.

On to the good: everything else!  The call quality is what you would expect: great.  Just like your typical Skype or traditional PSTN call, I didn't have any latency or crackling that you can sometimes get with VoIP calls.  Now, I have made calls from 100 Mbps fiber, Comcast home internet, and while using my phone as a 4G hotspot.  To me that is good news: any solid connection can handle the call with the same level of quality.  The same could be said when making calls via the Lync 2013 iPhone client.  Whether on a Meraki Wireless AC AP or through Verizon's wireless network, the quality was the same.

What I have found that I really like about using Skype for Business in general is the portability, for instance taking calls on any PC or device with the Skype for Business application installed.  During part of my testing I was able to test at multiple locations: home via a laptop, work via a desktop, and on the road via both the laptop and the mobile client.  Throughout all those tests the connections were solid, but more importantly, they were independent of the connection back at the main office.  If this were an on-premise solution, it would be dependent on the connection and equipment to which the server was attached.

One of my overall positive points regarding the cloud in general is that in most situations the IT staff maintaining and caring for the cloud service is far larger than the internal staff.  Despite what any IT pro may want to think of themselves and their knowledge, the major cloud providers have hundreds or thousands of people at the ready in case of an outage.  That's not to say that those of us on internal IT staff are not more than qualified; it is just meant to illustrate that there is power in numbers.  A similar thing could be said about bandwidth: the amount and quality of the connections that Microsoft has in the Azure cloud are far greater than most if not all other businesses'.  Point being, there are multiple avenues to consider for a company's bandwidth: what they need coming down, what they need going up, and the capacity for remote access at any given point.  Much like everything else, there are usually sacrifices made for the "potential" situations vice the norm.  By having your PBX in the cloud, you eliminate the impact of remote access on top of the DR bandwidth needed, because only the devices at each location factor into the formula.  To me, this is much easier to justify and manage.

I touched on another point that deserves some independent consideration, and that is the disaster recovery aspect.  Because the PBX is hosted in the Microsoft cloud, it takes away the concern of redundant systems in the company's DR plan.  When you consider what could be involved in disaster recovery for phones, you have to consider not only the equipment but also the copper or bandwidth needs at the co-location DR site.  Most of the time those connections just sit idle, and having a comparable amount of bandwidth available in your DR site can be a quite expensive monthly cost for something that hopefully never gets used.  That being the case, when considering the cost of Skype for Business Cloud PBX, you have to balance it against the in-house solution plus your disaster recovery costs for that same service.  For an Enterprise customer, as it's laid out now, there really isn't a decision there: the cloud is your friend.

In the first 48 hours, Microsoft has confirmed and validated my excitement for Skype for Business Cloud PBX.  I am anxious for the second wave of features to roll out so that I can confirm this is the solution I've been dreaming about!

Update!

I can confirm that if your PBX is set to forward the original caller ID vice the global caller ID, Skype will respect the incoming caller ID from the original call.  Bonus!

Quick Setup - Skype for Business Preview - Cloud PBX with PSTN Calling

skype_for_business.jpg

On July 1st, Microsoft opened up the preview for Skype for Business, which allows select customers who have an E4 license or a Skype for Business Plan 2 to enroll and test out the following:

Skype Meeting Broadcast: a 10,000-person internet meeting, with integration into Bing Pulse as well as Yammer.

PSTN Conferencing: Allows you to create a Skype for Business meeting with a dial in number when your Skype resides in the cloud.

Cloud PBX with PSTN Calling:  This is the ability to make and receive traditional phone calls via Skype for Business without having to have a traditional on premise solution.

I was able to get into the preview program for the last two; Broadcast Meetings is not something that applies to my current situation.   I have been waiting a while for the Cloud PBX to come to fruition, as I find that in this day and age of the cloud, being able to offload these functions is great from a management standpoint, and even better when considering DR solutions.  Telephony is usually the hardest thing to place in a disaster recovery plan, as it requires you to maintain a secondary circuit and, to a certain extent, a secondary location that is available if needed.  For most small and medium-sized businesses the risk doesn't outweigh the expense. In the case of Skype for Business Cloud PBX, it lives in the cloud, and thus your disaster recovery location becomes anywhere you have a quality internet connection.  For SMBs that could be people working from home, or multiple smaller venues via computers.

Dial In Conferencing:

From a configuration standpoint, the process was pretty painless.  Upon acceptance into the preview program you receive an e-mail with some one-time codes that you need to enter into your Office 365 portal.  This works just like adding an additional license for any of the Office 365 products.  Once you have enabled PSTN Conferencing you will notice that your "Microsoft Bridge (preview)" will have a list of numbers from different regions.

SkypeforBusinesspreviewDialInConferencing

 

Once those numbers are available you can go into the user properties, under Dial-In Conferencing, and select Microsoft as the provider. Once you have selected Microsoft, you can select one of those numbers.

Skype for Business Dial in conference user properties

This becomes the number that is added to all of the user's Skype Meeting requests.  You will also notice that a passcode is created for that user as well.  This allows multiple users to use the same dial-in conferencing number.

Cloud PBX:

When the one-time code mentioned above is applied to your account, you will see a "Skype Voice" option on the left-hand side of the Skype for Business dashboard.

img_55975ba705067

Upon clicking this, you will see Phone Numbers (preview) and Voice Users (preview).  Under the Phone Numbers option, you will see a blue button that allows you to add new numbers by region and area code.

When you want to add a phone number to a specific user, you can select the Voice Users (preview) tab and find their name.  If they do not have a phone number you can assign one; if they do, you can change or remove it.

img_55975cef9d9ee

At this point, the next time they log in to the Skype for Business desktop client, or the Lync 2013 client on the iPhone, they will see the dial pad and have the ability to send and receive calls through Skype.

Microsoft Ignite - Cup half empty?

Ignite2015-e1435553400136.jpg

Despite this being the inaugural year for Ignite, to me it's another year of TechEd, which I have usually enjoyed the last several years.   Typically this is where we get a glimpse of what is to come, maybe even some larger-than-life opportunities.  This year, however, seemed greatly different, and to be honest, I'm not entirely sure why.

Initially I thought maybe it was because it was in my home metro area.  As nice as that may seem to some, navigating Chicago is a chore on a good day.  For instance, I'm in the suburbs, and on the first day the 40-mile drive took over 2 hours.  The second day I used public transportation.   That means an hour-and-10-minute train ride, a 7-block walk to the Green Line, finishing up with a 10-minute walk to McCormick Place.  All in, less than 2 hours and not nearly as frustrating as the traffic.  Compare that to waking up at a hotel and taking a 5-10 minute walk, and it's almost nicer to travel.  And don't get me started on the cabs... just check Twitter on that front.  Oddly enough, as crazy as they are, us locals drive like that to stay alive, so we are somewhat used to it.

Commuting aside, it didn't feel much better while I was there either.  The lines always seemed longer than normal to me, and finding where you needed to be was just as difficult.  Nothing like waiting in line for 20+ minutes for food only to ask yourself, "What is this crap?"   Normally the lunches have always been somewhat foreign, but only because they were supposed to represent the local cuisine, and the quality of the food was always pretty good.   Not this year: the food was horrible, and that assumed there was still some left by the time you got to the pick-up line.   Combine that with it not even being close to what is normally available in Chicago.   We have some great local-only chains, and not one was brought in for lunch, but there were fried pickles?!?

There was some good.   I thought the keynote was good.  I found it great to see Microsoft adapting to the industry instead of trying to bend us to their will.   For that I have to give a lot of credit.  The upcoming changes with Windows and Office 365 are great improvements.

There were also a few good lectures outlining things such as Nano Server and PowerShell.  However, those were from the usual suspects, who are always great.   Compare those lectures to the "Deep Dive, Ask the Experts about Outlook" session and you can see where the contrast comes in.  Here we had some brilliant minds, but the only answer we heard the whole time was "it's on the road map, don't know when."

That seemed to be the theme in any of the questions I had or heard during my time at Ignite, and it became increasingly frustrating.  They show some great things and have some good new products, but no answers.   Now, I do understand that if they start outlining time frames it sets an expectation.  However, we are all reasonable people, and it would help if it were laid out as "we are hoping to have this by then" or "here are the priorities, so when you see this you know that is next."   Referring us to a road map that is online and doesn't mention 75% of what is in the discussion isn't exactly helpful.

My wish is for events like this to realize that no matter how great things are "going" to be, and no matter the hard work being put in, we are the boots on the ground.   In order for us to plan projects and keep the business at bay waiting for the great solutions coming down the pipe from Microsoft, we too need to be able to set expectations.  As good ole Jerry Maguire would say: help me help you!

Powershell - Adding Pictures to Office 365 Accounts

powershell.jpg

How many times have you corresponded with a co-worker and, either as a result of company size or physical location, not recognized the person when you met them?  Doesn't it feel at times like we have a relationship with someone we would never recognize if we met them on the street?  These are the two primary reasons that adding images to your Active Directory or Office 365 accounts could prove beneficial. Second to that, social uniformity, both within the business and personally, becomes more important as we present online personas to the public.  We have all had the occasion of "Googling" a person to find out more about them, whether it's for a hiring choice or an upcoming meeting.  There is something to be said for a uniform, professional presentation of who that person is, vice a professional image alongside that of a wild night partying.  After all, we are only human and consistently have outside factors sway our opinions and perceptions.

As an example, Outlook does a great job of adding pictures, not only from social media but also from within AD, to present a face with the e-mail address.  As illustrated below, in the "To" section you can see the picture of the person who sent the message, and again at the bottom all those who are part of the conversation.

img_55355ca59719e.png

If this person was part of your organization and you communicated with them often, this would give you the insight to recognize them at the next Holiday Party or Company Picnic.

Another option is to use the Social plugin within Outlook to connect your contacts to LinkedIn.  This will also bring in pictures if the user information matches a LinkedIn profile.  This is where the uniformity should take place: the same or similar professional picture uploaded as your company picture, on your LinkedIn profile, and on your company website should you be featured.

Below is a very easy PowerShell script to upload pictures for each of your end users.  It assumes that you have done the following:

  1. Named the files in the path directory to match the UPN (UserPrincipalName) of each user.
  2. Saved them as jpg files.  If they are not, you just need to alter the extension in the script.
#########################################
#                                       #
#           Connect to O365             #
#                                       #
#########################################
$msolcred = get-credential
connect-msolservice -credential $msolcred
#########################################
#                                       #
#            Exchange Online            #
#                                       #
#########################################
$UserCredential = $msolcred
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/?proxymethod=rps -Credential $UserCredential -Authentication Basic -AllowRedirection
Import-PSSession $Session
#########################################
#                                       #
#           Set $Path to Path           #
#                                       #
#########################################
$users = Get-MsolUser
foreach ($person in $users)
{
    $path = "<PATH TO FILES>"
    $user = $person.UserPrincipalName
    $photo = $path + $user + ".jpg"
    Set-UserPhoto -Identity $person.UserPrincipalName -PictureData ([System.IO.File]::ReadAllBytes($photo)) -Confirm:$false
}
 

Bulk Folder Permissions Changes

powershell.jpg

How to use:

  1. Copy attached BulkSet-NTFSPermissions.ps1 script to C:\Temp
  2. Open Powershell
  3. Run the following command to list the folders to a file:
    Get-ChildItem REPLACEWITHPATH | Where-Object {$_.psIsContainer} | Select fullname | Out-File c:\temp\FolderPermissions.txt
  4. Open the c:\temp\FolderPermissions.txt File
  5. Remove the First 3 lines.  The first line is whitespace, the second says Full Name, the third is ----.
  6. Open Powershell and navigate to c:\temp
  7. Run the following command with the appropriate user or group identity:
    .\BulkSet-NTFSPermissions.ps1 -FolderListFile c:\temp\FolderPermissions.txt -SecIdentity "Domain\Group or User" -AccessRights "FullControl" -AccessControlType "Allow"

Below is the original syntax of the command:

.\BulkSet-NTFSPermissions.ps1 -FolderListFile x:\xxxx\xxxx.txt -SecIdentity "Domain\Group or User" -AccessRights "FullControl" -AccessControlType "Allow"

Here are the options:

  • FolderListFile: a flat text file containing the list of paths to which the NTFS permission should be applied. It needs to list one folder per line. The path can be an absolute local path such as C:\temp or a UNC path such as \\computer\C$\temp.
  • SecIdentity: The security identity (such as a user account or a security group) the permission is applied for.
  • AccessRights: the type of access rights, such as FullControl, Read, ReadAndExecute, Modify, etc.
  • AccessControlType: Allow or Deny
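The attached script isn't reproduced here, but the core of it can be sketched in a few lines. This is a minimal, hypothetical version of what BulkSet-NTFSPermissions.ps1 could look like, built on Get-Acl/Set-Acl (the parameter names match the syntax above; error handling omitted):

```powershell
param(
    [Parameter(Mandatory=$true)][string]$FolderListFile,
    [Parameter(Mandatory=$true)][string]$SecIdentity,
    [Parameter(Mandatory=$true)][string]$AccessRights,
    [Parameter(Mandatory=$true)][string]$AccessControlType
)

# Build the access rule once; inheritance flags push the ACE down to child items
$rule = New-Object System.Security.AccessControl.FileSystemAccessRule(
    $SecIdentity,
    [System.Security.AccessControl.FileSystemRights]$AccessRights,
    [System.Security.AccessControl.InheritanceFlags]'ContainerInherit,ObjectInherit',
    [System.Security.AccessControl.PropagationFlags]::None,
    [System.Security.AccessControl.AccessControlType]$AccessControlType)

# Apply the rule to every folder listed in the file, one path per line
foreach ($folder in Get-Content $FolderListFile)
{
    $acl = Get-Acl -Path $folder
    $acl.AddAccessRule($rule)
    Set-Acl -Path $folder -AclObject $acl
}
```

The actual attached script may differ; this is only meant to show the shape of the approach.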

Exchange 2013 Migration via PowerShell Script Based upon Search

ems.jpg

ExchangeMigrationWeb.ps1

Overview:

During a migration from Exchange 2010 to 2013 we were working on changing some of our e-mail retention policies.  We had implemented journaling through a Barracuda Message Archiver to retain our messages per company policy. Second to that, we also wanted to move away from our existing mentality of just letting people manage an unlimited "pot" of e-mail. For one, this isn't very cost effective; second to that, it doesn't make for an Exchange environment that is easy to manage or to project future costs for.

Because of this we were finally going to put mailbox quotas in place to force people to clean up their mailboxes.  We already had retention policies in place, however our average mailbox size was still well over 2GB.  That being the case, we decided to set a max size of 2GB in order to allow for future growth projections and keep a relatively static cost on the high-end storage that is hosting our DAG.

The first issue we ran into (other than how to deal with those over 2GB) was how to migrate forward while at the same time dealing with the mailboxes that were larger.   Even though we could look into Exchange and get a list of all the mailboxes currently below the 2GB quota, having to parse through the Migration Job Wizard and manually select all those users would be tedious.  So... a script is in order to handle this for us.

The "how":

Well, even as great as Exchange is, it doesn't make this easy to accomplish.  The TotalItemSize property that contains the full mailbox size is returned by the Get-MailboxStatistics cmdlet.  However, the New-Migration and New-MigrationBatch cmdlets require an e-mail address in order to process a migration, and that is NOT returned by Get-MailboxStatistics.  There are several commonalities between the various cmdlets, such as GUID, DisplayName and so forth; we decided to use DisplayName from Get-Mailbox.

Essentially what we did was run Get-MailboxStatistics with a filter based upon the TotalItemSize being less than 1.5GB and the mailbox not already existing in the new databases.  We then ran the Get-Mailbox command to return all mailbox DisplayNames, and compared the two lists in order to build a text file that could then be run against the Get-Mailbox command to return each mailbox's PrimarySMTPAddress, giving us the information needed for the migration batch file.

Below is a snippet of that code.  You will also notice that there was some trimming and parsing of the file in order to translate from the output of Get-MailboxStatistics to the format needed to run the loop that pulls the e-mail addresses.

###     SET YOUR VARIABLES FOR THE SEARCH CRITERIA      ####
$ServerSearchVariable="*ex2013*"
$TotalItemSizeVariable="100MB"

###     SET YOUR VARIABLES FOR THE COMPARE and IMPORT      ####
$CompareFile="c:\temp\compare.txt"
$PrimarySMTP="C:\temp\PrimarySMTP.txt"
$MigrationEmails="C:\temp\MigrationEmails.txt"
###     Do the compare of MBStats based upon Total Item size set above and the server name variable
Write-Host -foregroundcolor Yellow "Running the compare to gather the list of users who will be part of this migration"
$MBStats=Get-Mailbox | Get-MailboxStatistics | Where-Object {$_.TotalItemSize -lt $TotalItemSizeVariable -and $_.ServerName -notlike "$ServerSearchVariable"} |Select-Object DisplayName
$MBName=Get-Mailbox | Select-Object DisplayName
$FileCompare=Compare-Object $MBStats $MBName -IncludeEqual
$FileCompare | Where-Object {$_.SideIndicator -like "=="} | Out-File $CompareFile
###  Here I am Trimming the file to get it ready for the comparison
Write-Host -foregroundcolor Yellow "Trimming and parsing file"
(Get-Content $CompareFile) | ForEach-Object {$_ -replace "@{DisplayName=", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ForEach-Object {$_ -replace "}", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ForEach-Object {$_ -replace "InputObject", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ForEach-Object {$_ -replace "SideIndicator", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ForEach-Object {$_ -replace "-----------   ", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ForEach-Object {$_ -replace " --  ", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ForEach-Object {$_ -replace " ==  ", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ForEach-Object {$_ -replace " ", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ? {$_.trim() -ne "" } | Set-Content $CompareFile
###     Comparing the Get-MailboxStatistics search to the full list of e-mail addresses and returning PrimarySMTP to setup the text file for the migration
Write-Host -foregroundcolor Yellow "Comparing the files and translating to e-mail addresses"
$FinalCompare=Get-Content $CompareFile
Foreach ($line in $FinalCompare)
{
    $smtp=Get-Mailbox | Where-Object {$_.Name -eq "$line"} | Select-Object PrimarySmtpAddress
    Add-Content $PrimarySMTP $smtp
}
### Pruning File prior to import
Write-Host -foregroundcolor Yellow "Final Pruning"
(Get-Content $PrimarySMTP) | ForEach-Object {$_ -replace "@{PrimarySmtpAddress=", ""} | Set-Content $PrimarySMTP
(Get-Content $PrimarySMTP) | ForEach-Object {$_ -replace "}", ""} | Set-Content $PrimarySMTP

The above code basically gives you a list of e-mail addresses based upon the search criteria you set and puts it into the proper format for the New-Migration cmdlet.  The file that is created will look like:

EMailAddress
user1@domain.com
user2@domain.com
user3@domain.com
user4@domain.com
...

Below is the rest of the script (also attached).   The first portion makes sure that the location of the temp files is clean, on the off chance it wasn't previously.  The last portion not only starts the Exchange migration, but also cleans up after itself.

#### Cleanup of Previous files if they existed 

    if (Test-Path C:\temp\compare.txt)
    {
        Remove-Item C:\temp\compare.txt
    }
    else
    {
        Write-Host -foregroundcolor Gray "Compare.txt didn't exist"
    }

    if (Test-Path C:\temp\PrimarySMTP.txt)
    {
        Remove-Item C:\temp\PrimarySMTP.txt
    }
    else
    {
        Write-Host -foregroundcolor Gray "PrimarySMTP.txt didn't exist"
    }

    if (Test-Path C:\temp\MigrationEmails.txt)
    {
        Remove-Item C:\temp\MigrationEmails.txt
    }
    else
    {
        Write-Host -foregroundcolor Gray "MigrationEmails.txt didn't exist"
    }

###     SET YOUR VARIABLES FOR THE SEARCH CRITERIA      ####

$ServerSearchVariable="*ex2013*"
$TotalItemSizeVariable="400MB"

###     SET YOUR VARIABLES FOR EXCHANGE ENVIRONMENT     ####
$ExchDB="EX2013-DAG1"
$MigrationName="Under 400 MBv2"

###     SET YOUR VARIABLES FOR THE COMPARE and IMPORT      ####

$CompareFile="c:\temp\compare.txt"
$PrimarySMTP="C:\temp\PrimarySMTP.txt"
$MigrationEmails="C:\temp\MigrationEmails.txt"

###     Do the compare of MBStats based upon Total Item size set above and the server name variable

Write-Host -foregroundcolor Yellow "Running the compare to gather the list of users who will be part of this migration"

$MBStats=Get-Mailbox | Get-MailboxStatistics | Where-Object {$_.TotalItemSize -lt $TotalItemSizeVariable -and $_.ServerName -notlike "$ServerSearchVariable"} |Select-Object DisplayName
$MBName=Get-Mailbox | Select-Object DisplayName
$FileCompare=Compare-Object $MBStats $MBName -IncludeEqual
$FileCompare | Where-Object {$_.SideIndicator -like "=="} | Out-File $CompareFile

###  Here I am Trimming the file to get it ready for the comparison

Write-Host -foregroundcolor Yellow "Trimming and parsing file"
(Get-Content $CompareFile) | ForEach-Object {$_ -replace "@{DisplayName=", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ForEach-Object {$_ -replace "}", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ForEach-Object {$_ -replace "InputObject", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ForEach-Object {$_ -replace "SideIndicator", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ForEach-Object {$_ -replace "-----------   ", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ForEach-Object {$_ -replace " --  ", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ForEach-Object {$_ -replace " ==  ", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ForEach-Object {$_ -replace " ", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ? {$_.trim() -ne "" } | Set-Content $CompareFile
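Each line above re-reads and re-writes the whole file. The same scrubbing can be chained into a single pass; a minimal sketch using a stand-in temp path and sample data (the patterns shown are a subset of the ones above):

```powershell
# Stand-in for the script's $CompareFile; any writable path works
$CompareFile = Join-Path ([IO.Path]::GetTempPath()) "compare.txt"

# Sample lines shaped like the Compare-Object output being scrubbed above
Set-Content $CompareFile "@{DisplayName=John Smith}", "", "@{DisplayName=Jane Doe}"

# Chain the -replace operations and drop blank lines in one read/write pass
(Get-Content $CompareFile) |
    ForEach-Object { $_ -replace "@{DisplayName=", "" -replace "}", "" -replace " ", "" } |
    Where-Object { $_.Trim() -ne "" } |
    Set-Content $CompareFile
```

The file ends up containing JohnSmith and JaneDoe, the same result as the line-by-line version, with one read and one write instead of nine.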

###     Comparing the Get-MailboxStatistics search to the full list of e-mail addresses and returning PrimarySMTP to setup the text file for the migration

Write-Host -foregroundcolor Yellow "Comparing the files and translating to e-mail addresses"
$FinalCompare=Get-Content $CompareFile
Foreach ($line in $FinalCompare)
{
    $smtp=Get-Mailbox | Where-Object {$_.Name -eq "$line"} | Select-Object PrimarySmtpAddress
    Add-Content $PrimarySMTP $smtp

}

### Pruning File prior to import
Write-Host -foregroundcolor Yellow "Final Pruning"
(Get-Content $PrimarySMTP) | ForEach-Object {$_ -replace "@{PrimarySmtpAddress=", ""} | Set-Content $PrimarySMTP
(Get-Content $PrimarySMTP) | ForEach-Object {$_ -replace "}", ""} | Set-Content $PrimarySMTP

###   SENDING NOTIFICATION MESSAGE
###   Setting Variables for the message   ###

$Smtp = "SMTP SERVER" 
$From = "noreply@DOMAIN.com" 
$CC=""
$BCC=""
$Subject = "Your E-Mail Box is Migrating"  
$Body = get-content C:\TEMP\content.html

#### Now send the email using Send-MailMessage

### IF YOU NEED TO CC or BCC you can comment out the current Send-MailMessage Line and uncomment the one containing the CC and BCC arguments
# Send-MailMessage -SmtpServer $Smtp -To $To -From $From -CC $CC -BCC $BCC -Subject $Subject -Body "$Body" -BodyAsHtml -Priority high 

$NotificationPerson=Get-Content $PrimarySMTP
Foreach ($person in $NotificationPerson)
{
Send-MailMessage -SmtpServer $Smtp -To $person -From $From -Subject $Subject -Body "$Body" -BodyAsHtml -Priority high 

}

###  File pruned, now add EmailAddress header to format the import file
Write-Host -foregroundcolor Yellow "Reformatting Migration file"
 Add-Content -Path $MigrationEmails -Value EmailAddress
 Add-Content -Path $MigrationEmails -Value (Get-Content $PrimarySMTP)

###     BEGIN MIGRATION   ####
Write-Host -foregroundcolor Yellow "Adding Migration to Exchange 2013"
New-MigrationBatch -Name "$MigrationName" -CSVData ([System.IO.File]::ReadAllBytes("$MigrationEmails")) -Local -TargetDatabase $ExchDB -AutoStart -AutoComplete

Write-Host -foregroundcolor Yellow "##################################"

    if (Get-MigrationBatch -Identity "$MigrationName" | Where-Object {$_.Identity -like "$MigrationName"})
    {
        Write-Host -foregroundcolor Yellow "Migration Batch of $MigrationName has started"
    }
    else
    {
        Write-Host -foregroundcolor Yellow "$MigrationName did NOT START"
    }

Write-Host -foregroundcolor Yellow "##################################"
Write-Host -foregroundcolor Yellow "Cleaning Up Files"
Write-Host "Starting sleep to allow upload."
Start-Sleep 30

###   CLEANUP FILES

#Remove-Item $CompareFile
#Remove-Item $PrimarySMTP
#Remove-Item $MigrationEmails

#Write-Host -foregroundcolor Yellow "$CompareFile , $PrimarySMTP , and $MigrationEmails were removed"
Write-Host -foregroundcolor Yellow "COMPLETE"

ExchangeMigrationWeb.ps1

Script - Display File list with sizes


In PowerShell we can display a list of files, much like running dir at a command prompt or browsing a folder.  We do, however, have some special abilities available only through PowerShell.  Below are a few examples.

Sort by Name with Length

Get-ChildItem FolderName | Select-Object Name, Length | Sort-Object Name

Get the top 10 by size (to change the "top number", change the 10 to whatever you like)

Get-ChildItem FolderName | Select-Object Name, Length | sort-object length -descending | select-object -first 10
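Raw Length values are in bytes, which can be hard to read at a glance; a calculated property can present sizes in MB instead. A sketch, using the system temp folder as a stand-in for FolderName (the -File switch assumes PowerShell 3.0 or later):

```powershell
# Point this at any folder; the system temp path is used here as a stand-in
$Folder = [IO.Path]::GetTempPath()

# Top 10 largest files, with size shown in MB via a calculated property
Get-ChildItem $Folder -File |
    Sort-Object Length -Descending |
    Select-Object Name, @{Name = "SizeMB"; Expression = { [math]::Round($_.Length / 1MB, 2) }} |
    Select-Object -First 10
```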

Remove Messages from Queue


Overview:

There are times when messages become stuck in an Exchange messaging queue on a transport server.   This article describes how to get the queue name and then how to remove the messages in that queue.

Example:

The first example is removing messages from the Poison Queue which can happen from time to time. In this case the name of the queue never changes and thus makes it easy. If you were to run Get-Queue -Server <EXCH TRANSPORT SERVER> it would return the Identity for the Poison Queue as <EXCH TRANSPORT SERVER>\Poison.  That being the case this is how you remove the messages:

Remove-Message -Server <EXCH TRANSPORT SERVER> -Filter {Queue -eq "<EXCH TRANSPORT SERVER>\Poison"} -WithNDR $false

In the above example we are using the Remove-Message cmdlet and specifying the Hub Transport server. If you have multiple stuck messages in a highly available configuration, you would need to alter <EXCH TRANSPORT SERVER> for each server in the array.

Something else to consider is your -Filter options.  You are not limited to Queue; for instance, you could choose Subject and change the operator to -like:

Remove-Message -Server <EXCH TRANSPORT SERVER> -Filter {Subject -like "*Hello*"} -WithNDR $false

By doing this and adding the * wildcards, you grab any message with "Hello" in the subject line.

If you want to suppress the confirmation prompt, just add -Confirm:$false to the end of the statement.

The other command to know is:

Get-Queue -Server <EXCH TRANSPORT SERVER>

This cmdlet returns all the queues on a specific Hub Transport server.  This is important in case you have a bad queue from which you want to manually delete e-mail messages.  You can then run Remove-Message as above, filtering on the Identity returned by Get-Queue.
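Putting the two cmdlets together, here's a sketch that sweeps the poison queue on each transport server in an array. The server names are placeholders, and the filter is built as a string so the queue identity expands correctly:

```powershell
# Hypothetical Hub Transport server names - replace with your own
$TransportServers = "EXHUB01", "EXHUB02"

foreach ($Server in $TransportServers)
{
    # Review what is queued before deleting anything
    Get-Queue -Server $Server

    # Remove poison-queue messages without NDRs and without prompting
    $QueueIdentity = "$Server\Poison"
    Remove-Message -Server $Server -Filter "Queue -eq '$QueueIdentity'" -WithNDR $false -Confirm:$false
}
```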

Office 365 - Hybrid Wizard Fails Due To " 407 Proxy Authentication"


Overview:

I recently ran into a problem when running the Hybrid Configuration Wizard for Exchange, which gave me the following error:

ERROR : System.Management.Automation.RemoteException: Federation information could not be received from the external organization.
ERROR : Subtask NeedsConfiguration execution failed: Configure Organization Relationship
Exchange was unable to communicate with the autodiscover endpoint for your Office 365 tenant. This is typically an outbound http access configuration issue. If you are using a proxy server for outbound communication, verify that Exchange is configured to use it via the "Get-ExchangeServer -InternetWebProxy" cmdlet. Use the "Set-ExchangeServer -InternetWebProxy" cmdlet to configure if needed.

Resolution:

I immediately suspected it had something to do with our WPAD configuration and proxy settings.  However, when I checked as the administrator, the proxy wasn't being used.  My initial reaction was to just bypass the proxy via a rule for the Exchange server.   However, a quick Google search and some trial and error turned up the following two options:

  1. Using your own profile, disable "Automatically detect settings", then export the registry key HKCU\Software\Microsoft\Windows\CurrentVersion\Internet Settings\Connections and import it into the "Local System" (HKEY_USERS\.DEFAULT) hive.
  2. Use a utility like "PsExec" to launch Internet Explorer as "Local System", disable the setting, and save the changes:
     psexec.exe -i -s -d "C:\Program Files\Internet Explorer\iexplore.exe"
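A third option, which the error text itself points at, is telling Exchange about the proxy directly with Set-ExchangeServer. The server name and proxy URL below are placeholders:

```powershell
# See whether an outbound proxy is currently configured
Get-ExchangeServer -Identity "EXCH01" | Format-List Name, InternetWebProxy

# Point Exchange at the proxy for outbound HTTP (hypothetical proxy URL)
Set-ExchangeServer -Identity "EXCH01" -InternetWebProxy "http://proxy.domain.com:8080"
```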

Limiting Active Mail Count with Message Records Management

Overview:

We are currently in the process of migrating between Exchange 2010 archiving solutions. As part of this process, we have decided to journal all messages as well.

However, we have other factors to consider:

  • Mailbox sizes, and their causes
  • No retention limits on how long mail is stored
  • We already have mailbox size restrictions in place that give us a "per-user Exchange cost" on the disk side. However, we have run into a problem where the stubs created by our old archiving solution (Enterprise Vault) use up a certain amount of space, so users who have been around for many years end up losing a huge portion of their allotted space just to those stubs.
  • The new archiving solution has limited disk space.
Side note: we could have purchased a larger box with more disk space, but the cost was ludicrous for something we truly didn't need, at least at this point.

Decisions:

The question then became: how do we free up the space consumed by those stubs? Do we need those stubs at all? How does it affect the client? It turns out our new product has a built-in Outlook search tool as well as a web-based search tool, and more importantly its add-in puts an "Archive Folder" in the user's folder list to show where items are located. Combine that with the journaling, and there appears to be no need to keep all those stubs in the mailbox. We just had to decide at what point we remove the stubs, and how much e-mail we allow users to accumulate in their mailboxes.

It was at this point we decided how the setup was going to look:

  • 250MB mailboxes for general population, 500MB for executives
  • 10MB attachment size limit
  • No more than 90 days in the Deleted Items folder
  • No more than 365 days within the rest of the mailbox

Resolution:

The question is how we get there.

  • Set the mailbox issue warning at 250MB:
    Set-MailboxDatabase -Identity "Server1\MailboxDatabase1" -IssueWarningQuota 262144000 -QuotaNotificationSchedule "Sun.2:00-Sun.3:00","Wed.2:00-Wed.3:00"
  • Set the 10MB attachment limit:
    Set-TransportConfig -MaxReceiveSize 10MB -MaxSendSize 10MB
  • Create the Retention Policy, and Policy Tags. (More on that later)
  • Apply the Retention Policies.
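For reference, a sketch of what the tag and policy creation might look like for the limits above. The names are hypothetical, and the retention actions you pick will depend on how your new archive ingests mail:

```powershell
# 90-day limit on Deleted Items, 365-day default limit on everything else
New-RetentionPolicyTag "Deleted Items - 90 Days" -Type DeletedItems -AgeLimitForRetention 90 -RetentionAction DeleteAndAllowRecovery
New-RetentionPolicyTag "All Mail - 365 Days" -Type All -AgeLimitForRetention 365 -RetentionAction DeleteAndAllowRecovery

# Bundle the tags into a single policy that can then be applied per mailbox
New-RetentionPolicy "Standard MRM Policy" -RetentionPolicyTagLinks "Deleted Items - 90 Days", "All Mail - 365 Days"
```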


Given we had to apply the retention policies on a migration-type schedule, we had to create a script to do it in stages: use a filter to return the results we wanted, parse that data, then feed it to the EMS commands. The PowerShell below can be saved to a .ps1 and altered for your environment.

### Adds Exchange 2010 PowerShell functionality
Add-PSSnapin Microsoft.Exchange.Management.PowerShell.E2010

### THIS SEARCHES AND LISTS THE ACCOUNT NAME BY SEARCHING TWO DIFFERENT LETTERS
# THIS IS MEANT TO CATCH UP SINCE MRM POLICIES HAVEN'T BEEN APPLIED TO EXCHANGE YET
# In the future you can remove the -Filter and just do them all.
# Outputs to c:\temp

$FilterContent = {$_.RecipientType -eq "UserMailbox"}
Get-Mailbox | Where-Object $FilterContent | Select-Object Alias | Out-File C:\temp\UsersToProcess.txt

#Trimming the Out-File Content for use by Exchange

(Get-Content "C:\temp\UsersToProcess.txt") -notmatch "Alias" | Out-File "C:\temp\UsersToProcess.txt"
(Get-Content "C:\temp\UsersToProcess.txt") -notmatch "----" | Out-File "C:\temp\UsersToProcess.txt"
$TrimUsersToProcess = Get-Content "C:\temp\UsersToProcess.txt"
$TrimUsersToProcess.Trim() | Out-File C:\temp\UsersToProcess.txt

### Setting the Retention Policy

$RetentionPolicyName = "<YOUR RETENTION POLICY NAME>"
$RetentionPolicyComment = "<YOUR RETENTION COMMENT>"
$SetRetentionUsers = Get-Content "C:\temp\UsersToProcess.txt"
ForEach ($RetentionUser in $SetRetentionUsers) {Set-Mailbox -Identity $RetentionUser -RetentionComment "$RetentionPolicyComment"}
ForEach ($RetentionUser in $SetRetentionUsers) {Set-Mailbox -Identity $RetentionUser -RetentionPolicy "$RetentionPolicyName"}

### Confirmation of Results
Get-Mailbox | Where-Object $FilterContent | Select-Object SamAccountName, RetentionPolicy, RetentionComment | Format-Table

Write-Host -foregroundcolor "Cyan" "You should see a table where everyone has a Retention Policy that matches $RetentionPolicyName"

Exchange 2013 - DAG - Failed And Suspended

Overview:

I happened to be in the Exchange Control Panel and noticed that our DAG listed one member's status as "Failed and Suspended". I was perplexed that we hadn't caught this from our monitoring or anywhere else, but that's a whole other issue.  My concern here was that it had obviously failed.  Attempting an Update or Resume resulted in no feedback in the ECP and no change in status.

Troubleshooting:

In this case, this is the message I saw when running Get-MailboxDatabaseCopyStatus in the EMS:

It was here that, as I mentioned above, the status didn't change upon attempting to update or resume the database copy.  I attempted an Update-MailboxDatabaseCopy, which one would assume would reseed the database; I even added the -DeleteExistingFiles switch specifically to start from scratch, yet received "The seeding operation failed... ...which may be due to a disk failure"

At this point one would assume there was a disk problem.  Having said that, I checked to make sure the disk was mounted, even browsed it, and assumed this was a typical nondescript error message.  I then decided that the beauty of a DAG with multiple copies is that I could just "whack" the DB copy and reseed from scratch.  I went through the process of removing the mailbox DB copy with Remove-MailboxDatabaseCopy:

That proceeded as expected. However, as the message states, I went to clear the items (specifically the logs) manually and, strangely, received the error "Remove-Item: The file or directory is corrupted and unreadable"

At this point, I was surprised and decided that there actually had to be a disk issue. I browsed back to the location, attempted to delete a log file manually, and received the same popup within Windows.  I was amazed: I actually had a disk problem. This was only strange to me because our underlying disk is actually a NetApp LUN. That LUN holds all three DB copies from each of the three servers in this instance.  So for one disk to be corrupted and not all three (first off, thank God!), I was miffed.  At this point I went ahead and formatted both the drive that contained the EDB and the one with the LOG files.

After confirming that the DB copy status no longer showed the original copy, I went ahead and ran the Add-MailboxDatabaseCopy command to reseed a copy of the DB from scratch.  Voila, it worked and began copying over.
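For reference, the remove-and-reseed sequence described above looks roughly like this; the database and server names are placeholders:

```powershell
# Drop the failed copy (configuration only - the EDB and logs stay on disk)
Remove-MailboxDatabaseCopy -Identity "DB01\MBX02" -Confirm:$false

# ...clear or reformat the EDB and log volumes, then seed a fresh copy...
Add-MailboxDatabaseCopy -Identity "DB01" -MailboxServer "MBX02"

# Watch the seeding and copy status
Get-MailboxDatabaseCopyStatus -Identity "DB01\MBX02"
```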

The WHY:

I suspect, from looking at the log dates on the server and the time it was last inspected, that it relates to a power outage we sustained.  About three weeks back we had a situation where we were getting bad power from both grids that fed our building, and the datacenter UPS.  After dealing with the bad power, our Emerson UPS decided it had had enough and was toggling between battery and no battery power.  Because it was toggling so frequently, it actually depleted the batteries.  Despite knowing that, we left our systems up while they charged, since power seemed to be okay: no flickers, nothing.  Newton struck, and before the batteries had enough juice to sustain a brownout, the array took a hard hit.

Exchange 2013 - Custom DLP Sensitive Information Rules

Overview:

I recently found the need to filter, or at least be aware of, e-mails being sent that contained specific information.  An example would be legal matters, where certain information shouldn't be e-mailed outside of the company.  By creating a Sensitive Information Rule and combining that with a Data Loss Prevention (DLP) Policy, you can have that information blocked, or at least have the appropriate person or persons notified.

Overall, this scenario came about in the course of building a better internal auditing system.  The DLP policy alone isn't enough to build a complete picture of what users may be doing; it's only a piece of the puzzle.  In most cases you need to combine it with at least file server auditing and local workstation auditing to build the larger picture.

The How:

Creating and importing custom Classifications

  1. First you need to create your custom policy XML.
  2. Save it as a Unicode (UTF-8) file with an extension of .xml.
  3. Open the XML in Internet Explorer; if it's formatted correctly you will see the XML.
  4. Then import it with PowerShell: New-ClassificationRuleCollection -FileData ([Byte[]]$(Get-Content -Path INSERT YOUR PATH -Encoding Byte -ReadCount 0))
  5. Once it's imported, you should be able to create a new DLP policy using the EAC.
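Step 4 written out with a concrete (placeholder) path, plus a quick check that the collection actually landed:

```powershell
# Import the custom classification rule collection (path is a placeholder)
New-ClassificationRuleCollection -FileData ([Byte[]]$(Get-Content -Path "C:\temp\TestRulePack.xml" -Encoding Byte -ReadCount 0))

# Confirm the rule collection is now known to Exchange
Get-ClassificationRuleCollection | Format-List Name
```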

Creating a custom DLP Rule

  1. Log in to the EAC (e.g. https://mail.domain.com/ecp)
  2. Click Compliance Management, then data loss prevention
  3. Click the plus (+), then New custom policy
  4. Name your policy, choose your mode (I like to test with Policy Tags), and click Save
  5. Select the policy and click edit to open your new policy
  6. Select Rules from the left
  7. Click the plus (+) to create a new rule
  8. In the Apply this rule if field, choose The message contains Sensitive information...
  9. Click *Select sensitive information types… (if applicable)
  10. Click the plus (+) to choose from the list
  11. You should now see your new classification

Useful Tools

One thing I noticed that caused some issues with other examples, such as http://technet.microsoft.com/en-us/library/jj674703%28v=exchg.150%29.aspx and http://exchangemaster.wordpress.com/2013/05/15/creating-custom-dlp-classification-rules-and-policy/, is that they mention UTF-16 in the header, and TechNet uses a command block. I found that using either example caused an error upon import via PowerShell.  Notice in my example below that I had to switch the header to UTF-8 to get PowerShell to even read the XML.

You need to make sure you replace the GUIDs below with self-created ones from above.

<?xml version="1.0" encoding="utf-8"?>
<RulePackage xmlns="http://schemas.microsoft.com/office/2011/mce">
  <RulePack id="797f6b49-682c-42e4-8577-aac6eadd1428">
    <Version major="2" minor="0" build="0" revision="0"/>
    <Publisher id="1a2d8dc3-075b-4ad5-8116-20e90314ade2"/>
    <Details defaultLangCode="en-us">
      <LocalizedDetails langcode="en-us">
        <PublisherName>Aaron Bianucci while at FHP</PublisherName>
        <Name>Test Keyword</Name>
        <Description>This is a test rule package</Description>
      </LocalizedDetails>
    </Details>
  </RulePack>
  <Rules>
    <Entity id="365fa6fb-9a59-4750-b82f-14647b382319" patternsProximity="300" recommendedConfidence="85" workload="Exchange">
      <Pattern confidenceLevel="85">
        <IdMatch idRef="Regex_Exchange" />
        <Any minMatches="1">
          <Match idRef="Regex_DLP" />
          <Match idRef="Regex_2013" />
        </Any>
      </Pattern>
    </Entity>
    <Regex id="Regex_Exchange">(?i)(\bExchange\b)</Regex>
    <Regex id="Regex_DLP">(?i)(\bDLP\b)</Regex>
    <Regex id="Regex_2013">(?i)(\b2013\b)</Regex>
    <LocalizedStrings>
      <Resource idRef="365fa6fb-9a59-4750-b82f-14647b382319">
        <Name default="true" langcode="en-us">Test Rule Pack AMB</Name>
        <Description default="true" langcode="en-us">Test rule pack - Detects Aaron Drone</Description>
      </Resource>
    </LocalizedStrings>
  </Rules>
</RulePackage>

Enable SSL Offloading in CAS Array

Conceptual diagrams: The following diagram illustrates client connectivity with SSL Offloading (SSL acceleration) enabled:

Configuring SSL Offloading for Outlook Web App (OWA)

To configure SSL offloading for Outlook Web App (OWA), you must perform two steps on each CAS server in the respective CAS array. First, you must add an SSL offload REG_DWORD value. To do so, open the Registry Editor and navigate to:

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\MSExchange OWA

Under this registry key, create a new REG_DWORD value named "SSLOffloaded" and set it to "1".
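If you'd rather script the change than click through regedit, a sketch using the same key and value:

```powershell
# Create the SSLOffloaded DWORD under the MSExchange OWA service key and set it to 1
$OwaKey = "HKLM:\SYSTEM\CurrentControlSet\Services\MSExchange OWA"
New-ItemProperty -Path $OwaKey -Name "SSLOffloaded" -Value 1 -PropertyType DWord -Force | Out-Null

# Verify the value
(Get-ItemProperty -Path $OwaKey).SSLOffloaded
```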

Next, disable the requirement for SSL on the OWA virtual directory. To do so, open the IIS Manager and expand the Default Web Site. Under the Default Web Site, select the "owa" virtual directory. Under features view, double-click on "SSL Settings". Now uncheck "Require SSL" and click "Apply" in the Actions pane.

Finally, open a command prompt window and run "iisreset /noforce" in order for the changes to be applied.

Configuring SSL Offloading for Exchange Control Panel (ECP) 

Unlike OWA, configuring SSL offloading for the Exchange Control Panel (ECP) doesn't require a registry key to be set. Well, to be more specific, ECP uses the same registry key as the one we set for OWA.

So in order to enable SSL offloading for ECP, the only thing we need to do is to disable the SSL requirement on the ECP virtual directory. To do so, let’s open the IIS Manager and expand the Default Web Site. Under the Default Web Site, select the “ecp” virtual directory. Under features view, double-click on “SSL Settings”.


Now uncheck "Require SSL" and click "Apply" in the Actions pane.

Finally, open a command prompt window and run "iisreset /noforce" so that the changes are applied.

Configuring SSL Offloading for Outlook Anywhere (OA)

Enabling SSL offloading for Outlook Anywhere requires only one step, which, depending on whether Outlook Anywhere is already enabled, can be done via the Exchange Management Console (EMC) or the Exchange Management Shell (EMS).

If you haven't yet enabled Outlook Anywhere, you can select to use SSL offloading when running the "Enable Outlook Anywhere" wizard. You can access this wizard by right-clicking the respective CAS server in the EMC and selecting "Enable Outlook Anywhere" in the context menu.

This brings up the wizard, where you enter the external host name to be used and check "Allow secure channel (SSL) offloading".

If you already enabled Outlook Anywhere in your environment, you need to use the Set-OutlookAnywhere cmdlet to enable SSL offloading. If this is the case, open the Exchange Management Shell and type the following command:

Set-OutlookAnywhere -Identity CAS_server\RPC* -SSLOffloading $true

Running the above command will disable the requirement for SSL for the RPC virtual directory in IIS, which means we don’t need to do so manually like it’s the case with the other services/protocols.

Configuring SSL Offloading for the Offline Address Book (OAB)

To enable SSL offloading for the Offline Address Book (OAB) you just need to remove the SSL requirement on the OAB virtual directory. To do so, let’s open the IIS Manager and expand the Default Web Site. Under the Default Web Site select the “OAB” virtual directory. Under features view, double-click on “SSL Settings”.

Now uncheck "Require SSL" and click "Apply" in the Actions pane.

Finally, open a command prompt window and run "iisreset /noforce" so that the changes are applied.

Configuring SSL Offloading for Exchange ActiveSync (EAS)

Some of you may recall reading on Microsoft TechNet and various other places that SSL offloading for Exchange ActiveSync isn't supported. This used to be true, but it is now fully supported (although the Exchange documentation on Microsoft TechNet hasn't been updated to reflect this yet).

SSL offloading for Exchange ActiveSync is only supported at the Internet ingress point. It’s still not supported in CAS-CAS proxy scenarios between Active Directory sites.

Configuring Exchange ActiveSync to support SSL offload is very simple. You only need to remove the requirement for SSL in IIS. To do so, let’s open the IIS Manager and expand the Default Web Site. Under the Default Web Site select the “Microsoft-Server-ActiveSync” virtual directory. Under features view, double-click on “SSL Settings”.

Now uncheck "Require SSL" and click "Apply" in the Actions pane.

Finally, open a command prompt window and run "iisreset /noforce" so that the changes are applied.

Configuring SSL Offloading for Exchange Web Services (EWS)

With Exchange 2010 SP1 and SP2, you no longer need to modify the web.config file; performing the process below with the new SP1 or SP2 files will cause EWS to fail activation. On SP1/SP2, to offload SSL for EWS you only need to remove the SSL requirement from the IIS virtual directory as described in the steps above.

To configure SSL offloading for Exchange Web services in Exchange 2010 RTM, you must perform two modifications. The first one is to remove the SSL requirement for the EWS virtual directory in IIS. To do so, let’s open the IIS Manager and expand the Default Web Site. Under the Default Web Site select the “EWS” virtual directory. Under features view, double-click on “SSL Settings”.

Now uncheck "Require SSL" and click "Apply" in the Actions pane.

The next step is to make a change to the configuration file (web.config) for the EWS virtual directory. This file can be found under C:\Program Files\Microsoft\Exchange Server\V14\ClientAccess\exchweb\ews and can be modified using a text editor such as Notepad.

It's recommended you take a backup of the web.config file before you perform the next step.

In the web.config file, replace all occurrences of "httpsTransport" with "httpTransport" and then save the file.

The new SP1 web.config file contains binding entries for both httpTransport and httpsTransport that match the binding name. For example, there is an EWSHttpBinding and an EWSHttpsBinding now.

Finally, open a command prompt window and run "iisreset /noforce" so that the changes are applied.
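On RTM, the web.config edit can also be scripted, taking the recommended backup first (path as given above):

```powershell
# EWS web.config location from above
$WebConfig = "C:\Program Files\Microsoft\Exchange Server\V14\ClientAccess\exchweb\ews\web.config"

# Back up first, then swap every httpsTransport for httpTransport
Copy-Item $WebConfig "$WebConfig.bak"
(Get-Content $WebConfig) -replace "httpsTransport", "httpTransport" | Set-Content $WebConfig
```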


Configuring SSL Offloading for Autodiscover Service (AS)

To enable SSL offloading for the Autodiscover service, you must perform the same steps as those applied to the Exchange Web service virtual directory.

With Exchange 2010 SP1 and SP2, you will no longer need to modify the web.config file. Performing the process below with the new SP1 or SP2 files will cause Autodiscover to fail activation. To offload SSL for Autodiscover, you only need to remove the SSL requirement from the IIS virtual directory as described in the steps above.

To configure SSL Offloading for Autodiscover on Exchange 2010 RTM, open the IIS Manager and expand the Default Web Site. Under the Default Web Site, select the "Autodiscover" virtual directory. Under features view, double-click on "SSL Settings".

Now uncheck "Require SSL" and click "Apply" in the Actions pane.

Next you need to change the configuration file (web.config) for the Autodiscover service virtual directory. This file can be found under C:\Program Files\Microsoft\Exchange Server\V14\ClientAccess\Autodiscover and can be modified using a text editor such as Notepad.

It's recommended you take a backup of the web.config file before you perform the next step.

In the web.config file, replace all occurrences of "httpsTransport" with "httpTransport" and then save the file.

The new SP1 web.config file contains binding entries for both httpTransport and httpsTransport that match the binding name. For example, there is an AutodiscoverBasicHttpBinding and an AutodiscoverBasicHttpsBinding now.

Finally, open a command prompt window and run "iisreset /noforce" so that the changes are applied.