Microsoft Introduces Professional Degree in Data Science

As IT continues its shift toward Business Technology, Microsoft is close to releasing a certification program for Data Science.

According to Microsoft: “Microsoft consulted Data Scientists and the companies that employ them to identify the requisite core skills. We then developed a curriculum to teach these functional and technical skills, combining highly rated online courses with hands-on labs, concluding in a final capstone project. Graduates earn a Microsoft Professional Degree in Data Science—a digitally sharable, résumé-worthy credential.”

It appears this will require 10 courses to be taken, at a total cost of about $525. That's not a bad deal considering data science is a heavy focus for most higher-level IT professionals nowadays. Good on Microsoft for getting this program rolling. As of today you can sign up to be notified when it goes public via: https://academy.microsoft.com/en-US/professional-degree/data-science/

 

Integrating PowerBI with Office 365 Monitor


It will then prompt you to log in. You need to use an account that has rights to Reporting within Office 365. In addition, you also need to register for the Office 365 Monitoring Service via: https://www.office365mon.com/Signup/Status.

Initially I was unaware of this step, and as a result I was receiving an error.

Verizon Wi-Fi Calling from an iPhone appears to not work


Originally, after enabling Wi-Fi Calling it wasn't working, or appeared to work only intermittently. Since my two primary locations both have Meraki networking equipment, I suspected that to be the issue. I noticed it would work when tethered to an iPad, or at random, so I didn't believe it to be a device issue.

Fostering Knowledge Management and ending "Mental Hoarding"


I was recently discussing autonomy within technology departments, and it sparked a larger thought process for me. During that conversation it came up that specific people tend to retain project or company knowledge instead of documenting and sharing that knowledge within the department.

Expiring Links - OneDrive for Business

In a welcome surprise, today I noticed that the "Get a Link" (sharing) functionality within OneDrive for Business has been updated. Previously there was a drop-down list with View Link or Edit Link; if you wanted to require a sign-in, there was a check box you could enable.

Now those options are:

  • View Link - Tenant ID 
  • Edit Link - Tenant ID
  • View Link - No Sign-In Required
  • Edit Link - No Sign-In Required

Get a Link - Updated Choices December 2015

You can still share a link outside your organization, but you must now use the "Invite People" option instead of just requiring sign-in.

Another aspect, and this is the one that was long overdue and perfect, is the ability to have links expire. This is good for public links, where you need to share information for a limited time. That happens to be what I was doing when I noticed it: I needed to share a video of a product issue with the support team. In this case, I can upload the file and give out a 30-day link that I know will not be available in the future.

Set links to expire - Updated December 2015

The only thing I wish this also applied to is the "Invite People" option. There are times when you are collaborating on a project and want to share a file for a day or a month, and it would be nice for that to expire as well. Here's hoping that's what's next.

Finding Files with Long File Names

Recently I have been trying to organize a large data-archiving project. During the evaluation of the process and the discovery of the files to be archived, we found that a lot of folders couldn't be moved because the path and/or file names exceeded the 260-character limit. With 9+ TB of data, you can imagine how many files there are and how time-consuming it would be to find all of them. On top of that, a traditional file move within Windows would get stuck on a path being too long and generate an error:

As a result, the transfer would get stuck, reminding us that moving terabytes of data this way could be cumbersome, and that it is hardly the best way to mass-archive folders. And if the end users are cleaning up their own data, it's a problem.

In order to prepare the data for the move, it's best to get a list of these files ahead of time so you can decide how to handle them. Some may not be needed; others may require renaming.

To do that, I turned to PowerShell to provide a list of where those files or folders are located. However, a straight Get-ChildItem -Path "<PATH>" returns the following error:

Get-ChildItem : The specified path, file name, or both are too long. The fully qualified file name must be less than 260 characters, and the directory name must be less than 248 characters.

This is where exporting the results to CSV helped. But before I could do that, I needed a good PowerShell function that can cope with long directory paths. Luckily for me, TechNet hosts the Get-FolderItem PowerShell script to assist. Once I had it downloaded, I put it in my "Scripts" folder. You can put it wherever you want, but you need to reference it in your script, so remember where you placed it. Once that was set, I created the following script:

############################
### Find Long File Names ###
############################

# Dot-source the Get-FolderItem function (adjust to wherever you saved it)
cd <PATH TO Get-FolderItem>
. .\Get-FolderItem.ps1
cd \

# Set the root folder to search and the output folder for the CSVs
$path = "<path to search base>"
$outputfolderpath = "<Output Path for CSVs>"

# Enumerate only the folders directly under the root
$GetRootFolder = Get-ChildItem -Path $path | Where-Object { $_.PSIsContainer }

foreach ($folder in $GetRootFolder)
{
    $outputfile = ($outputfolderpath + $folder.Name + '_LongPaths.csv')

    # Export any item whose full path hits the 248-character directory limit
    Get-FolderItem -Path $folder.FullName |
        Where-Object { $_.FullPathLength -ge 248 } |
        Export-Csv $outputfile -NoTypeInformation
}

You will see in the above script that I set the search path, e.g. C:\LongFileDirectory. This is the path containing the folders or files you suspect exceed the limit, or in my case the root path where the folders and files to be archived were located.

The output directory for me was a local temp folder. The script creates a CSV for each sub-folder below the root. Any file that comes out at 0 KB means there were no "problems" in that sub-folder.

In that file it will show a list of all the files or folders with long paths.

You will see the FullName column contains the full path; because of how the script enumerates, it returns the files themselves and not just the folders. You will also see the ParentFolder column, which shows you the containing paths.

It's important to note that there are options for what you can do from this point forward. In our case, after looking at the ParentFolder paths, we could tell that a lot of this data was duplicated information or image files that we no longer needed to retain. Most of them came from zip files that were downloaded from our clients as reference files years prior.

The point being, at this juncture you could use the FullName column to target the files for renaming or deletion, or you could target the ParentFolder column for mass deletion. In our case that's exactly what we did. We used PowerShell to import the CSVs and return just the unique list of problem folders. That took 100,000+ files down to fewer than 200 specific problem folders, which we chose to delete. That script is below if you want to use it as well.

# Gather the CSV files created by the previous script
$filelistpath = "<PATH TO CSV Files Created Above>"
$folderlistfiles = Get-ChildItem $filelistpath\*.* -Include *.csv

foreach ($filepathlist in $folderlistfiles)
{
    $longpath = Import-Csv $filepathlist.FullName

    # Reduce the file list to the unique set of problem folders
    $unique = $longpath | Sort-Object ParentFolder -Unique

    foreach ($fubar in $unique)
    {
        $delpath = $fubar.ParentFolder

        # Mirror an empty folder over the long path to delete its contents;
        # robocopy copes with paths the Windows shell cannot
        robocopy c:\Fullerton\Blank $delpath.TrimEnd('\') /mir
    }
}

802.11ac 80 MHz Intermittent Connection Issues

Over the last little while, I have had random issues with wireless latency and throughput in specific areas, both at work and at home. Since I'm on the move, I've mostly noticed it on my iPhone, but it could also be seen on iPads, the Surface, and all other devices; the only difference was the level of degradation.

Initially, when I was relatively close to the AP, say 100 ft with minor obstructions, I had very little issue whatsoever. However, at the same distance with obstructions, or at slightly longer distances, all the devices would still show a decent signal strength: around 35%, or about 14-18 dB of SNR. Now I know those are on the weaker side of the spectrum, but they are still usable. From a speed standpoint, during the initial installation I would still see 100 Mbps+ bi-directionally even at that signal strength. My latency would be about 7-10 ms versus the normal 2 ms at that strength, but more than manageable.

As for my setup: it's identical between work and home, the only differences being quantity and internet speed. Both networks are fronted by a Meraki MX100, Meraki switches, and MR34 APs; that's quality equipment that has always performed well for me. After some googling, mostly about the iPhone and 802.11ac issues, I found a lot of people complaining about the 80 MHz configuration and the iPhone. That said, I wasn't seeing the same issues they were. For the most part they were having speed issues regardless of where they were in relation to the AP; mine were specific to the outer ring of the range. Nonetheless, I figured I'd try playing with my settings to see what happened.

Before I get into the settings, I will note that on the Merakis (specifically, on my.meraki.com while connected to an AP), a 20 MHz channel width shows a top speed of 400 Mbps, 40 MHz shows 600 Mbps, and 80 MHz shows 800 Mbps. This may be important to some, depending on their internet speeds and what they may be doing across the LAN.

To alter the settings on the Meraki, go to your dashboard, then Wireless > Radio Settings, to adjust the settings for the APs in your network.


As I mentioned earlier, I had been googling the issue specifically with the iPhone, so I used some of the details there as my baseline. I started close to my AP, ran a speed test on both my iPhone and Surface, and set a baseline of what I expected my AP performance to be. I then went to the "problem locations" and recorded the bandwidth when things were performing optimally. I was getting about 125 Mbps bi-directionally there as well.

I really didn't want to go down to 20 MHz, as some of the articles about the iPhone issues had suggested. However, the Meraki doesn't offer an "Auto" setting like some other 802.11ac routers do. I then tested at both 20 MHz and 40 MHz to see how the performance was. Performance at 40 MHz was equal to my previous tests. Even when running multiple simultaneous speed tests, all the devices performed the same, on par with what I was seeing at 80 MHz.


Thus my thought is: if you are running a high concentration of Apple devices, go ahead and drop your 5 GHz band to 40 MHz and call it a day. Beyond that, put in a feature request with Meraki (or your vendor) to support "Auto" functionality.

Office 365 - Distribution Groups Not Syncing


Using a product called Crossware, we are able to automatically append signatures to e-mail messages at the transport level. Initially we were having issues with a signature not being applied to a group of users. We verified that the user was a member of the correct group, tested the user within the Crossware web application, and all looked well. We then tested using Microsoft's Graph API (https://graphexplorer.cloudapp.net) to verify that the user was a member of the appropriate groups. It's there we noticed that the user was not showing in Office 365 as a member of any of the distribution groups created for this signature application, even though the groups did show up in the Office 365 console. At this point I took to PowerShell and ran a Get-DistributionGroup command against our Office 365 tenant. It was here that I noticed that the groups were not listed.


It was at this juncture, while looking at the distribution groups in Active Directory, that I noticed they didn't have the mail or DisplayName attributes filled out. It turns out that when you create a distribution group on-premises without an Exchange environment, those attributes are not populated automatically.

If you add the DisplayName attribute and assign a mail address, the group will then sync to Exchange Online rather than just Office 365.
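As a rough sketch of how to script the fix (the group name, address, and domain below are hypothetical examples, and the ActiveDirectory RSAT module is assumed to be available), the missing attributes can be stamped from PowerShell instead of editing them by hand:

```powershell
# Assumes the ActiveDirectory module (RSAT); group name and address are examples
Import-Module ActiveDirectory

# Populate the attributes that an Exchange-less AD does not create automatically
Set-ADGroup -Identity "Signature-Sales" -Replace @{
    displayName = "Signature-Sales"
    mail        = "signature-sales@contoso.com"
}
```

After the next directory sync cycle, the group should then appear in Exchange Online.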


OneDrive For Business - Rename or Hide Icon in Windows Explorer


When you have OneDrive for Business installed in Windows 10, you get a nice OneDrive for Business icon. For those of us who have remapped (relocated) the Windows libraries to our end users' OneDrive for Business accounts, this becomes a redundant icon. To remove it, open the Registry Editor and navigate to:

[HKEY_CLASSES_ROOT\CLSID\{3BA2E6B1-A6A1-CCF6-942C-D370B14D842B}]

Alter the System.IsPinnedToNameSpaceTree value to 0.

If you just want to rename the icon, change the (Default) value to the name you wish the icon to display.
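For anyone who prefers scripting the change, here is a minimal PowerShell sketch (run elevated; the CLSID is the one shown above, and the new icon name is just an example):

```powershell
# HKEY_CLASSES_ROOT is not mapped as a PowerShell drive by default,
# so address it through the Registry:: provider
$key = "Registry::HKEY_CLASSES_ROOT\CLSID\{3BA2E6B1-A6A1-CCF6-942C-D370B14D842B}"

# Hide the OneDrive for Business icon from the Explorer navigation pane
Set-ItemProperty -Path $key -Name "System.IsPinnedToNameSpaceTree" -Value 0

# Or, to rename the icon instead, change the default value
# Set-ItemProperty -Path $key -Name "(Default)" -Value "Company OneDrive"
```

Explorer may need to be restarted before the change is visible.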


Office 365 vs. Intune - An Important Difference


This past week I had been attempting to use Office 365's MDM solution so that I would have selective wipe available. In the past I have always used a third-party option for MDM, such as AirWatch, Meraki, or MaaS360. That being said, I thought it was time to try the built-in function within Office 365. However, it's important to know there is a huge difference between Office 365 MDM and Intune: the built-in MDM solution in Office 365 currently only supports the Outlook for Android/iOS apps, not the native e-mail client. I naturally assumed otherwise, and had trouble finding where this was distinguished. After running into some issues I finally found documentation confirming it, but it wasn't easily available. If you want the native apps to be managed, you need to use Intune.

A quick recap: if you want native mobile app support, and other device types to be controlled, you need to use Intune. If you just want to control the Outlook apps on the devices, then you can use the Office 365 built-in MDM.

IT transforming to BT – Part 2 – Business Technology


In Part 1 we discussed the overall climate in IT shifting to BT, but how does that affect the IT professional? Change is inevitable in technology, as it is for the staff, so having to adapt is something most of the staff can accomplish. That said, just because they are used to change doesn't mean that everyone will be successful or capable in this shift. When you think of your team and some of your best workers, there is a chance you have a hard worker who is simply good at repetitive tasks or at troubleshooting specific issues, but whose ability to understand the larger picture isn't quite there. As a leader, this should be your focus to change, or to help develop.

As IT shifts to a more business-minded focus, it's important to shift with it: to begin splitting your time between furthering your technical knowledge and furthering your understanding of the industry and the business itself. After all, to come up with the best solutions, especially proactively, you need to be ahead of the power curve. IT professionals are used to thinking ahead, but they now also have to understand where the business and its industry are going.

I remember reading an article a few years back in Wired (Unhappy at work? Be an Intrapreneur), and the first thing that struck me was the word "intrapreneur". By definition it's "a manager within a company who promotes innovative product development and marketing"; in layman's terms, it's taking the situation you have and finding a way to develop something "new" for your current employer. At face value this is largely important for the worker to think about, and potentially do. The upside is that if you are able to develop that new idea for the company, it not only enhances the company, it also elevates your worth within the company, making it a win-win. A secondary byproduct is that such a project usually enhances your skill set in some facet. It could give you a better understanding of the processes within the company, which is needed to rise from within, or the technology being implemented could have a learning curve, giving you another skill-set notch on your résumé. Either way, there is nothing to be lost by trying to improve the company while learning something at the same time.

Enhancing your skill set here is key, and not just your technology skill set but your business skill set. As companies become more dependent on technology, or grow, those who were involved in the early stages of integrating that technology have far more important tasks to deal with. This can slow down the addition of technology to lines of business while management tries to assess the impacts. An imperative skill for tech pros to learn is how the business works. If your company makes widgets, what's the process? Why that process? Has anyone explored other ways to make those widgets? As you come to understand how the business does what it does, you may see specific technology advances that could apply. For instance, if Person A has to walk a widget to Person B to get it painted, proposing a conveyor belt could speed up production. Don't stop at applying what you already know, either. Start learning or researching technology in your company's industry to see what others are doing. Read about the complaints people have about your competitors and see if there is a process you can help streamline to increase incoming business. Take the time to understand the other departments and what they do. It isn't uncommon for the IT department to be a smaller subset of the "corporate" or overhead departments. That being the case, why not look at some of the other administrative departments and see if there is something you can improve for your internal clients?

These are all examples of what you should be doing as Information Technology evolves into Business Technology: getting to understand the business you are in, and looking for ways to use technology to increase the proficiency of your company.

Office 365 - Disable Clutter Feature Globally


In a conversation today, I was asked about Clutter: what it is, why it's there. The short answer is that it is your inbox learning from your patterns which messages you are likely to ignore. To me, this is pretty similar to the Junk E-Mail folder. I know they are technically handled completely differently; however, an end user asked this question:

Why do I have to check mail in three spots, Inbox, Junk, and now Clutter?

That being said, it may be useful to turn off the Clutter feature across the board. Anyone who has been in IT for any length of time knows how many times the phrase "Did you check your Junk folder?" is said. Instead of adding "Did you check the Clutter folder?", we could just turn it off.

The hard part is that there is no global switch to disable Clutter; it is considered a per-user feature. This is where we need to turn to PowerShell. Keep in mind you need to connect your PowerShell session to Office 365\Exchange Online. Instructions for that can be found here.

For this script, I merely created a variable holding the users, then used that variable with the Set-Clutter cmdlet, since Set-Clutter doesn't accept direct pipeline input.

#########################################
#                                       #
#        Connect to Office 365          #
#                                       #
#########################################
$msolcred = Get-Credential
Connect-MsolService -Credential $msolcred
#########################################
#                                       #
#            Exchange Online            #
#                                       #
#########################################
$UserCredential = $msolcred
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/?proxymethod=rps -Credential $UserCredential -Authentication Basic -AllowRedirection
Import-PSSession $Session
#########################################
#                                       #
#        Change Clutter on\off          #
#                                       #
#########################################

# Disable Clutter processing for a single mailbox
Set-Clutter -Identity MAILBOXID -Enable $False

# Check that it's done
Get-Clutter -Identity MAILBOXID | Format-List

# Globally (-All avoids the default page limit on returned users)
$users = Get-MsolUser -All
$users | foreach { Set-Clutter -Identity $_.UserPrincipalName -Enable $False }

References:

  • Clutter Notifications in Outlook - Microsoft
  • Connect to Exchange Online Remote PowerShell - Microsoft

Cloud PBX with PSTN for Office 365 - Skype for Business Preview


Using the Cloud PBX with PSTN Calling as part of the Skype for Business Preview has been great for the first 48 hours of the program. As many know, this is something I have been wanting for a long time. That being said, there are definitely some pros and cons in the first wave of the preview, some of which, such as E911, are to be expected, and others, like the Auto Attendant, are disappointing. Starting with the bad, since I know at least some of it will be addressed in a future build: voicemail is currently unavailable, which can be frustrating. One of the suggestions for preview testers is to use this as your daily phone. I would like to forward my current PBX desk phone to my Skype number, but with no voicemail that isn't an option unless I want to manage it every time I get up. I am going to see if I can use simul-ring with the production PBX and get the timing right to leverage the existing voicemail while still having Skype take all the calls.

One of the most used features of a PBX is the IVR, or Auto Attendant. In this first wave of the preview we do not have access to any of these functions. I haven't been able to find anything that says whether this will ever be available, but the Tech Preview slide deck lists it as not available in this release, which leads me to believe it will be in later releases. With that function missing, you can basically only test basic call quality and functions. Personally, this is one of the features I wanted to look at from an evaluation perspective.

On to the good: everything else! The call quality is what you would expect: great. Just like a typical Skype or traditional PSTN call, I didn't have any of the latency or crackling that you can sometimes get with VoIP calls. I have made calls from 100 Mbps fiber, Comcast home internet, and while using my phone as a 4G hotspot. To me that is good news: any solid connection can handle the call with the same level of quality. The same could be said when making calls via the Lync 2013 iPhone client; whether on a Meraki wireless AC AP or through Verizon's network, the quality was the same.

What I really like about Skype for Business in general is the portability, for instance taking calls on any PC or device with the Skype for Business application installed. During my testing I was able to test at multiple locations: home via a laptop, work via a desktop, and on the road via both the laptop and the mobile client. Throughout all those tests the connections were solid, but more importantly, they were independent of the connection back at the main office, whereas an on-premises solution would be dependent on the connection and equipment to which the server was attached.

One of my overall positive points about the cloud in general is that, in most situations, the IT staff maintaining and caring for the cloud service is far larger than the internal staff. Despite what any IT pro may want to think of themselves and their knowledge, the major cloud providers have hundreds or thousands of people at the ready in case of an outage. That's not to say that those of us on internal IT staffs are not more than qualified; it is just meant to illustrate that there is power in numbers. Something similar can be said about bandwidth: the quantity and quality of connections that Microsoft has in the Azure cloud is far greater than most, if not all, other businesses. The point is that there are multiple bandwidth considerations for a company: what they need coming down, what they need going up, and the capacity for remote access at any given point. Much like everything else, sacrifices are usually made between the "potential" situations and the norm. By having your PBX in the cloud, you eliminate the remote-access impact on top of the DR bandwidth needed, because the formula only has to account for how many devices are at the location. To me, this is much easier to justify and manage.

I touched on another point that deserves independent consideration: the disaster recovery aspect. Because the PBX is hosted in the Microsoft cloud, it takes away the concern of redundant systems in the company's DR plan. When you consider what could be involved in disaster recovery for phones, there are considerations of not only the equipment but also the copper or bandwidth needs at the DR co-location. Most of the time those connections just sit idle, and keeping a comparable amount of bandwidth available at your DR site can be a quite expensive monthly cost for something that hopefully never gets used. That being the case, when considering the cost of Skype for Business Cloud PBX, you have to balance it against the in-house solution plus your disaster recovery costs for that same service. For an enterprise customer, as it's laid out now, there really isn't a decision there: the cloud is your friend.

In the first 48 hours, Microsoft has confirmed and validated my excitement for Skype for Business Cloud PBX.  I am anxious for the second wave of features to roll out so that I can confirm this is the solution I've been dreaming about!

Update!

I can confirm that if your PBX is set to forward the original caller ID rather than the global caller ID, Skype will respect the incoming caller ID from the original call. Bonus!

Quick Setup - Skype for Business Preview - Cloud PBX with PSTN Calling


On July 1st, Microsoft opened up the preview for Skype for Business, which allows select customers who have an E4 license or Skype for Business Plan 2 to enroll and test the following:

Skype Meeting Broadcast: a 10,000-person internet meeting, with integration into Bing Pulse as well as Yammer.

PSTN Conferencing: Allows you to create a Skype for Business meeting with a dial in number when your Skype resides in the cloud.

Cloud PBX with PSTN Calling:  This is the ability to make and receive traditional phone calls via Skype for Business without having to have a traditional on premise solution.

I was able to get into the preview program for the last two; Broadcast Meetings is not something that applies to my current situation. I have been waiting a while for Cloud PBX to come to fruition, as I find that in this day and age of the cloud, being able to offload these functions is great from a management standpoint, and even better when considering DR solutions. Telephony is usually the hardest thing to place in a disaster recovery plan, as it requires you to maintain a secondary circuit and, to a certain extent, a secondary location that is available if needed. For most small and medium-sized businesses the expense tends not to outweigh the risk. In the case of Skype for Business Cloud PBX, it lives in the cloud, and thus your disaster recovery location becomes anywhere you have a quality internet connection. For SMBs that could be people working from home, or multiple smaller venues via computers.

Dial In Conferencing:

From a configuration standpoint, the process was pretty painless. Upon acceptance into the preview program you receive an e-mail with some one-time codes that you need to enter into your Office 365 portal. This works just like adding an additional license for any of the Office 365 products. Once you have enabled PSTN Conferencing, you will notice that your "Microsoft Bridge (preview)" has a list of numbers from different regions.

Skype for Business Preview - Dial In Conferencing

Once those numbers are available, you can go into the user's properties, select dial-in conferencing, and choose Microsoft as the provider. Once you have selected Microsoft, you can pick one of those numbers.

Skype for Business Dial in conference user properties

This becomes the number that is added to all of the user's Skype meeting requests. You will also notice that a passcode is created for that user; this allows multiple users to share the dial-in conferencing number.

Cloud PBX:

When the one-time code mentioned above is applied to your account, you will see a "Skype Voice" option on the left-hand side of the Skype for Business dashboard.


Upon clicking this, you will see Phone Numbers (preview) and Voice Users (preview). Under the Phone Numbers option, you will see a blue button that allows you to add new numbers by region and area code.

When you want to add a phone number to a specific user, select the Voice Users (preview) tab and find their name. If they do not currently have a phone number you can assign one; if they do, you can change or remove it.


At this point, the next time they log in to the Skype for Business desktop client, or the Lync 2013 client on the iPhone, they will see the dial pad and have the ability to make and receive calls through Skype.

IT transforming to BT - Part I - Business Technology


Over the last few years especially, it has become more apparent that the IT field is changing. What!?!? That can't be the case. We are all used to technology changing quickly, but it isn't the hardware or software that is changing the most any more; it's the expectations of the field.

IT has always been complicated and required people who love and appreciate the field, but it's no longer just about upgrading to the next technology to embrace the newest features; it's becoming more about the business. One could argue it's always been about the business, and it has, but generally IT departments try to stay in the shadows, cut costs where they can, and most importantly just keep things running. It has always been easier that way. Let's face it: unless you work for a tech company, IT is largely viewed as a necessary evil and/or an overhead cost.

 

With millennials entering the workforce, along with mobile devices and BYOD, the shift has started from just keeping up the infrastructure and help desk support to being involved in the business itself. What used to be an IT function that looked similar in any business has evolved into BT (Business Technology). This change has brought IT to the table to participate in corporate strategy. As a result, IT departments can stay ahead of the curve and work with the business to provide tools that enhance it, versus reactively having to find, fix, or retool technology to solve a single problem. This is a welcome change that will allow IT to proactively assist with line-of-business needs, and it is a cost-saving measure as well.

 

In IT management it’s not uncommon to find a subset of your user base doing something because it enhances their process, but doing it either outside the policy controls of IT or in direct violation of those controls. As bad as that sounds, in the modern era this is not such a bad thing. You wouldn't want to curb this enthusiasm for working smarter, not harder, but at the same juncture you need to make sure IT is in the fold, if for no other reason than security. This is where executive management embracing BT is the key. Once the executives realize that the IT department could be used and staffed in such a way, it shifts the culture away from the "evil geek dictatorship" toward the customer-service-friendly professionals that most IT departments would prefer to be.

By bringing IT to the business table, it provides a fresh perspective on the procedures currently used in the business. That is not to say that any IT professional could run the business better than the other management staff, but IT people are natural-born troubleshooters. As a result, we are always looking to streamline or improve processes, so just being present can sometimes lead to great ideas that increase efficiency and lower costs. At the end of the day, it is fair to say that just as important as it is for IT management to streamline costs, it's always a welcome sign when we can apply that outside of our field as well to provide a larger impact on the business.


Office 365 - Get-User WARNING, at least one source array could not be cast down to the destination array

It's been one of those Monday mornings.  About 7-10 days ago, a lot of tickets were added to the help desk regarding e-mail signatures not being updated. Since there were multiple reports, I made a note to keep an eye on the issue.  The support team reached out to "Exclaimer" (which, for the record, I'm not entirely sure I would use again; check out Crosswares) and they said they were getting a lot of reports of an "aggregation failing" and lookups of changes not functioning. After 72 hours, the help desk was calling and e-mailing daily for updates, to the equivalent of an "all circuits are busy" message.   Then, finally, ten days in, the following message arrived:

We have identified the error you are seeing in aggregation, and it is caused by a corrupted object in your Office 365 environment. We are currently creating a PowerShell script which will highlight the object(s) which have corruption, which will allow you to identify the cause of this error. This could be a mailbox or group in Office 365, and is likely to be something that has been added or changed recently. Once we have the script I will send this over to you, it should be ready early this week.

Once you have been able to identify the corrupt object, you will need to remove and recreate it in Office 365.

You can verify that this is the case by running the following scripts in PowerShell. The first connects to Office 365, please use an admin login when prompted. The 2nd simply tries to pull the usernames. If you see the same error when running this script, it will confirm that the issue is with a corrupt object:

Import-PSSession (New-PSSession -ConfigurationName Microsoft.Exchange -Credential $null -ConnectionURI https://ps.outlook.com/powershell -Authentication Basic -AllowRedirection) -AllowClobber

Get-User

You may be able to identify the problem account by using:

Get-User “”

This will fail for the corrupt object(s). Our script will check each object.

If you do have any further questions or queries, please let me know. I will be happy to help.

First off, those of us who use PowerShell will notice an issue with the command they sent over: it errors when you attempt to run it as written, since -Credential is passed $null rather than an actual credential object.
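For reference, a corrected version of the vendor's connection snippet (same endpoint and parameters, just prompting for a real credential instead of passing $null) would look something like this:

```powershell
# Prompt for an Office 365 admin credential instead of passing $null.
$cred = Get-Credential

# Connect to Exchange Online remote PowerShell and import its cmdlets.
Import-PSSession (New-PSSession -ConfigurationName Microsoft.Exchange `
    -Credential $cred `
    -ConnectionUri "https://ps.outlook.com/powershell" `
    -Authentication Basic -AllowRedirection) -AllowClobber
```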

Nonetheless, that was an easy adjustment to get connected to Office 365 and Exchange Online.  Once connected, I went ahead and ran the "Get-User" command they suggested.  This is when it returned the following error:

WARNING: An unexpected error has occurred and a Watson dump is being generated: At least one element in the source array could not be cast down to the destination array type.

Trying to narrow down which user or object it was could be difficult, especially depending on the size of your environment.  In this case I just started running a Get-User command, limiting the result size until I had a respectable window.

Get-User -ResultSize 200
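That manual narrowing can be sketched as a simple halving loop. This is hypothetical: whether the Watson warning surfaces as a catchable error may vary, so you might still need to eyeball the output at each step.

```powershell
# Hypothetical sketch: binary-search the -ResultSize window to find the
# first result size that triggers the corrupt-object warning.
$low = 0; $high = 200
while ($high - $low -gt 1) {
    $mid = [int](($low + $high) / 2)
    try {
        # Treat warnings/errors as terminating so the catch block fires.
        Get-User -ResultSize $mid -WarningAction Stop -ErrorAction Stop |
            Out-Null
        $low = $mid    # Succeeded: corrupt object is past position $mid.
    } catch {
        $high = $mid   # Failed: corrupt object is within the first $mid.
    }
}
Write-Output "First failing result size: $high"
```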

After a few attempts I was able to narrow it down to user #60 in my environment.  In this case I ran Get-User -ResultSize 59 and saw who the last user was, since I noticed that by default Get-User returns results in alphabetical order. Once I realized I was in the "I" users, I did a:

Get-User -Filter {name -like 'I*'}

This gave me a list of the "I" users, and I noticed that user #60 happened to be the account created when you make a new group in Outlook Web Access for a shared team.  Given this was currently a test function, I removed the group via the Office 365 admin interface.   At this point I was able to run the Get-User command without errors, and the Exclaimer aggregation worked again, solving the issue.

My frustrations here stem from a few things, the first being Exclaimer.  As a SaaS provider, your response time to find an issue like this needs to be quicker.  Had I known how you were accomplishing your product, I could have found this in about 10 seconds, not 10 days.  Secondly, why is it failing with the new groups?  According to Google, there are a lot of people using the Get-User command and seeing this error.  Does this mean there is an issue with this feature and a bug report should be filed?  What leads me down that path is the fact that my Get-User -Filter command actually returned the supposed "problem user," but the plain Get-User wouldn't.

Microsoft Ignite - Cup half empty?

[Image: Microsoft Ignite 2015]

Despite this being the inaugural year for Ignite, to me it's another year of TechEd, which I have usually enjoyed over the last several years.   Typically this is where we get a glimpse of what is to come, maybe even some larger-than-life opportunities.  This year, however, seemed greatly different, and to be honest, I'm not entirely sure why.

Initially I thought maybe it was because it was in my home metro area.  As nice as that may seem to some, navigating Chicago is a chore on a good day.  For instance, I'm in the suburbs; the first day, the 40-mile drive took over two hours.  The second day I used public transportation.   That means an hour-and-ten-minute train ride, a seven-block walk to the Green Line, finishing up with a 10-minute walk to McCormick Place.  All in less than two hours, and not nearly as frustrating as the traffic.  Compare that to waking up at the hotel and taking a 5-10 minute walk; it's almost nicer to travel.  And don't get me started on the cabs... just check Twitter on that front.  Oddly enough, as crazy as they are, us locals drive like that to stay alive, so we are somewhat used to it. 

Commuting aside, it didn't feel much better while I was there either.  The lines always seemed longer than normal to me, and finding where you needed to be was just as difficult.  Nothing like waiting in line for 20+ minutes for food only to ask yourself, "what is this crap?"   Normally the lunches have been somewhat foreign, but only because they were supposed to represent the local cuisine; the quality of the food was always pretty good.   Not this year. The food was horrible, and that assumed there was still some left by the time you got to the pick-up line.   Combine that with it not even being close to what is normally available in Chicago.   We have some great local-only chains, and not one was brought in for lunch, but there were fried pickles?!?

There was some good, though.   I thought the keynote was strong, and I found it great to see Microsoft adapting to the industry instead of trying to bend us to their will.   For that I have to give a lot of credit.  The upcoming changes with Windows and Office 365 are great improvements.   

There were also a few good lectures outlining things such as Nano Server and PowerShell.  However, those were from the usual suspects, who are always great.   Compare those lectures to the "Deep Dive, ask the experts about Outlook" session and you can see where the contrast comes in.  Here we had some brilliant minds, but the only answer we heard the whole time was "it's on the roadmap, don't know when." 

That seemed to be the theme in any of the questions I had or heard during my time at Ignite, and it became increasingly frustrating.  They show some great things and have some good new products, but no answers.   Now, I do understand that if they start outlining time frames, it sets an expectation.  However, we are all reasonable people, and it would help if it were laid out as "we are hoping to have this by then," or "here are the priorities, so when you see this, you know that is next."   Referring us to a roadmap that is online and doesn't mention 75% of what is in the discussion isn't exactly helpful.  

My wish is that at events like this they realize that no matter how great things are "going" to be, and no matter the hard work being put in, we are the boots on the ground.   In order for us to plan projects and keep the business at bay while waiting for the great solutions coming down the pipe from Microsoft, we too need to be able to set expectations.  As good ole Jerry Maguire would say: help me help you!