Enterprise

Fostering Knowledge Management and ending "Mental Hoarding"

I was recently discussing autonomy within technology departments, and it sparked a larger thought process for me. During that conversation, it came up that specific people tend to retain project or company knowledge rather than documenting and sharing it within the department.

Finding Files with Long File Names

Recently I have been trying to organize a large data archiving project. During the evaluation of the process and the discovery of the files to be archived, we found a lot of folders that couldn't be moved because the path and/or file names exceeded the 260-character limit. With 9+ TB of data, you can imagine how many files there are and how time-consuming it would be to find them all. On top of that, if you use the traditional file move within Windows, it gets stuck on the path being too long and generates an error:

As a result, the transfer would get stuck, reminding us that moving terabytes of data this way could be cumbersome, to say nothing of finding the best way to mass archive folders. And if the end users are cleaning up their own data, it's an even bigger problem.

To prepare the data for the move, it's best to get a list of these files ahead of time so you can decide how to handle them. Some may not be needed; others may need to be renamed.

To do that, I turned to PowerShell to produce a list of where those files or folders are located. However, a straight Get-ChildItem -Path "<PATH>" returns the following error:

Get-ChildItem : The specified path, file name, or both are too long. The fully qualified file name must be less than 260 characters, and the directory name must be less than 248 characters.

This is where exporting the results to CSV helped. But before I could do that, I needed a good PowerShell function for walking directories. Luckily, TechNet has the Get-FolderItem PowerShell script to assist. Once I had it downloaded, I put it in my "Scripts" folder. You can put it wherever you want, but you have to reference that location in your script, so remember where you placed it. Once I had that set, I created the following script:

############################
### Find Long File Names ###
############################

# Dot-source the Get-FolderItem function (from TechNet) so it is available in this session
cd "<PATH TO Get-FolderItem>"
. .\Get-FolderItem.ps1
cd \

# Set the root folder to search for long file names and the output folder for the CSVs

$path = "<path to search base>"
$outputfolderpath = "<Output Path for CSVs>"

# Enumerate only the folders directly under the root, then scan each one
$GetRootFolder = Get-ChildItem -Path $path | Where-Object { $_.PSIsContainer }
foreach ($folder in $GetRootFolder)
{
    # One CSV per top-level folder, e.g. <Output Path>\FolderName_LongPaths.csv
    $outputfile = Join-Path $outputfolderpath ($folder.Name + '_LongPaths.csv')
    Get-FolderItem -Path $folder.FullName | Where-Object { $_.FullPathLength -ge 248 } | Export-Csv $outputfile
}

You will see in the above script that I set the search path, e.g. C:\LongFileDirectory. This is the path that contains the folders or files you suspect exceed the limit, or in my case the root path where the folders and files to be archived were located.

The output directory for me was a local temp folder. Essentially the script creates a file for each sub-folder below the root. Any output file with a 0 KB file size means there were no "problems" in that sub-folder.
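Since only the non-empty CSVs matter, you can surface them quickly; a small sketch, assuming the same output folder as above:

# List only the CSVs that actually contain long-path entries
Get-ChildItem "<Output Path for CSVs>" -Filter *_LongPaths.csv | Where-Object { $_.Length -gt 0 }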

Each of those files lists the files or folders with long paths.

You will see that the FullName column contains the full path to each file (the script uses PSIsContainer only to walk the folders, so the results are individual files, not just folders). You will also see the ParentFolder column, which shows the containing folder's path.

It's important to note that there are options on what you can do from this point forward. In our case, after looking at the ParentFolder paths we could tell that a lot of this data was duplicated information or image files that we no longer needed to retain. Most of them came from zip files that had been downloaded from our clients as reference files years prior.

The point being, at this juncture you could use the FullName column to target files for renaming or deletion, or you could target the ParentFolder column for mass deletion. In our case, that's exactly what we did. We used PowerShell to import the CSVs and the Get-Unique cmdlet to return just the list of problem folders. It took 100,000+ files down to fewer than 200 specific problem folders that we chose to delete. That script is below if you want to use it as well.

# Folder that holds the CSV files created by the script above
$filelistpath = "<PATH TO CSV Files Created Above>"
$folderlistfiles = Get-ChildItem $filelistpath -Filter *.csv

foreach ($filepathlist in $folderlistfiles)
{
    $longpath = Import-Csv $filepathlist.FullName

    # Sort the ParentFolder column so Get-Unique can collapse it to the distinct problem folders
    $unique = $longpath | Select-Object -ExpandProperty ParentFolder | Sort-Object | Get-Unique

    foreach ($fubar in $unique)
    {
        # Mirror an empty folder over the long path to delete its contents; robocopy copes with
        # paths beyond the 260-character limit where Remove-Item cannot.
        # c:\Fullerton\Blank is simply an empty directory created ahead of time.
        $delpath = $fubar.TrimEnd('\')
        robocopy c:\Fullerton\Blank $delpath /mir
    }
}

IT transforming to BT – Part 2 – Business Technology

BusinessTechnology.jpg

In Part 1 we discussed the overall climate of IT shifting to BT, but how does that affect the IT professional? Change is inevitable in technology, as it is for the staff, so adapting is something most staff can accomplish. That said, just because they are used to change doesn't mean everyone will be successful or capable in this shift. When you think of your team and some of your best workers, there is a chance you have a hard worker who is simply good at repetitive tasks or at troubleshooting specific issues, but whose ability to understand the larger picture isn't quite there. As a leader, this should be your focus to change, or help to develop.

As IT shifts to a more business-minded focus, it's important to shift with it and begin to split your time between furthering your technical knowledge and furthering your understanding of the industry and the business itself. After all, to come up with the best solutions, especially proactively, you need to be ahead of the power curve. IT professionals are used to thinking ahead, but they now also have to understand where the business and the industry are going. I remember reading an article a few years back in Wired (Unhappy at work? Be an Intrapreneur) and the first thing that struck me was the word intrapreneur. By definition it's "a manager within a company who promotes innovative product development and marketing"; in layman's terms, it's taking the situation you have and finding a way to develop something "new" for your current employer. At face value this is largely important for the worker to think about, and potentially do. The upside is that if you are able to develop that new idea for the company, it not only enhances the company, it also elevates your worth within the company, making it a win-win. A secondary byproduct is that the project usually enhances your skill set in some facet. It could give you a better understanding of the processes within the company, which is needed to elevate from within, or the technology being implemented could have a learning curve, giving you another skill-set notch on your resume. Either way, there is nothing to be lost by trying to improve the company while learning something at the same time.

Enhancing your skill set here is key, and not just your technology skill set but your business skill set. As companies become more dependent on technology, or grow, those who were involved in the early stages of integrating the technology have far more important tasks to deal with. This can slow down the addition of technology to the line of business when management tries to assess the impacts. An imperative skill for tech pros to learn is how the business works. If your company makes widgets, what's the process, why that process, and has anyone explored other ways to make those widgets? As you come to understand how the business does what it does, you may see specific technology advances that could apply. For instance, if Person A has to walk the widget to Person B to get it painted, implementing a conveyor belt could improve production times. Don't stop at applying what you already know, either. Start learning or researching technology in your company's industry to see what others are doing. Read about the complaints people have about your competitors and see if there is a process you can help streamline to increase incoming business. Take the time to understand the other departments and what they do. It isn't uncommon for the IT department to be a smaller subset of the "corporate" or overhead departments. That being said, why not look at some of the other administrative departments and see if there is something you can improve for your internal clients?

These are all examples of what you should be doing as Information Technology evolves into Business Technology: get to understand the business you are in, and look for ways to use technology to increase the proficiency of your company.

IT transforming to BT - Part I - Business Technology

BusinessTechnology.jpg

 

Over the last few years especially, it has become more apparent that the IT field is changing. What!?!? That can't be the case. We are all used to technology changing quickly, but it isn't the hardware or software that is changing the most anymore; it's the expectations of the field.

IT has always been complicated and required people who love and appreciate the field, but it's no longer just about upgrading to the next technology to embrace the newest features; it's becoming more about the business. One could argue it's always been about the business, and it has, but generally IT departments try to stay in the shadows, cut costs where they can and, most importantly, just keep things running. It has always been easier that way. Let's face it: unless you work for a tech company, IT is largely viewed as a necessary evil and/or an overhead cost.

 

With millennials entering the workforce, mobile devices, and BYOD, the shift has started from just keeping up the infrastructure and help desk support to being involved in the business itself. What used to be true, that IT looked similar in any business, has evolved into IT becoming BT (Business Technology). This change has brought IT to the table as a participant in corporate strategy. As a result, IT departments can be ahead of the curve and work with the business to provide tools that enhance it, versus reactively having to find, fix or retool technology to fix a single problem. This is a welcome change that will allow IT to proactively assist with line-of-business needs and will also be a cost-saving measure.

 

In IT management it's not uncommon to find a subset of your user base doing something because it enhances their process, but doing it either outside the policy controls of IT or in direct violation of those controls. As bad as that sounds, in the modern era this is not such a bad thing. You wouldn't want to curb this enthusiasm for working smarter, not harder, but at the same time you need to make sure IT is in the fold, if for no other reason than security. This is where executive management embracing BT is the key. Once the executives realize that the IT department could be used and staffed in such a way, it shifts the culture away from the "evil geek dictatorship" toward the customer-service-friendly professionals that most IT departments would prefer to be. Bringing IT to the business table helps provide a fresh perspective on the procedures currently being used in the business. That is not to say that any IT professional could run the business better than the other management staff, but IT people are natural-born troubleshooters. As a result, we are always looking to streamline or improve processes, so just being present can sometimes lead to great ideas, resulting in an increase in efficiency and a lower bottom line. At the end of the day, it is fair to say that just as it is important for IT management to streamline costs, it's always a welcome sign when we can apply that outside of our field as well to provide a larger impact on the business.


Office 365 - Get-User WARNING, at least one source array could not be cast down to the destination array

It's been one of those Monday mornings. About 7-10 days ago, a lot of tickets started being added to the help desk regarding e-mail signatures not being updated. Since there were multiple reports, I took note and kept an eye on the issue. The support team reached out to Exclaimer (which I'm not entirely sure I would use again, for the record; check out Crosswares) and they said they were getting a lot of reports of an "aggregation failing" and lookups not functioning. After 72 hours the help desk was calling and e-mailing daily for updates, to the equivalent of an "all circuits are busy" message. Then, finally, 10 days in, the following message arrives:

We have identified the error you are seeing in aggregation, and it is caused by a corrupted object in your Office 365 environment. We are currently creating a PowerShell script which will highlight the object(s) which have corruption, which will allow you to identify the cause of this error. This could be a mailbox or group in Office 365, and is likely to be something that has been added or changed recently. Once we have the script I will send this over to you, it should be ready early this week.

Once you have been able to identify the corrupt object, you will need to remove and recreate it in Office 365.

You can verify that this is the case by running the following scripts in PowerShell. The first connects to Office 365, please use an admin login when prompted. The 2nd simply tries to pull the usernames. If you see the same error when running this script, it will confirm that the issue is with a corrupt object:

Import-PSSession (New-PSSession -ConfigurationName Microsoft.Exchange -Credential $null -ConnectionURI https://ps.outlook.com/powershell -Authentication Basic -AllowRedirection) -AllowClobber

Get-User

You may be able to identify the problem account by using:

Get-User “”

This will fail for the corrupt object(s). Our script will check each object.

If you do have any further questions or queries, please let me know. I will be happy to help.

First off, those of us who use PowerShell will notice an issue with the command they sent over. It will error when you attempt to run it:
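For what it's worth, a corrected version of that connection (a sketch that simply prompts for admin credentials rather than passing $null) looks something like this:

# Prompt for Office 365 admin credentials, then import the Exchange Online session
$cred = Get-Credential
$session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "https://ps.outlook.com/powershell" -Credential $cred -Authentication Basic -AllowRedirection
Import-PSSession $session -AllowClobber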

Nonetheless, that was an easy adjustment to get connected to Office 365 and Exchange Online. Once connected, I went ahead and ran the Get-User command they suggested. It returned the following error:

WARNING: An unexpected error has occurred and a Watson Dump is being generated: At least one element in the source array could not be cast down to the destination array type.

Trying to narrow down which user or object it was could be difficult, especially depending on the size of your environment. In this case, I just started running the Get-User command, limiting the result size until I had a reasonable window.

Get-User -ResultSize 200
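If you have a larger environment, you could step through result sizes in a loop and watch for where the warning first appears; a rough sketch:

# Step the result size up and note where the warning starts showing
50,100,150,200,250 | ForEach-Object {
    Write-Host "Trying ResultSize $_"
    Get-User -ResultSize $_ | Out-Null
}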

After a few attempts I was able to narrow it down to user #60 in my environment. In this case I ran Get-User -ResultSize 59 and saw who the last user was, because I noticed that by default Get-User returns results in alphabetical order. Once I realized I was in the "I" users, I then ran:

Get-User -Filter {name -like 'I*'}

This gave me a list of the "I" users, and I noticed that user #60 happened to be the account created when you create a new group in Outlook Web Access to have a shared team. Given this was currently just a test, I removed the group via the Office 365 admin interface. At that point I was able to run the Get-User command without errors, and the Exclaimer aggregation worked again, solving the issue.

My frustration here stems from a few things. The first is Exclaimer: as a SaaS provider, your response time to find an issue like this needs to be quicker. Had I known how you were accomplishing your product, I could have found it in about 10 seconds, not 10 days. Second, why is it failing with the new groups? According to Google, a lot of people are using the Get-User command and seeing this error. Does this mean there is an issue with this feature and a bug report should be filed? What leads me down that path is the fact that my Get-User -Filter command actually returned the supposed "problem user," but it wouldn't return on a plain Get-User.

Microsoft Ignite - Cup half empty?

Ignite2015-e1435553400136.jpg

Despite this being the inaugural year for Ignite, to me it's another year of TechEd, which I have usually enjoyed the last several years. Typically this is where we get a glimpse of what is to come, maybe even some larger-than-life opportunities. This year, however, seemed greatly different, and to be honest, I'm not entirely sure why.

Initially I thought maybe it was because it was in my home metro area. As nice as that may seem, navigating Chicago is a chore on a good day. For instance, I'm in the suburbs; the first day, the 40-mile drive took over 2 hours. The second day I used public transportation. That means an hour-and-10-minute train ride, a 7-block walk to the Green Line, and a final 10-minute walk to McCormick Place. All in, less than 2 hours and not nearly as frustrating as the traffic. Compare that to waking up at the hotel and taking a 5-10 minute walk, and it's almost nicer to travel. And don't get me started on the cabs... just check Twitter on that front. Oddly enough, as crazy as they are, us locals drive like that to stay alive, so we are somewhat used to it.

Commuting aside, it didn't feel much better while I was there either. The lines always seemed longer than normal to me, and finding where you needed to be was just as difficult. Nothing like waiting in line for 20+ minutes for food only to ask yourself, "What is this?" Normally the lunches have always been somewhat foreign, but only because they were supposed to represent the local cuisine; the quality of the food was always pretty good. Not this year. The food was horrible, and that assumed there was any left by the time you got to the pick-up line. Combine that with it not even being close to what is normally available in Chicago. We have some great local-only chains, and not one was brought in for lunch, but there were fried pickles?!?

There was some good. I thought the keynote was strong, and I found it great to see Microsoft adapting to the industry instead of trying to bend us to their will. For that I have to give a lot of credit. The upcoming changes with Windows and Office 365 are great improvements.

There were also a few good lectures outlining things such as Nano Server and PowerShell. However, those were from the usual suspects, who are always great. Compare those lectures to the "Deep Dive, ask the experts about Outlook" session and you can see where the contrast comes from. Here we had some brilliant minds, but the only answer we heard the whole time was "it's on the road map, don't know when."

That seemed to be the theme for any of the questions I had or heard during my time at Ignite, and it became increasingly frustrating. They show some great things and have some good new products, but no answers. Now, I do understand that if they start outlining time frames it sets an expectation. However, we are all reasonable people; if it were laid out as "we are hoping to have this by then" or "here are the priorities, so when you see this you know that is next," it would help. Referring us to a road map that is online and doesn't mention 75% of what is in the discussion isn't exactly helpful.

My wish is that at events like this they realize that no matter how great things are "going" to be, and no matter the hard work being put in, we are the boots on the ground. In order for us to plan projects and keep the business at bay while waiting for the great solutions coming down the pipe from Microsoft, we too need to be able to set expectations. As good ole Jerry Maguire would say, help me help you!

Teamwork in Business "overhead" departments.

teamwork.jpg

Recently, teamwork spanning departments has been at the forefront as a result of a project that has ramifications across all of the "Shared Services" departments. By shared services, we are talking about the "overhead" departments in most mainstream businesses: Finance, Legal, IT, Marketing, Business Development, Admin Services. Traditionally, unless you are in a specific industry, most of these departments are not revenue-generating but are considered an overhead expense of doing business. What each of these departments can do to curb spending is a whole other discussion; however, it's important to note that just because they are overhead, they should still be able to justify their value and bring cost consciousness to their decisions.

Depending on the size of the business, the teamwork could vary at different levels of the organizational chart; however, a certain amount of teamwork needs to exist across these departments. As an example, the IT department couldn't institute any new policies without having the language cleared through Legal and replaced in corporate documentation by HR. A basic situation such as this also illustrates where issues regarding teamwork could arise: Legal and HR may not see the value in the updated policy, nor want to add the task to their workload. Having a leader who can illustrate the value of breaking down the department "silo" mentality is instrumental in providing a higher-level, consistent and efficient process for the end client. In this case, as an overhead department, the end client is the employees of the company. It is often lost on the shared-service departments that the revenue-generating employees are their clients; many times they are thought of as peers. It's important to keep the mindset: if our clients were external, would we treat them this way, or would we put in a different effort?

The goal is to have the overhead departments working in unison, being productive with streamlined efficiency in order to support the line-of-business activities of the company, keeping in mind that without the LOB employees there is no business for the overhead departments to support. Operations should flow with all of the staff across departments realizing they are a team. An example process that illustrates how that could work is the intake of a new employee. In theory, one of the LOB departments puts in the approval for a new hire to the HR department. It would then be up to HR to notify the other departments of the needs of the impending new hire: IT for user setup and equipment, Finance for payroll and expenses, Marketing for business cards. If that process isn't instituted or followed through correctly by the "team," it gives the new hire a poor impression of the company on their first day, and it hurts overall productivity and efficiency.

To usher in teamwork there are a few things that can help:

  • Communication: It is imperative that all these different departments are communicating regularly. There are always things that arise from discussion that someone from a neighboring department may be able to help with. When those situations arise, you can assign a "team" to a project vs. an individual.
  • Rewards: In some cases you could provide a reward for teamwork.
  • Accountability: Everyone should be held accountable for their role and/or projects. That also means that, regardless of who is involved, both positive and negative reinforcement should be applied.
  • Availability:  Management should be available to communicate any "direction" or conflict resolutions that may arise.

Lastly, it's crucial to know that teamwork can't always be built overnight and requires time. Be patient, and constantly check in with the members of the team to make sure their needs are being met.

 

Potential dream PBX - Skype for Business

SkypeLync.jpg

This week Office Mechanics gave a demo of the upcoming Skype for Business, which has been in preview since the Office 2016 preview became available. Initially I was just excited for the rebranding, simply because I knew it would bring tighter integration between Lync and Skype, mainly video chatting between the two. Aside from that, I selfishly despised the fact that two accounts were needed in the enterprise: one for your Office 365 account and Lync, and a second tied to a Microsoft ID in order to use Skype. Having Lync rebranded to Skype for Business eliminates both of those issues. Secondly, in my dealings with multiple PBX vendors over the years, I always wanted to see an easier connection between Lync and the PBX. Although there were PBX vendors that allowed integration, either natively or with third-party hardware, the configuration always seemed to have a complexity that was difficult to feel secure with. Additionally, there always seemed to be some sort of trade-off in functionality on either side. From what has been gathered or mentioned so far, this should resolve many of those issues.

The last thing I hope to hear more about is desktop phones. Realistically, we can't just remove physical phones from every desk, so I'm interested to see if it's going to stick with the Lync PBX integration model, or if there is going to be easier configuration for any SIP-compatible desktop phone.

Update!

So it appears the dream has come true. It's been hard to find more information regarding the where, when, who or what; however, it is called Office 365 Skype for Business with PSTN.

It's my understanding that those with an E4 license will be able to use this service. That being said, I'm not sure who the providers are going to be for the SIP trunking; my instinct is that it's going to be those that are part of Azure ExpressRoute.

Skype for Business is here—and this is only the beginning

Update

Microsoft Releases Preview program: http://nooch.co/1Itbcg7

The Cloud and evolving for IT Professionals

ITCloud.jpg

In business, many feel uncertain about moving resources and data to the cloud. The primary questions always become security and data control. While those are valuable concerns, they should be redirected toward the type of data and the partnered provider. That suggestion is based on the fact that not all providers are equal; for instance, think of trusting an enterprise like Microsoft vs. a local small-business IT firm. Both have pros and cons, but there is something to be said not only regarding the scale of the environment and the knowledge of the engineers, but also the inverse: having the ability to hold someone accountable. Switching to cloud-based services can also change the dynamic of the IT staff within the business. Just because services have transitioned to the cloud doesn't eliminate the need for internal IT staff. In my opinion, what it does is drastically change that dynamic. For instance, putting e-mail in Exchange Online can eliminate or reduce the need for an Exchange Server administrator in a fully hosted solution. That doesn't mean all your internal Exchange knowledge isn't needed, but the role has changed: it turns from a systems or infrastructure management role into an Exchange management role. There are still tasks that need to be performed, such as user creation and support, but the focus becomes more about management in the sense of setting corporate policy on retention, mailbox size, and other business-driven requirements. Essentially, even though it's still technical, the theory and principal knowledge is still required from your staff.

Overall, these are some of the things that the business and IT management need to consider: does it make sense for the business, how does the change affect staffing, and is the internal knowledge there to support the change?

As IT professionals, it's something we need to consider as well: how do we continue to adapt? I like to think there are two types of IT professionals, those who are jacks of all trades and those who are product specific. Both need to adapt and become a bit broader in breadth.

For instance, the Exchange administrator referenced above could easily transition into a "communications management" type role. Instead of working on the nuts and bolts of Exchange, they could learn how to manage Internet cloud-based communications and options such as Exchange Online and Skype for Business. They could transition into managing those roles for the company and leveraging the cloud to make them possible.

At the end of the day, as IT professionals our industry is changing, and depending on your industry you need to evolve as well. If you aren't working for a cloud provider, it would be beneficial to broaden your horizons and learn about the cloud.

Exchange 2013 Migration via Powershell script based upon search.

ems.jpg

ExchangeMigrationWeb.ps1

Overview:

During a migration from Exchange 2010 to 2013, we were working on changing some of our e-mail retention policies. We had implemented journaling through a Barracuda Message Archiver to retain our messages per company policy. Second to that, we also wanted to move away from our existing mentality of just letting people manage an unlimited "pot" of e-mail. For one, this isn't very cost effective; it also doesn't make for an Exchange environment that is easy to manage or to project future costs for.

Because of this, we were finally going to put mailbox quotas in place to force people to clean up their mailboxes. We already had retention policies in place; however, our average mailbox size was still well over 2 GB. That being the case, we decided to set a max size of 2 GB in order to allow for future growth projections and keep a relatively static cost for the high-end storage that hosts our DAG.

The first issue we ran into (other than how to deal with mailboxes over 2 GB) was how to migrate forward while dealing with those larger mailboxes. Even though we could look into Exchange and get a list of all the mailboxes currently below the 2 GB quota, having to parse through the Migration Job Wizard and manually select all those users would be tedious. So... a script is in order to handle this for us.

The "how":

Well, even as great as Exchange is, it doesn't make this easy to accomplish. The TotalItemSize property that contains the full mailbox size is returned by the Get-MailboxStatistics cmdlet. However, the New-MigrationBatch cmdlet requires an e-mail address in order to process a migration, and that is NOT returned by Get-MailboxStatistics. There are several commonalities between the various cmdlets, such as GUID, DisplayName and so forth; we decided to use DisplayName from Get-Mailbox.

Essentially, we ran Get-MailboxStatistics with a filter based on the TotalItemSize being less than 1.5 GB and the mailbox not already existing in the new databases. We then ran Get-Mailbox to return all mailbox DisplayNames and compared the two result sets to build a text file that could be looped over to return each mailbox's PrimarySmtpAddress from Get-Mailbox, giving us the information needed for the migration batch file.

Below is a snippet of that code. You will also notice there is some trimming and parsing of the file in order to translate the output of Get-MailboxStatistics into the format needed to run the loop that pulls the e-mail addresses.

###     SET YOUR VARIABLES FOR THE SEARCH CRITERIA      ####
$ServerSearchVariable="*ex2013*"
$TotalItemSizeVariable="100MB"

###     SET YOUR VARIABLES FOR THE COMPARE and IMPORT      ####
$CompareFile="c:\temp\compare.txt"
$PrimarySMTP="C:\temp\PrimarySMTP.txt"
$MigrationEmails="C:\temp\MigrationEmails.txt"
###     Do the compare of MBStats based upon Total Item size set above and the server name variable
Write-Host -foregroundcolor Yellow "Running the compare to gather the list of users who will be part of this migration"
$MBStats=Get-Mailbox | Get-MailboxStatistics | Where-Object {$_.TotalItemSize -lt $TotalItemSizeVariable -and $_.ServerName -notlike "$ServerSearchVariable"} |Select-Object DisplayName
$MBName=Get-Mailbox | Select-Object DisplayName
$FileCompare=Compare-Object $MBStats $MBName -IncludeEqual
$FileCompare | Where-Object {$_.SideIndicator -like "=="} | Out-File $CompareFile
###  Here I am Trimming the file to get it ready for the comparison
Write-Host -foregroundcolor Yellow "Trimming and parsing file"
(Get-Content $CompareFile) | ForEach-Object {$_ -replace "@{DisplayName=", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ForEach-Object {$_ -replace "}", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ForEach-Object {$_ -replace "InputObject", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ForEach-Object {$_ -replace "SideIndicator", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ForEach-Object {$_ -replace "-----------   ", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ForEach-Object {$_ -replace " --  ", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ForEach-Object {$_ -replace " ==  ", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ForEach-Object {$_ -replace " ", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ? {$_.trim() -ne "" } | Set-Content $CompareFile
###     Comparing the Get-MailboxStatistics search to the full list of e-mail addresses and returning PrimarySMTP to setup the text file for the migration
Write-Host -foregroundcolor Yellow "Comparing the files and translating to e-mail addresses"
$FinalCompare=Get-Content $CompareFile
Foreach ($line in $FinalCompare)
{
    $smtp=Get-Mailbox | Where-Object {$_.Name -eq "$line"} | Select-Object PrimarySmtpAddress
    Add-Content $PrimarySMTP $smtp
}
### Pruning File prior to import
Write-Host -foregroundcolor Yellow "Final Pruning"
(Get-Content $PrimarySMTP) | ForEach-Object {$_ -replace "@{PrimarySmtpAddress=", ""} | Set-Content $PrimarySMTP
(Get-Content $PrimarySMTP) | ForEach-Object {$_ -replace "}", ""} | Set-Content $PrimarySMTP

The above code gives you a list of e-mail addresses based on the search criteria you set and puts it into the proper format for the New-MigrationBatch cmdlet. The file that is created will look like:

EMailAddress
user1@domain.com
user2@domain.com
user3@domain.com
user4@domain.com
...

Below is the rest of the script (also attached). The first portion makes sure that the location of the temp files is clean, on the off chance it wasn't already. The last portion not only starts the Exchange migration, but also cleans up after itself.

#### Cleanup of Previous files if they existed 

    if (Test-Path C:\temp\compare.txt)
    {
        Remove-Item C:\temp\compare.txt
    }
    else
    {
        Write-Host -foregroundcolor Gray "Compare.txt didn't exist"
    }

    if (Test-Path C:\temp\PrimarySMTP.txt)
    {
        Remove-Item C:\temp\PrimarySMTP.txt
    }
    else
    {
        Write-Host -foregroundcolor Gray "PrimarySMTP.txt didn't exist"
    }

    if (Test-Path C:\temp\MigrationEmails.txt)
    {
        Remove-Item C:\temp\MigrationEmails.txt
    }
    else
    {
        Write-Host -foregroundcolor Gray "MigrationEmails.txt didn't exist"
    }

###     SET YOUR VARIABLES FOR THE SEARCH CRITERIA      ####

$ServerSearchVariable="*ex2013*"
$TotalItemSizeVariable="400MB"

###     SET YOUR VARIABLES FOR EXCHANGE ENVIRONMENT     ####
$ExchDB="EX2013-DAG1"
$MigrationName="Under 400 MBv2"

###     SET YOUR VARIABLES FOR THE COMPARE and IMPORT      ####

$CompareFile="c:\temp\compare.txt"
$PrimarySMTP="C:\temp\PrimarySMTP.txt"
$MigrationEmails="C:\temp\MigrationEmails.txt"

###     Do the compare of MBStats based upon Total Item size set above and the server name variable

Write-Host -foregroundcolor Yellow "Running the compare to gather the list of users who will be part of this migration"

$MBStats=Get-Mailbox | Get-MailboxStatistics | Where-Object {$_.TotalItemSize -lt $TotalItemSizeVariable -and $_.ServerName -notlike "$ServerSearchVariable"} |Select-Object DisplayName
$MBName=Get-Mailbox | Select-Object DisplayName
$FileCompare=Compare-Object $MBStats $MBName -IncludeEqual
$FileCompare | Where-Object {$_.SideIndicator -like "=="} | Out-File $CompareFile

###  Here I am Trimming the file to get it ready for the comparison

Write-Host -foregroundcolor Yellow "Trimming and parsing file"
(Get-Content $CompareFile) | ForEach-Object {$_ -replace "@{DisplayName=", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ForEach-Object {$_ -replace "}", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ForEach-Object {$_ -replace "InputObject", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ForEach-Object {$_ -replace "SideIndicator", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ForEach-Object {$_ -replace "-----------   ", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ForEach-Object {$_ -replace " --  ", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ForEach-Object {$_ -replace " ==  ", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ForEach-Object {$_ -replace " ", ""} | Set-Content $CompareFile
(Get-Content $CompareFile) | ? {$_.trim() -ne "" } | Set-Content $CompareFile

###     Comparing the Get-MailboxStatistics search to the full list of e-mail addresses and returning PrimarySMTP to setup the text file for the migration

Write-Host -foregroundcolor Yellow "Comparing the files and translating to e-mail addresses"
$FinalCompare=Get-Content $CompareFile
Foreach ($line in $FinalCompare)
{
    $smtp=Get-Mailbox | Where-Object {$_.Name -eq "$line"} | Select-Object PrimarySmtpAddress
    Add-Content $PrimarySMTP $smtp

}

### Pruning File prior to import
Write-Host -foregroundcolor Yellow "Final Pruning"
(Get-Content $PrimarySMTP) | ForEach-Object {$_ -replace "@{PrimarySmtpAddress=", ""} | Set-Content $PrimarySMTP
(Get-Content $PrimarySMTP) | ForEach-Object {$_ -replace "}", ""} | Set-Content $PrimarySMTP

###   SENDING NOTIFICATION MESSAGE
###   Setting Variables for the message   ###

$Smtp = "SMTP SERVER" 
$From = "noreply@DOMAIN.com" 
$CC=""
$BCC=""
$Subject = "Your E-Mail Box is Migrating"  
$Body = get-content C:\TEMP\content.html

#### Now send the email using Send-MailMessage

### IF YOU NEED TO CC or BCC you can comment out the current Send-MailMessage Line and uncomment the one containing the CC and BCC arguments
# Send-MailMessage -SmtpServer $Smtp -To $To -From $From -CC $CC -BCC $BCC -Subject $Subject -Body "$Body" -BodyAsHtml -Priority high 

$NotificationPerson=Get-Content $PrimarySMTP
Foreach ($person in $NotificationPerson)
{
Send-MailMessage -SmtpServer $Smtp -To $person -From $From -Subject $Subject -Body "$Body" -BodyAsHtml -Priority high 

}

###  File pruned, need to add the EmailAddress header to format the import file
Write-Host -foregroundcolor Yellow "Reformatting Migration file"
 Add-Content -Path $MigrationEmails -Value EmailAddress
 Add-Content -Path $MigrationEmails -Value (Get-Content $PrimarySMTP)

###     BEGIN MIGRATION   ####
Write-Host -foregroundcolor Yellow "Adding Migration to Exchange 2013"
New-MigrationBatch -Name "$MigrationName" -CSVData ([System.IO.File]::ReadAllBytes("$MigrationEmails")) -Local -TargetDatabase $ExchDB -AutoStart -AutoComplete

Write-Host -foregroundcolor Yellow "##################################"

    if (Get-MigrationBatch -Identity "$MigrationName" | Where-Object {$_.Identity -like "$MigrationName"})
    {
        Write-Host -foregroundcolor Yellow "Migration Batch of $MigrationName has started"
    }
    else
    {
        Write-Host -foregroundcolor Yellow "$MigrationName did NOT START"
    }

Write-Host -foregroundcolor Yellow "##################################"
Write-Host -foregroundcolor Yellow "Cleaning Up Files"
Write-Host "Starting sleep to allow upload."
Start-Sleep 30

###   CLEANUP FILES

#Remove-Item $CompareFile
#Remove-Item $PrimarySMTP
#Remove-Item $MigrationEmails

#Write-Host -foregroundcolor Yellow "$CompareFile , $PrimarySMTP , and $MigrationEmails were removed"
Write-Host -foregroundcolor Yellow "COMPLETE"

ExchangeMigrationWeb.ps1

List All Users in an OU

powershell1.jpg

Overview:

There may be a time when you need to list all the users of a specific OU, not the entire domain. An example is if your Organizational Units are broken down by department and you want to compare a department against an active employee roster. Other times, you may separate your OUs into one that contains your company accounts and another for service accounts.

Knowledge:

For this case, you can use the Get-ADUser cmdlet. Below is a screenshot of the default properties shown, and a second showing the extended properties.

Extended Properties:

Example Command:

This command will search the OU "Company" in the domain example.com. It will also select just the Surname and GivenName, sorting by Surname.

Get-ADUser -Filter * -SearchBase "ou=company,dc=example,dc=com" | Select-Object Surname, GivenName | Sort-Object Surname

If you want to pipe it out to a CSV file so you can open it in Excel, you can do that with the Export-Csv cmdlet:

Get-ADUser -Filter * -SearchBase "ou=company,dc=example,dc=com" | Select-Object Surname, GivenName | Sort-Object Surname | Export-Csv c:\temp\AllUsers.csv
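If you also want attributes outside the default set (for example e-mail address or department), you have to request them with -Properties. A quick sketch, assuming those attributes are populated in your directory:

# Request extra attributes, then export a sorted CSV without the type header row
Get-ADUser -Filter * -SearchBase "ou=company,dc=example,dc=com" -Properties EmailAddress, Department | Select-Object Surname, GivenName, EmailAddress, Department | Sort-Object Surname | Export-Csv C:\temp\AllUsers.csv -NoTypeInformation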
References:

TechNet Wiki Get-ADUser

Exchange 2013 - DAG - Failed And Suspended

Overview:

I happened to be in the Exchange Control Panel and noticed that our DAG listed the status of one of its members as "Failed and Suspended." I was perplexed that we didn't catch this from our monitoring or anywhere else, but that's a whole other issue. My concern here was that it had obviously failed. Attempting an Update or Resume resulted in no feedback in the ECP and no change in status.

Troubleshooting:

In this case, this is the message I saw when running Get-MailboxDatabaseCopyStatus in the EMS:

It was here, as I mentioned above, that the status didn't change upon attempting to update or resume the database copy. I attempted an Update-MailboxDatabaseCopy, which one would assume would reseed the database; I even added the -DeleteExistingFiles switch specifically to start from scratch, yet received "The seeding operation failed... ...which may be due to a disk failure."
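For reference, the status check and reseed attempt looked roughly like this (a sketch; the database and server names are placeholders):

# Check the copy status, then attempt a reseed from scratch (placeholder names)
Get-MailboxDatabaseCopyStatus -Identity "DB01\EX2013-MBX2" | Format-List Name, Status, ContentIndexState
Update-MailboxDatabaseCopy -Identity "DB01\EX2013-MBX2" -DeleteExistingFiles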

At this point, one would expect I assumed there was a disk problem. Instead, I checked to make sure the disk was mounted, even browsed it, and assumed this was a typical non-descript error message. I then decided that the beauty of a DAG and having multiple copies is that I could just "whack" the DB copy and reseed from scratch. I went through the process of removing the mailbox DB copy with Remove-MailboxDatabaseCopy:
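The removal itself is a one-liner (again with placeholder names):

# Remove the failed and suspended copy so it can be re-added cleanly
Remove-MailboxDatabaseCopy -Identity "DB01\EX2013-MBX2" -Confirm:$false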

That proceeded as expected. However, as the message states, I went to clear the items (specifically the logs) manually and, strangely, received the error "Remove-Item: The file or directory is corrupted and unreadable."

At this point I was surprised, and decided there actually had to be a disk issue. I browsed manually back to the location, attempted to delete a log file by hand, and received the same popup within Windows. I was amazed; I actually had a disk problem. This was only strange to me because our underlying disk is actually a NetApp LUN, and that LUN holds all three DB copies from each of the three servers in this instance. So for one disk to be corrupted and not all three (first off, thank God!), I was miffed. At this point I went ahead and formatted both the drive that contained the EDB and the one with the LOG files.

After confirming that the DB copy status no longer showed the original copy, I went ahead and ran the Add-MailboxDatabaseCopy command to reseed a copy of the DB from scratch. Voila, it worked and began copying over.
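Re-adding the copy kicks off a fresh seed from the active copy (placeholder names again):

# Re-add the database copy, which seeds a brand new copy from the active database
Add-MailboxDatabaseCopy -Identity "DB01" -MailboxServer "EX2013-MBX2"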

The WHY:

I suspect, from looking at the log dates on the server and the time it was last inspected, that it relates to a power outage we sustained. About three weeks back we had a situation where we were getting bad power from both grids that feed our building, and from the datacenter UPS. After dealing with the bad power, our Emerson UPS decided it had had enough and was toggling between battery and no battery power. Because it was toggling so frequently, it actually depleted the batteries. Despite knowing that, we left our systems up while the batteries charged, since power seemed to be okay: no flickers, nothing. Then Newton struck, and before the batteries had enough juice to sustain a brown-out, another one hit; that is my best guess as to how this one disk ended up corrupted.

Exchange 2013 - Custom DLP Sensitive Information Rules

Overview:

We recently found the need to filter, or at least be aware of, e-mails being sent that contained specific information. An example would be legal matters, where certain information shouldn't be e-mailed outside of the company. By creating a Sensitive Information rule and combining that with a Data Loss Prevention policy, you can have that information blocked, or at least have the appropriate person or persons notified.

Overall, this scenario came about with regard to building a better internal auditing system. Creating the DLP policy alone isn't the only thing needed to build a more complete picture of what users may be doing; it is only a piece of the puzzle. In most cases you need to combine it with at least file server auditing and local workstation auditing to build the larger picture.

The How:

Creating and importing custom Classifications

  1. First you need to create your custom policy XML (the example is at the end of this post).
  2. Save it as a UTF-8 encoded file with an .xml extension.
  3. Open the XML in Internet Explorer; if it's formatted correctly you will see the XML rendered.
  4. Then import it with PowerShell: New-ClassificationRuleCollection -FileData ([Byte[]]$(Get-Content -Path "<PATH TO YOUR XML>" -Encoding Byte -ReadCount 0))
  5. Once it's imported you should be able to create a new DLP policy using the EAC.

Creating a custom DLP Rule

  1. Log in to the EAC (e.g. https://mail.domain.com/ecp).
  2. Click Compliance Management, then data loss prevention.
  3. Click the + (Add), then New custom policy.
  4. Name your policy, choose your mode (I like to test with Policy Tips), and click Save.
  5. Select the policy and click Edit to open your new policy.
  6. Select Rules from the left.
  7. Click + to create a new rule.
  8. In the "Apply this rule if" field, choose "The message contains sensitive information...".
  9. Click "*Select sensitive information types..." (if applicable).
  10. Click + to choose from the list.
  11. You should now see your new classification.

Useful Tools

The one thing I noticed that caused some issues with other examples, such as http://technet.microsoft.com/en-us/library/jj674703%28v=exchg.150%29.aspx and http://exchangemaster.wordpress.com/2013/05/15/creating-custom-dlp-classification-rules-and-policy/, is that they specify UTF-16 in the header, and the TechNet example uses a command block. I found that using either example caused an error upon import via PowerShell. Notice in my example below that I had to switch the encoding to UTF-8 to get PowerShell to even read the XML.

You need to make sure you replace the GUIDs below with self-created ones, as noted above.
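If you need fresh GUIDs for the RulePack, Publisher and Entity ids, PowerShell will generate them for you:

# Generate three new GUIDs to paste into the XML below
1..3 | ForEach-Object { [guid]::NewGuid().Guid }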

<?xml version="1.0" encoding="utf-8"?>
<RulePackage xmlns="http://schemas.microsoft.com/office/2011/mce">
  <RulePack id="797f6b49-682c-42e4-8577-aac6eadd1428">
    <Version major="2" minor="0" build="0" revision="0"/>
    <Publisher id="1a2d8dc3-075b-4ad5-8116-20e90314ade2"/>
    <Details defaultLangCode="en-us">
      <LocalizedDetails langcode="en-us">
        <PublisherName>Aaron Bianucci while at FHP</PublisherName>
        <Name>Test Keyword</Name>
        <Description>This is a test rule package</Description>
      </LocalizedDetails>
    </Details>
  </RulePack>
  <Rules>
    <Entity id="365fa6fb-9a59-4750-b82f-14647b382319" patternsProximity="300" recommendedConfidence="85" workload="Exchange">
      <Pattern confidenceLevel="85">
        <IdMatch idRef="Regex_Exchange" />
        <Any minMatches="1">
          <Match idRef="Regex_DLP" />
          <Match idRef="Regex_2013" />
        </Any>
      </Pattern>
    </Entity>
    <Regex id="Regex_Exchange">(?i)(\bExchange\b)</Regex>
    <Regex id="Regex_DLP">(?i)(\bDLP\b)</Regex>
    <Regex id="Regex_2013">(?i)(\b2013\b)</Regex>
    <LocalizedStrings>
      <Resource idRef="365fa6fb-9a59-4750-b82f-14647b382319">
        <Name default="true" langcode="en-us">Test Rule Pack AMB</Name>
        <Description default="true" langcode="en-us">Test rule pack - Detects Aaron Drone</Description>
      </Resource>
    </LocalizedStrings>
  </Rules>
</RulePackage>

Enterprise Windows Lock Screen Customization

images.jpg

Lately we have been attempting to rein in the personal customization of corporate resources, not to be authoritarian but to provide a unified presence. Initially we were just going to do the computers that are "public," such as digital signage and conference rooms, but since it had a great reception we decided to go all the way through all computers. In our case, we have some legacy XP machines and then a mixture of Windows 7/8. Each OS has its own process, albeit similar.

Windows XP:

Create WMI Filter for GPO:
  1. Open the Group Policy Editor and proceed to WMI Filters:
  2. Create a new Filter, I titled mine "Lockscreen - Windows XP"
    1. Namespace: root\CIMv2
    2. Query: select * from Win32_OperatingSystem where (Version like "5.1%") and ProductType="1"
Create the GPO:
  1. Create and Link a GPO to the OU where the workstations reside. I called mine "Lockscreen - Windows XP" so that as you create them for the other OSes it's easy to know which is which.
  2. Go to: Computer Config \ Preferences \ Windows Settings \ Files
    1. Create a File Action with the following information
    2. Action: Replace
    3. Source: the Network or Accessible location for the original BMP File. ( *** Must be a BMP file for XP ***)
    4. Destination: Local location. I tend to put everything for the company in c:\COMPANY. In this case I also made the filename Lockscreen.bmp so that it is always overwritten as it changes, thus not filling up the HD.
    5. In order to apply the lock screen you need to add a registry setting.
      1. Action: Update
      2. Hive: HKEY_USERS
      3. Key Path: .DEFAULT\Control Panel\Desktop
      4. Value Name: Wallpaper
      5. Value Type: REG_SZ
      6. Value Data: Path in step 2d for the local file

Windows 7:

Create WMI Filter for GPO:
  1. Open the Group Policy Editor and proceed to WMI Filters:
  2. Create a new Filter, I titled mine "Lockscreen - Windows 7"
    1. Namespace: root\CIMv2
    2. Query: select * from Win32_OperatingSystem where (Version like "6.1%") and ProductType="1"
Create the GPO:
  1. Create and Link a GPO to the OU where the workstations reside. I called mine "Lockscreen - Windows 7" so that as you create them for the other OSes it's easy to know which is which.
  2. Go to: Computer Config \ Preferences \ Windows Settings \ Files
    1. Create a File Action with the following information
    2. Action: Replace
    3. Source: the Network or Accessible location for the original PNG or JPEG
    4. Destination: c:\windows\system32\oobe\Info\Backgrounds\backgroundDefault.jpg. The path is critical here in Windows 7: it has to be that exact location and file name, and the image should be kept under 256 KB or Windows will ignore it.
    5. Now we have to enable the "OEM Background" feature by creating a registry item under Windows Settings.
      1. Action: Update
      2. Hive: HKEY_LOCAL_MACHINE
      3. Key Path: SOFTWARE\Microsoft\Windows\CurrentVersion\Authentication\LogonUI\Background
      4. Value Name: OEMBackground
      5. Value Type: REG_DWORD
      6. Value Data: 00000001
      7. Base: Hex
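If you want to test the Windows 7 behavior on a single machine before rolling out the GPO, a minimal local sketch (assuming an elevated PowerShell prompt and a placeholder share path) looks like this:

# Enable the OEM background flag and drop the image in the required location (local test only)
$key = "HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Authentication\LogonUI\Background"
if (-not (Test-Path $key)) { New-Item -Path $key -Force | Out-Null }
Set-ItemProperty -Path $key -Name OEMBackground -Value 1 -Type DWord

$dest = "C:\Windows\System32\oobe\Info\Backgrounds"
if (-not (Test-Path $dest)) { New-Item -Path $dest -ItemType Directory -Force | Out-Null }
Copy-Item "\\server\share\Lockscreen.jpg" (Join-Path $dest "backgroundDefault.jpg")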

Windows 8:

Create the GPO:
  1. Create and Link a GPO to the OU where the workstations reside. I called mine "Lockscreen - Windows 8" so that as you create them for the other OSes it's easy to know which is which.
    1. Go to: Computer Config \ Preferences \ Windows Settings \ Files
    2. Create a File Action with the following information
    3. Action: Replace
    4. Source: the Network or Accessible location for the original PNG File.
    5. Destination: Local location. I tend to put everything for the company in c:\COMPANY. In this case I also made the filename Lockscreen.PNG so that it is always overwritten as it's changed thus not filling up the HD.
    6. Computer Config \ Policies \ Admin Templates \ Control Panel \ Personalization
      1. Configure "Force a specific default lock screen image" to Enabled with the path mentioned above
        1. Personally I also chose to enable "Prevent changing lock screen image" as well. This stops the end user from switching it.
        2. Another optional setting I also chose to implement was changing the Start menu background color. In our case we are trying to go for a "look" or image. By default it would be set up however the user had configured it, so your image could be a black background with white lettering; if the end user has an alternate monitor and selected purple, you would have your logo on a purple screen rather than something that "flowed". In this case you can:
          1. Computer Config \ Policies \ Admin Templates \ Control Panel \ Personalization
          2. Select Force a specific background and accent color. You can use HEX codes to set a matching color.
          3. Secondly, I set Force a specific Start background. Either pick the same matching color, or choose 20 for transparent so it uses the desktop.


Logicmonitor - Network and Server Monitoring

2015-03-30_15-33-111.png

As an IT, systems or network administrator, you need to stay ahead of issues and be the first to know when something in your network isn't functioning as it should. This could be something simple like web services, or something more advanced like a load balancer. Either way, it's critical to be alerted and notified when an issue arises. LogicMonitor does all of that, at a great price point, and with an extremely quick install right out of the box.

Price:

Although this may vary depending on what you need to monitor, pricing (as of this writing) is based on a monthly subscription model. Unlike other monitoring solutions, which are "pay per monitor" or a straight enterprise agreement, LogicMonitor has a flat fee per host per month. That host, again, could be as simple as a DNS host where you want to monitor DNS availability and response times, or it could be something more advanced such as an Exchange server. Either way, the cost is the same per host per month. Personally I find this to be a great thing; there are many times where you have basic functions you want to track, such as CPU, memory and network bandwidth, plus the function of the server. In a DNS server situation, that could easily cost "4 monitors" by other vendors' standards. Then consider an Exchange server, where you want at least about 20 monitors, and the cost could add up quickly.

Installation:

This is about as painless as it gets. A simple install of their collector software on a machine you would like to use internally gets the ball rolling quickly. The machine it is installed on should have access to SNMP, WMI and PerfMon. Depending on your device, that configuration could be different; for instance, on Cisco devices you typically have to "allow" SNMP from a specific IP address. Regarding WMI and PerfMon, the user account under which the collector runs should have rights to view the associated Windows information. Another perk of the collector is that you can have more than one, and in more than one location. As mentioned above, LM is licensed per host, not per location, so if you have enough machines to require multiple collectors in one location, you can do that. If you have multiple offices, as in my situation, you can have a co-located collector as well.

Performance:

I have yet to have an issue here. If you are going to use NetFlow, it is recommended to have a dedicated collector for that task, since it's resource intensive. NetFlow aside, we have around 40 hosts running off one collector without an issue. The only time I have ever noticed a problem was pulling data from Windows machines that have high CPU for long periods of time. This is a common problem with PerfMon counters, so when possible make sure you are using WMI.

Custom Counters:

With the use of scripts you can have custom counters or monitors as well. For instance, natively, Exchange doesn't give you a WMI or PerfMon counter for the state of a DAG member; however, that information can be viewed through PowerShell. With that, you can create a PowerShell script that returns a value for the state of the DAG member and have LM track and alert on that information.
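As an illustration, a script datasource along these lines (a sketch; the exact output format depends on how the LogicMonitor datasource is defined) can report a numeric health value per database copy:

# Emit one name=value line per database copy: 0 for Healthy/Mounted, 1 for anything else
$copies = Get-MailboxDatabaseCopyStatus -Server $env:COMPUTERNAME
foreach ($copy in $copies) {
    $value = if ($copy.Status -eq 'Healthy' -or $copy.Status -eq 'Mounted') { 0 } else { 1 }
    Write-Output ("{0}={1}" -f $copy.Name, $value)
}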

Overall, LogicMonitor is a great monitoring solution: it works right out of the box, it's customizable, and it comes at a great price point. It should be considered as an option by everyone.

Team Collaboration and Project Management Multimedia Area

img_5519850cb3d50.png


I was happy to finish a midsize collaboration area today. In its current use, this room is set up for a project team to collaborate on bidding for construction work; however, the use, or the potential use, is the same for a standard conference room, team collaboration space or project management hub. The key concept here is to incorporate enough technology into the room to be helpful and easy to use without being distracting. Depending on the use at the moment, there are basically four areas or "configurations."

In the image below there is a laptop with its screen duplicated to the right-most TV monitor. The public PC with dual monitors has one screen duplicated to the other TV, while the third monitor is using its own PC, in this case to display Google Earth.

Here is a drawing illustrating the actual connections and relative locations of the items. The cabling is run in 2" conduit, either below the floor or in the walls, to hide most of it (assuming it gets tucked behind the TVs after testing (smile)).

As illustrated, there are multiple connection types. The local PC is connected via HDMI, and there are two VGA cables on the desk for each monitor. That said, the NEC large-format displays can take a second HDMI, DisplayPort, DVI or component input. In our case, our enterprise laptops are mostly VGA or HDMI. More importantly, this room has a singular function, so we didn't pull all the "potential" options, just what was required.

Components:

Table setup: This is a standard HP Elite workstation, our standard-issue computer. The only difference is that we added an off-the-shelf ATI video card that can support multiple (in this case six) full 1080p monitors. This covers the two HP IPS monitors on the desk and the two NEC LFDs on the wall. Using the diagram above as a reference, we took the left monitor (blue) and duplicated it to the left LFD (also blue). The orange and green displays are "extended desktops." Generally, in the workflow that happens in these areas, we have one person "running" things who is sharing a screen (blue) but also has a screen (orange) for e-mail and other non-shared items. The green screen is more of an extra at this point; it allows a second person to plug in a laptop and share, or the conference room PC to put something up on that end for reference.

Large Format Displays: These are NEC V463 displays. They are commercial grade, which gives us a 46" LED display with 10 W speakers. We use these not only in spaces such as this but also for our digital signage throughout our locations.

Touch Screen Setup: This is the best part of the room. There is a Perceptive Pixel display from Microsoft with an Intel NUC PC attached to the back. Given that the Intel NUC is running Windows 8.1, this 55" touch-screen monitor becomes a 55" Surface tablet. It has 10+ multi-touch points and a pen that allows for manipulating the screen. Anything you have read about the Surface or Windows 8 in a touch-screen environment applies to this beast. Second to that, the monitor is stunning in its picture and responsiveness. The only downside is the two DVI input ports rather than HDMI or DisplayPort; in this specific case that's limiting, but with some cable converters you can make it work with any digital input. For audio we picked a standard Vizio soundbar. Sound isn't included in the display, so if you are going to need sound, this is a must. In our case, with Lync 2013 for meetings, sound was important, as we can leverage any off-the-shelf Windows 8-supported camera for video conferencing.

Overall this is a pretty good setup. It allows for video conferencing via any Windows 8-supported service (we use Lync) on the Perceptive Pixel display, while at the same time being collaborative both inside the room and remotely.