Thursday, September 19, 2013

Is It a VM?

Having a hard time keeping track of which machines on your network are virtual?

I have just the script for you. I wanted to have a way to quickly find out what machines were virtual at work, so I whipped this up in PowerShell today.

Param(
    # Accepts a single computer name or a comma-separated list; defaults to the local machine
    [string[]]$ComputerName = 'localhost',
    [string]$ComputersListFile
)

# Build the list of computers to check, from a file if one was given
if ($ComputersListFile) { $computers = Get-Content -Path $ComputersListFile }
else { $computers = $ComputerName }

if (-not $ComputersListFile -and $ComputerName -eq 'localhost') {
    # Local check - no remoting needed
    if ((Get-WmiObject -Class Win32_BIOS).SerialNumber.StartsWith("VMware")) { Write-Output "This machine is a VM!" }
    else { Write-Output "This machine is probably not a VM" }
}
else {
    # Remote check - requires PowerShell remoting on the targets
    Invoke-Command -ComputerName $computers -ScriptBlock {
        $hostname = hostname
        if ((Get-WmiObject -Class Win32_BIOS).SerialNumber.StartsWith("VMware")) { Write-Output "$hostname is a VM!" }
        else { Write-Output "$hostname probably is not a VM" }
    }
}

As written, this only works for VMware virtualization, not Hyper-V or VirtualBox, but it would be easy to add tests for those too. The test simply looks at the BIOS SerialNumber and checks whether it starts with the string "VMware". You could also look at the MAC address of the NIC and see if the first few digits match one of the VMware prefixes, but the BIOS string was easier.
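If you ever want to try the MAC address approach, here is a rough sketch of what it might look like. The OUI prefixes in the list are the ones commonly documented for VMware adapters; verify them against VMware's own documentation before relying on this.

# Rough sketch of the MAC-prefix approach (not part of the script above).
# The OUI list below is what is commonly documented for VMware; double-check it.
$vmwarePrefixes = '00-05-69', '00-0C-29', '00-1C-14', '00-50-56'

Get-WmiObject -Class Win32_NetworkAdapterConfiguration -Filter "IPEnabled = True" |
    ForEach-Object {
        # Normalize the MAC and grab the first three octets (the OUI)
        $prefix = ($_.MACAddress -replace ':', '-').Substring(0, 8)
        if ($vmwarePrefixes -contains $prefix) {
            Write-Output "$($_.Description) has a VMware MAC prefix ($prefix)"
        }
    }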


Run it like this:

.\IsItAVM.ps1 -ComputerName computer1, computer2, computer3

or like this if you have a list of computers already in a file:

.\IsItAVM.ps1 -ComputersListFile U:\computers.txt

Of course, the remote check only works if the computer you are testing is running Windows with PowerShell, has PowerShell remoting enabled, and isn't blocked by a firewall... but in an IT environment where PowerShell is already being used for management, this fits in nicely.
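If you aren't sure whether a target machine will accept the Invoke-Command call, Test-WSMan is a quick way to find out before running the script (the computer name below is just a placeholder):

# Returns WS-Management details if remoting is reachable, throws an error if not
Test-WSMan -ComputerName computer1

# On the target machine itself, remoting can be turned on (from an elevated prompt) with:
Enable-PSRemoting -Force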

Have fun!

--
Edit: Added a little bit of code to better handle the localhost case. If you just run it as .\IsItAVM.ps1, or manually specify only localhost for -ComputerName, it will not use Invoke-Command, so remoting does not need to be turned on for a localhost check. It also gives a slightly different message in that case.

Wednesday, September 11, 2013

Why You Want Your Next File Server To Be Win2012 - Dedup

In the teaser post I showed you this image. It is a little bit misleading: my 722 GB of data actually occupies 483.72 GB of disk space.
To compress it down to less than 2 GB would require a kind of black magick even Microsoft isn't able to produce.


I am using Windows Server 2012 data deduplication, a new technology that saves disk space by splitting files into chunks and keeping only a single copy of any chunk that shows up more than once. So if you have several copies of the same file, even if they differ slightly from one another, or if you have files with a lot of internal repetition (like log files), it can save you a lot of space.

Because I'm using Windows data deduplication, the size on disk only shows the size of the metadata. With PowerShell's Measure-DedupFileMetadata command I can get the actual space used.




Adding the SavedSpace reported by Get-DedupStatus to the DedupSize reported by Measure-DedupFileMetadata does indeed come to 721.26 GB... just short of the 722.68 GB Size reported by Measure-DedupFileMetadata and the 722 GB reported by the GUI. Allowing a little margin for rounding, that seems right to me.
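For the curious, these are roughly the commands behind those numbers; the drive letter and folder path are just examples from my setup:

# Volume-wide dedup statistics, including SavedSpace
Get-DedupStatus -Volume M: | Format-List

# Logical Size vs. deduplicated DedupSize for a folder tree
Measure-DedupFileMetadata -Path M:\Shares | Format-List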

OK, so it didn't shrink things down to 2 GB like the teaser might have led you to believe, but shrinking 722 GB down to 484 GB is still pretty impressive: that's about a 33% savings. With a volume as large as this one (yup, that says 20 TB, but it's not real; we'll get to that in a later post), NTFS can no longer do file compression, since NTFS compression isn't possible on volumes with a cluster size larger than 4K. The new data deduplication, applied at the volume level, still makes decently efficient use of the space.
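If you're curious what cluster size one of your own volumes uses, here is a quick way to check (the drive letter is just an example):

# "Bytes Per Cluster" in the output is the cluster size
fsutil fsinfo ntfsinfo M: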

By now you are probably starting to see why it's suddenly important to start learning PowerShell, if you haven't already. The Windows GUI alone is no longer sufficient to properly administer newer versions of Windows. For the past several years PowerShell has been necessary for Exchange Server admins to get the most out of that product, and for managing Server Core editions of Windows Server at their own console; now newer features like Data Deduplication and Storage Pools require it to get the most out of them as well. Much like Linux and Cisco, Windows is headed into an age where those who understand its command line will be able to do much more, and do it much more efficiently, than those who only learn its graphical interface.


So how do you go about setting up deduplication?

For starters, you need a volume other than your OS boot volume (you can't dedup C:).

You can install it via the old-fashioned GUI:
Go into Server Manager
Click "Manage"
Click "Add Roles and Features"
Work your way down to "Server Roles"
Under "File and Storage Services", enable "Data Deduplication"


or, just open PowerShell and type:
Import-Module ServerManager
Add-WindowsFeature -Name FS-Data-Deduplication

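If you want to double-check that the feature actually landed, here is a quick sanity check (not required, just handy):

# Install State should read "Installed" for FS-Data-Deduplication
Get-WindowsFeature -Name FS-Data-Deduplication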
Enable it on a volume via the GUI:

In the Server Manager, select "File and Storage Services", and then "Volumes".
Right click on a volume, and select "Configure Data Deduplication"



or, via PowerShell:
Enable-DedupVolume M:

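Dedup does its work as a scheduled background job, so you won't see any savings the moment you enable it. If you'd rather kick off an optimization pass by hand and watch it run, something like this works; the drive letter is my example, and lowering MinimumFileAgeDays is optional (by default dedup skips recently modified files):

# Optional: also optimize brand-new files instead of waiting out the default file age
Set-DedupVolume -Volume M: -MinimumFileAgeDays 0

# Start an optimization job now rather than waiting for the schedule
Start-DedupJob -Volume M: -Type Optimization

# Watch the job, then check the savings once it finishes
Get-DedupJob
Get-DedupStatus -Volume M: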
In the next post I will get into the details of Storage Pools, a really neat feature of Windows 8 and Server 2012 that puts the old RAID approach to shame.

Tuesday, September 10, 2013

Why You Want Your Next File Server To Be Win2012 - Teaser


Ch-ch-ch-Changes...

There are going to be a few changes around here.

I realized that, try as I might, I have not been sticking to the "Practical Tech Tips and Reviews for Everyday Users" promise in my tag line. In fact, in the 8 years this blog has been going, I don't think I have ever written a review.

That tag line has limited what I feel is appropriate to post here, which has meant that some things I might like to post never get posted for fear of scaring away the "Everyday Users"... and yet what I do post is often too technical for that demographic anyway. This week marks a change in the focus of this blog. From now on, I'm dropping the tag line altogether. I will make no assumptions about who my audience is, and will post whatever I think is interesting or important enough to share.

I will still try not to just repeat what you can find elsewhere on the internet. Inevitably I will talk about subjects that are being discussed somewhere else, but I will weigh in with my own views, opinions, and experiences. Sometimes things might get more technical than you are used to seeing here, and other times I might just share fun stuff I've found.

I hope that most of my readers will stay. I am not writing this strictly to entertain one group of people; I am writing because I have some things I'd like to share, and I will share them with whoever is still listening.

The only visible change in the immediate future is that the logo at the top of the screen will lose the tag line. Over time, though, I expect the content will get more technical on average, but not overwhelmingly so for anyone who already considers themselves a techie or advanced user.

Friday, September 06, 2013

SysInternals

It's been a while since I last posted a true tech tip for everyday users.
This is nothing groundbreaking or way out in the realm of the really techie user. Anyone even moderately techie ought to already know about it, but if you are an enthusiast home user or future IT pro who hasn't yet run across Sysinternals (formerly Winternals), you are really missing out.

Sysinternals has long provided really useful free utilities for Windows that run with a small footprint and don't leave a lot of mess behind. You don't normally have to "install" them; you just run them.

BgInfo is a neat little tool that displays useful details about your computer in a corner of the desktop, so you have a quick reference for which version of Windows you are running, what CPU your computer has, how much memory it has, and other details a tech support agent might ask about.

Desktops lets you organize your applications across up to four virtual desktops so you can quickly switch between groups of them. Put your Excel spreadsheet and bank stuff on one virtual screen, and Facebook and Twitter on another...

If you give presentations often, or have a visual impairment, ZoomIt is worth a look: it's a screen zoom and annotation tool designed for technical presentations, but it's generally useful for anyone who wants to zoom in on something once in a while.

If you have ever run defrag and wondered how to defragment individual files that defrag can't seem to touch, Contig is there for exactly that purpose.
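For example, a run from a PowerShell (or cmd) prompt looks something like this; the file path is just a placeholder, and it's worth checking the switches against the current Contig documentation before you rely on them:

# Analyze fragmentation of a single file without changing anything
contig -a C:\VMs\bigdisk.vhd

# Defragment just that one file
contig C:\VMs\bigdisk.vhd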

For the advanced user, some of these tools are particularly useful for troubleshooting problems or investigating security issues. For example, if you need to know which process is making outgoing connections to an IP address you suspect is related to malware, TCPView can help you track down exactly which program is doing it. Process Monitor (which replaced two legacy Sysinternals utilities, Filemon and Regmon) can help you track a program's every move.

Some of the tools, like the PsTools suite and the Active Directory utilities, will only be of interest to pros working in an office environment.

Here is a quick, free video course for those interested in some of Sysinternals' more advanced tools (you need a Windows Live/Hotmail login to get there, but that's free too):
http://www.microsoftvirtualacademy.com/training-courses/utilizing-sysinternals-tools-for-it-pros