Getting PowerShell 6

PowerShell 5.1 is likely the last major update to the PowerShell built into Windows. The future of PowerShell has gone the open-source path: PowerShell 6 is available via GitHub, not just for Windows but also for multiple Linux distributions and macOS. This is possible because PowerShell 6, or rather PowerShell Core 6.0, is built on the cross-platform .NET Core instead of the Windows-only .NET Framework.

The good news is PowerShell 6 can be installed alongside the PowerShell that is part of Windows/WMF. Download and install it from https://github.com/PowerShell/PowerShell/releases. Once installed, launch it by running pwsh.exe. If you look at $PSVersionTable you will see the PSEdition is Core instead of the usual Desktop.
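A quick way to confirm which edition you are in:

```powershell
# In pwsh.exe (PowerShell Core):
$PSVersionTable.PSEdition   # Core

# In the built-in powershell.exe:
$PSVersionTable.PSEdition   # Desktop
```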

I recommend installing it and running it alongside the regular PowerShell to get used to it. The good news is most regular PowerShell will run, and if you execute Get-Module -ListAvailable you will see the built-in modules. For non-built-in modules you will need to check whether they are supported on PowerShell Core.

Read the Microsoft article at https://blogs.msdn.microsoft.com/powershell/2018/01/10/powershell-core-6-0-generally-available-ga-and-supported/ for a great overview; it walks through the key features that are not part of PowerShell Core 6 (e.g. workflows) in addition to other key considerations.

Tools like Visual Studio Code can be used with both PowerShell 5.1 and PowerShell Core 6.0. Simply change the Visual Studio Code user settings (File – Preferences – Settings) to point at pwsh (change the path to match your specific PowerShell version). I added the entry just under the existing user settings (the comma goes after the existing line in the file).
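For reference, the entry looked something like this (setting name from the PowerShell extension for VS Code; adjust the version number in the path to match your install):

```json
"powershell.powerShellExePath": "C:\\Program Files\\PowerShell\\6.0.1\\pwsh.exe"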


Happy PowerShelling!

Understand precedence with PowerShell

There are many ways to create functionality in PowerShell, including basic cmdlets, aliases, and functions. When you use combinations of them, it's important to understand the precedence. This is best understood by walking through a basic example.

First, just run:

get-process

This will result in processes being displayed as expected.

Now let's create a function called get-process that lists child items.

function get-process { Get-ChildItem }

Now if you run get-process it will show child items, so a function trumps the built-in cmdlet.

Now let’s create an alias so get-process points to get-service.

New-Alias get-process -Value get-service

Run get-process and it shows services, so an alias trumps a function (which trumps the native cmdlet).

Note that you can always force the cmdlet to run by using its module-qualified name:

microsoft.powershell.management\get-process

Once you've finished, you can reverse the changes by deleting the alias and the function one at a time.
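To clean up, remove the alias first (it has the highest precedence), then the function:

```powershell
Remove-Item Alias:\get-process      # alias gone; the function now wins
Remove-Item Function:\get-process   # function gone; the cmdlet wins again
Get-Process                         # back to the built-in cmdlet
```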

 

Email people via Office 365 from PowerShell when passwords about to expire

I have a demonstration environment where many users have accounts but never log on to AD directly nor look at their demonstration mailbox. They only use the environment via Azure AD, where they log on using the replicated password hash. Because of this they don't get password expiry notifications and can continue to log on; however, if they try to access something that hooks into AD rather than Azure AD, the logon fails.

They wanted to be emailed about upcoming password expiry at their real email address. To accomplish this, the real email address was stored in extensionAttribute10. I didn't use proxyAddresses as that may contain SIP information. This attribute can easily be set with:
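For example (the identity and address here are placeholders):

```powershell
# Store the user's real email address in extensionAttribute10
Set-ADUser -Identity john -Replace @{extensionAttribute10 = 'john@example.com'}
```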

I had a mailbox for a core process I use. That user has no other rights, so I placed the password in the script, but that's not ideal at all. If this were Azure Automation I could have used a credential object. I could at least have made the password harder to read by creating an encrypted version of it and storing that in the file (it's still reversible, just slightly harder to glance at!), e.g.
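One way to do that (a sketch; the file name and account are placeholders):

```powershell
# One-time: capture the mailbox password and store a DPAPI-protected copy
$Secure = Read-Host -AsSecureString            # type the mailbox password
$Secure | ConvertFrom-SecureString | Out-File .\mailpwd.txt

# In the script: rebuild the credential (by default this only works for the
# same user on the same machine that created the file)
$Password = Get-Content .\mailpwd.txt | ConvertTo-SecureString
$Cred = New-Object System.Management.Automation.PSCredential('mailbox@contoso.com', $Password)
```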

However, the account can't do anything except send email, and access to the script location was highly restricted, so I left the password as text, which was also easier to demonstrate below. In my own environment I used the alternate approach above just to make the password a little harder to read at a glance :-). Replace this with your own email and password.

The script looks for any password expiring in fewer than 10 days and emails a simple message. Customize it as you like! It has a basic HTML block with a placeholder (MESSAGEHOLDER) that is replaced with a custom string for each user.
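The script itself isn't reproduced here, but a minimal sketch of the approach described above might look like this (the SMTP settings match Office 365; the threshold, filter, and message are illustrative):

```powershell
# Sketch only: email users whose passwords expire within 10 days
$SmtpCred  = Get-Credential          # the Office 365 mailbox used to send
$Threshold = (Get-Date).AddDays(10)
$Body      = '<html><body><p>MESSAGEHOLDER</p></body></html>'

$Users = Get-ADUser -Filter {Enabled -eq $true -and PasswordNeverExpires -eq $false} `
    -Properties extensionAttribute10, msDS-UserPasswordExpiryTimeComputed

foreach ($User in $Users)
{
    $Expiry = [datetime]::FromFileTime($User.'msDS-UserPasswordExpiryTimeComputed')
    if ($Expiry -gt (Get-Date) -and $Expiry -lt $Threshold -and $User.extensionAttribute10)
    {
        $Message = "Hello $($User.Name), your password expires on $Expiry. Please change it soon."
        Send-MailMessage -To $User.extensionAttribute10 -From $SmtpCred.UserName `
            -Subject 'Password expiry warning' -Body ($Body -replace 'MESSAGEHOLDER', $Message) `
            -BodyAsHtml -SmtpServer 'smtp.office365.com' -Port 587 -UseSsl -Credential $SmtpCred
    }
}
```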

Have fun!

Add group members to another tenant via Azure AD B2B and PowerShell

I needed to add the members of a number of groups in one Azure AD tenant to a group in another Azure AD tenant, which would then be given access to a resource. The goal was that the added users would not have to redeem the invite, which is normally required when adding a B2B user. The first step was to invite a user via B2B the normal way; that user redeemed the invite and, in this case, was then made a global admin (another option would have been to enable guests to invite guests). The key point was that this user could invite people via B2B and could enumerate users in the invited Azure AD instance, which meant the invites would not have to be redeemed.

My first version of the script was very simple; however, I soon realized I would have to rerun it to add new users, so I enhanced it to extract the current members of the group and convert them back to regular email format (when invited to Azure AD, a user's @ is replaced with _ and the result is put in a string with various components separated by a #). The script therefore extracts the first part, converts the _ back to a @, and then only invites people who are not already members.

In the script below replace the group names, Azure AD names and IDs to meet your requirements.
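A sketch of the enhanced approach using the AzureAD module (the group name, tenant ID, and $SourceMembers list are placeholders; build $SourceMembers from your source-tenant groups):

```powershell
# Sketch only: connect to the target tenant as the user allowed to invite guests
Connect-AzureAD -TenantId '<target-tenant-id>'
$TargetGroup = Get-AzureADGroup -SearchString 'Resource Access Group'

# Current guest members, converted back to regular email form.
# Guest UPNs look like john_contoso.com#EXT#@fabrikam.onmicrosoft.com
$Existing = Get-AzureADGroupMember -ObjectId $TargetGroup.ObjectId -All $true |
    ForEach-Object {
        $Prefix = ($_.UserPrincipalName -split '#')[0]
        $Index  = $Prefix.LastIndexOf('_')        # put the @ back in place of the last _
        $Prefix.Remove($Index, 1).Insert($Index, '@')
    }

foreach ($Email in $SourceMembers)                # gathered from the source tenant groups
{
    if ($Existing -notcontains $Email)
    {
        $Invite = New-AzureADMSInvitation -InvitedUserEmailAddress $Email `
            -SendInvitationMessage $false -InviteRedirectUrl 'https://myapps.microsoft.com'
        Add-AzureADGroupMember -ObjectId $TargetGroup.ObjectId -RefObjectId $Invite.InvitedUser.Id
    }
}
```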

 

 

Using the Azure PS Drive

If you use the Azure Cloud Shell in the Azure portal, it's a very convenient way to manage Azure resources using PowerShell and the CLI, but you may have also noticed an actual Azure drive: Set-Location Azure: lets you navigate around your Azure resources (this is actually the default location when the Cloud Shell opens). At the top level are subscriptions, and you can then navigate to resource groups, VMs, WebApps, and more.

The Azure drive is provided via the Simple Hierarchy in PowerShell (SHiPS) provider which you can see via Get-PSProvider.

The actual functionality is still evolving (it's a project on GitHub at https://github.com/PowerShell/SHiPS), but this also means you can run the same provider outside of the Azure Cloud Shell.

You need to ensure you are running the latest version of the AzureRM module, then download and install the provider, add an Azure account, and create the drive:
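Something along these lines (module names per the AzurePSDrive project on the PowerShell Gallery; verify against the current README):

```powershell
Install-Module AzureRM -Force                  # ensure the latest AzureRM module
Install-Module AzurePSDrive -Scope CurrentUser # the SHiPS-based Azure provider
Import-Module AzurePSDrive
Login-AzureRmAccount                           # add your Azure account
New-PSDrive -Name Azure -PSProvider SHiPS -Root 'AzurePSDrive#Azure'
Set-Location Azure:
```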

You can now navigate to Azure: and enjoy the same features as in the Azure Cloud Shell.

Note this is completely different from the Azure Cloud Drive, which is the persistent file storage in the Azure Cloud Shell, backed by Azure Files, that enables data to be saved and used between sessions. Use Get-CloudDrive to see the current configuration; if you wish to change it, simply run Dismount-CloudDrive, restart the shell, and select Advanced options to customize the location.

A whole bunch of stuff with Service Manager

It started out very simply. I was preparing for a client that wanted to allow business groups to schedule the deployment of patches to their servers via Configuration Manager, using Orchestrator and Service Manager with an approval workflow. This seemed the right approach, and I decided to quickly set up a little example of what it would look like. Instead of using Configuration Manager, though, I decided simply to run any PowerShell command on the computers in a passed Active Directory group (which contained the computers for that business group). The final solution is walked through at http://youtu.be/RFLhyMhJOOc and I have a bunch of assets listed below:

The example runbook is available at http://www.savilltech.com/videos/RunActivity.ois_export

The example PowerShell is available at http://www.savilltech.com/videos/RunCommand.ps1

The example email approval template is available at http://www.savilltech.com/videos/RequestApproval.txt

I wanted to briefly go over the key steps here that are walked through in the video:

Firstly, I decided to implement the actual logic of finding all the machines in the passed AD group in PowerShell, and to perform the actions within that PowerShell. You could have done this with Service Manager activities, but the direction is PowerShell, so I implemented the main logic in PowerShell and used Orchestrator activities for interacting with Service Manager, such as updating the Service Request status. The basic PowerShell is below:

$ADGroupName = "FileServicesGroup"
$CommandToRun = "mkdir c:\johntest"

$ScriptBlockContent = {
    param ($CommandToRun)
    Invoke-Expression $CommandToRun
}

$ADGroupMembers = Get-ADGroupMember -Identity $ADGroupName
foreach ($computer in $ADGroupMembers)
{
    #Run the required command on each computer in the group
    Invoke-Command -ComputerName $computer.name -ScriptBlock $ScriptBlockContent -ArgumentList $CommandToRun
}

In this basic PowerShell I hardcode the AD group and the command to run, but once it gets into Orchestrator I'll replace those with the information sent from Service Manager. Note that I pass the command to the scriptblock as a parameter, since the actual variable is local to my PowerShell session and NOT to the session created remotely on each machine.

I then create a runbook in Orchestrator which essentially uses this PowerShell which also hooks into Service Manager to update the Service Request status.

In Service Manager I synchronize with Orchestrator to get the runbook into the library, then create a Runbook Automation Activity template and a new Service Request template with two activities. The first is a review activity for the submitting user's line manager (as defined in Active Directory); the second calls the Runbook Automation Activity template and defines the Runbook ID (RBId) and the Service Request ID (SRId), which is the Work Item – ID for each of the respective values. This is shown below.

[Screenshot: ID mapping in the SR template for the runbook activity]

Because this all runs via Orchestrator, it runs as the Orchestrator service account, so I made the Orchestrator service account a local administrator on each server using the Group Policy Preferences local group feature. Another option would have been to specify a credential as part of the Invoke-Command in the PowerShell, but I liked the Group Policy option.
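The credential alternative mentioned above would look something like this (illustrative; the credential source is up to you):

```powershell
# Alternative to granting the service account local admin rights everywhere:
# pass an explicit credential to the remote call
$Cred = Get-Credential
Invoke-Command -ComputerName $computer.name -Credential $Cred `
    -ScriptBlock $ScriptBlockContent -ArgumentList $CommandToRun
```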

[Screenshot: Group Policy Preferences local group configuration]

I wanted to use Office 365 for notifications; I walk through the basic steps in an FAQ at http://windowsitpro.com/service-manager/connect-service-manager-office-365

I created email templates for the review activity and for notification that the Service Request is complete, then configured them to be used as part of the workflows.

I created Request Offerings and Service Offerings. The key point is that for end users to see them, I created new catalog groups, one for Request Offerings and one for Service Offerings, that dynamically added all published offerings of each type. I then created a new User Role for all Domain Users that included both of those new catalog groups.

I then walk through it all!

AGAIN DO NOT DEPLOY THIS IN PRODUCTION.

THE ABILITY FOR ANY USER TO RUN ANY COMMAND ON EVERY SERVER IN A PASSED GROUP IS VERY BAD. THIS WAS AN EXAMPLE ONLY !!!!

This is an example only and you should modify the actions performed in Orchestrator to your own specific needs.

And the video:

New Storage Spaces in Windows Server 2012 R2 Video

In the video below I whiteboard the new features of Windows Server 2012 R2 Storage Spaces and then perform a full demo using PowerShell in my lab. http://youtu.be/x8KlY-aP9oE

Below is all the code I use.
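The full demo code isn't reproduced here, but a minimal sketch of the 2012 R2 features shown (tiered spaces and write-back cache; all names and sizes are placeholders) might look like:

```powershell
# Pool all available disks, then build a tiered mirror space with a write-back cache
$Disks = Get-PhysicalDisk -CanPool $true
New-StoragePool -FriendlyName DemoPool -StorageSubSystemFriendlyName '*Storage*' `
    -PhysicalDisks $Disks

$SSDTier = New-StorageTier -StoragePoolFriendlyName DemoPool -FriendlyName SSDTier -MediaType SSD
$HDDTier = New-StorageTier -StoragePoolFriendlyName DemoPool -FriendlyName HDDTier -MediaType HDD

# Tiering and -WriteCacheSize are both new in Windows Server 2012 R2
New-VirtualDisk -StoragePoolFriendlyName DemoPool -FriendlyName DemoSpace `
    -StorageTiers $SSDTier, $HDDTier -StorageTierSizes 8GB, 32GB `
    -ResiliencySettingName Mirror -WriteCacheSize 1GB
```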

New video on the new features of PowerShell 3.0

I sat down this morning and created a new video on the major new features of PowerShell 3.0. It's uploaded to YouTube and available at http://youtu.be/aQgptbNOBBo. Below is the main code I use throughout the video so you can try it for yourself!

#region CIM
Get-Command -Module CimCmdlets
Get-CimClass -ClassName *disk*
Get-CimClass win32* -MethodName Term*
Get-CimInstance Win32_Process
#endregion

#region Simplification example
Get-Process | where {$_.HandleCount -gt 900}
Get-Process | where {$psitem.HandleCount -gt 900}
Get-Process | where HandleCount -gt 900
#endregion

#region Robust sessions
$RemoteSession = New-PSSession -Name Server1Session -ComputerName savdaldc01
Invoke-Command -Session $RemoteSession -ScriptBlock {$date = Get-Date}
Disconnect-PSSession -Session $RemoteSession

#This would then be run on the savdaldc01 machine locally showing state not lost
Get-PSSession -ComputerName localhost
$LocalSession = Connect-PSSession -ComputerName localhost -Name Server1Session
Invoke-Command -Session $LocalSession -ScriptBlock { $date }
Get-PSSession -ComputerName localhost | Remove-PSSession
#endregion

#region Workflow
Workflow MyWorkflow {Write-Output -InputObject "Hello from Workflow!"}
Get-Command -Name MyWorkflow -Syntax
MyWorkflow

Workflow LongWorkflow
{
    Write-Output -InputObject "Loading some information..."
    Start-Sleep -Seconds 10
    Checkpoint-Workflow
    Write-Output -InputObject "Performing some action..."
    Start-Sleep -Seconds 10
    Checkpoint-Workflow
    Write-Output -InputObject "Cleaning up..."
    Start-Sleep -Seconds 10
}

LongWorkflow -AsJob -JobName LongWF
Suspend-Job LongWF
Get-Job LongWF
Receive-Job LongWF -Keep
Resume-Job LongWF
Get-Job LongWF
Receive-Job LongWF -Keep
Remove-Job LongWF
#endregion

#region Background job
$Trigger = New-JobTrigger -Daily -At 2am
Register-ScheduledJob -Name MyScheduledJob -ScriptBlock {Get-Process} -Trigger $Trigger
Get-ScheduledJob
(Get-ScheduledJob -Name MyScheduledJob).JobTriggers
Get-ScheduledJob -Name MyScheduledJob
Unregister-ScheduledJob -Name MyScheduledJob
#endregion

#region Misc commands
Get-Command *disk*
Get-Command Get-Disk
Show-Command Get-Disk
Get-Command | Sort-Object Module | Out-GridView
#endregion