
DSC Resource Kit Release May 2019


We just released the DSC Resource Kit! This release includes updates to 14 DSC resource modules. In the past 6 weeks, 87 pull requests have been merged and 36 issues have been closed, all thanks to our amazing community!

The modules updated in this release are:

  • ActiveDirectoryCSDsc
  • CertificateDsc
  • ComputerManagementDsc
  • NetworkingDsc
  • OfficeOnlineServerDsc
  • PSDscResources
  • SharePointDsc
  • SqlServerDsc
  • StorageDsc
  • xActiveDirectory
  • xDnsServer
  • xFirefox
  • xPSDesiredStateConfiguration
  • xWebAdministration

For a detailed list of the resource modules and fixes in this release, see the Included in this Release section below.

Our latest community call for the DSC Resource Kit was last Wednesday, May 8. A recording of the call is available here. You can join us for the next call at 12PM (Pacific time) on June 19 to ask questions and give feedback about your experience with the DSC Resource Kit.

The next DSC Resource Kit release will be on Wednesday, June 26.

We strongly encourage you to update to the newest version of all modules using the PowerShell Gallery, and don’t forget to give us your feedback in the comments below, on GitHub, or on Twitter (@PowerShell_Team)!
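If you would like to update everything from this release in one pass, here is a minimal sketch (the module list is copied from above; whether each module is updated or freshly installed depends on what is already on your system):

# Update (or install) every module from this DSC Resource Kit release
$modules = @(
    'ActiveDirectoryCSDsc', 'CertificateDsc', 'ComputerManagementDsc', 'NetworkingDsc',
    'OfficeOnlineServerDsc', 'PSDscResources', 'SharePointDsc', 'SqlServerDsc',
    'StorageDsc', 'xActiveDirectory', 'xDnsServer', 'xFirefox',
    'xPSDesiredStateConfiguration', 'xWebAdministration'
)
foreach ($name in $modules) {
    if (Get-Module -ListAvailable -Name $name) {
        Update-Module -Name $name       # already installed: pull the newest Gallery version
    }
    else {
        Install-Module -Name $name      # not installed yet
    }
}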

Please see our documentation here for information on the support of these resource modules.

Included in this Release

You can see a detailed summary of all changes included in this release in the table below. For past release notes, go to the README.md or CHANGELOG.md file on the GitHub repository page for a specific module (see the How to Find DSC Resource Modules on GitHub section below for details on finding the GitHub page for a specific module).

Module Name Version Release Notes
ActiveDirectoryCSDsc 3.3.0.0
  • Remove reference to StorageDsc in README.md – fixes Issue 76.
  • Combined all ActiveDirectoryCSDsc.ResourceHelper module functions into ActiveDirectoryCSDsc.Common module and renamed to ActiveDirectoryCSDsc.CommonHelper module.
  • Opted into Common Tests “Common Tests – Validate Localization” – fixes Issue 82.
CertificateDsc 4.6.0.0
  • CertReq:
    • Added Compare-CertificateIssuer function to check whether the Certificate Issuer matches the CA Root Name.
    • Changed Compare-CertificateSubject function to return false if ReferenceSubject is null.
    • Fixed exception when Certificate with empty Subject exists in Certificate Store – fixes Issue 190.
    • Fixed bug matching existing certificate when Subject Alternate Name is specified and machine language is not en-US – fixes Issue 193.
    • Fixed bug matching existing certificate when Template Name is specified and machine language is not en-US – fixes Issue 193.
    • Changed Import-CertificateEx function to use X509Certificate2Collection instead of X509Certificate2 to support importing certificate chains
ComputerManagementDsc 6.4.0.0
  • ScheduledTask:
    • Fixed IdleWaitTimeout being returned as null from Get-TargetResource – fixes Issue 186.
    • Added BuiltInAccount property to allow running the task as one of the built-in service accounts – fixes Issue 130.
  • Refactored module folder structure to move resource to root folder of repository and remove test harness – fixes Issue 188.
  • Added a CODE_OF_CONDUCT.md with the same content as in the README.md and linked to it from README.md instead.
  • Updated test header for all unit tests to version 1.2.4.
  • Updated test header for all integration tests to version 1.3.3.
  • Enabled example publish to PowerShell Gallery by adding gallery_api environment variable to AppVeyor.yml.
NetworkingDsc 7.2.0.0
  • NetAdapterAdvancedProperty:
    • Added support for RegistryKeyword MaxRxRing1Length and NumRxBuffersSmall – fixes Issue 387.
  • Firewall:
    • Prevent “Parameter set cannot be resolved using the specified named parameters” error when updating rule when group name is specified – fixes Issue 130 and Issue 191.
  • Opted into Common Tests “Common Tests – Validate Localization” – fixes Issue 393.
  • Combined all NetworkingDsc.ResourceHelper module functions into NetworkingDsc.Common module – fixes Issue 394.
  • Renamed all localization strings so that they are detected by “Common Tests – Validate Localization”.
  • Fixed issues with mismatched localization strings.
  • Updated all common functions with the latest versions from DSCResource.Template.
  • Fixed an issue with the helper function Test-IsNanoServer that prevented it from working. Because the helper function was not actually in use, the issue was not caught until unit tests were added.
  • Corrected style violations in NetworkingDsc.Common.
OfficeOnlineServerDsc 1.4.0.0
  • OfficeOnlineServerInstall
    • Updated resource to make sure the Windows Environment variables are loaded into the PowerShell session.
  • OfficeOnlineServerMachine
    • Updated resource to make sure the Windows Environment variables are loaded into the PowerShell session.
  • Created LICENSE file to match the Microsoft Open Source Team standard.
PSDscResources 2.11.0.0
  • Fix Custom DSC Resource Kit PSSA Rule Failures
SharePointDsc 3.4.0.0
  • SPDistributedCacheClientSettings
    • Added 15 new SharePoint 2016 parameters.
  • SPFarm
    • Implemented Null check in Get method to prevent errors
    • Add support to provision Central Administration on HTTPS
  • SPInfoPathFormsServiceConfig
    • Added the AllowEventPropagation parameter.
  • SPInstall
    • Improved logging output
    • Updated blocked setup file check to prevent errors when BinaryDir is a CD-ROM drive or mounted ISO
  • SPInstallLanguagePack
    • Improved logging output
    • Updated blocked setup file check to prevent errors when BinaryDir is a CD-ROM drive or mounted ISO
  • SPInstallPrereqs
    • Improved logging output
    • Added a check to unblock the setup file if it is blocked because it comes from a network location. This prevents an endless wait.
    • Added the ability to install from a UNC path by adding the server to the IE Local Intranet Zone. This prevents an endless wait caused by a security warning.
    • Fixed an issue that caused the resource to fail its test even when the prerequisites had been installed successfully on Windows Server 2019
  • SPManagedMetadataServiceApp
    • Fixed issue where Get-TargetResource method throws an error when the service app proxy does not exist and no proxy name is specified.
  • SPProductUpdate
    • Improved logging output
    • Updated blocked setup file check to prevent errors when SetupFile is a CD-ROM drive or mounted ISO
  • SPSearchContentSource
    • Removed check that prevents configuring an incremental schedule when using continuous crawl.
  • SPSitePropertyBag
    • Fixed issue where properties were set on the wrong level.
  • SPSubscriptionSettingsServiceApp
    • Fixed issue where the service app proxy is not created if it was skipped during the initial deployment.
  • SPTrustedRootAuthority
    • Added possibility to get certificate from file.
SqlServerDsc 12.5.0.0
  • Changes to SqlServerSecureConnection
    • Updated README and added example for SqlServerSecureConnection, instructing users to use the “SYSTEM” service account instead of “LocalSystem”.
  • Changes to SqlScript
    • Correctly passes the $VerbosePreference to the helper function Invoke-SqlScript so that PRINT statements are output correctly when verbose output is requested, e.g. Start-DscConfiguration -Verbose.
    • Added en-US localization (issue 624).
    • Added additional unit tests for code coverage.
  • Changes to SqlScriptQuery
    • Correctly passes the $VerbosePreference to the helper function Invoke-SqlScript so that PRINT statements are output correctly when verbose output is requested, e.g. Start-DscConfiguration -Verbose.
    • Added en-US localization.
    • Added additional unit tests for code coverage.
  • Changes to SqlSetup
    • Concatenated Robocopy localization strings (issue 694).
    • Made the error message more descriptive when the Set-TargetResource function calls the Test-TargetResource function to verify the desired state.
  • Changes to SqlWaitForAG
  • Changes to SqlServerPermission
  • Changes to SqlServerMemory
    • Added en-US localization (issue 617).
    • No longer will the resource set the MinMemory value if it was provided in a configuration that also set the Ensure parameter to “Absent” (issue 1329).
    • Refactored unit tests to simplify them and add slightly more code coverage.
  • Changes to SqlServerMaxDop
  • Changes to SqlRS
    • Reporting Services are restarted after changing settings, unless $SuppressRestart parameter is set (issue 1331). $SuppressRestart will also prevent Reporting Services restart after initialization.
    • Fixed one of the error handling to use localization, and made the error message more descriptive when the Set-TargetResource function calls the Test-TargetResource function to verify the desired state. This was done prior to adding full en-US localization.
    • Fixed (issue 1258). When initializing Reporting Services, there is no need to execute InitializeReportServer CIM method, since executing SetDatabaseConnection CIM method initializes Reporting Services.
    • SqlRS can now initialize SSRS 2017 instances – fixes issue 864.
  • Changes to SqlServerLogin
    • Added en-US localization (issue 615).
    • Added unit tests to improve code coverage.
  • Changes to SqlWindowsFirewall
  • Changes to SqlServerEndpoint
  • Changes to SqlServerEndpointPermission
  • Changes to SqlServerEndpointState
  • Changes to SqlDatabaseRole
  • Changes to SqlDatabaseRecoveryModel
  • Changes to SqlDatabasePermission
  • Changes to SqlDatabaseOwner
  • Changes to SqlDatabase
  • Changes to SqlAGListener
  • Changes to SqlAlwaysOnService
  • Changes to SqlAlias
    • Added en-US localization (issue 602).
    • Removed ShouldProcess for the code, since it has no purpose in a DSC resource (issue 242).
  • Changes to SqlServerReplication
    • Added en-US localization (issue 620).
    • Refactored Get-TargetResource slightly so it provides better verbose messages.
StorageDsc 4.7.0.0
  • DiskAccessPath:
    • Added a Get-Partition call to properly handle setting the NoDefaultDriveLetter parameter – fixes Issue 198.
xActiveDirectory 2.26.0.0
  • Changes to xActiveDirectory
    • Added localization module DscResource.LocalizationHelper containing the helper functions Get-LocalizedData, New-InvalidArgumentException, New-InvalidOperationException, New-ObjectNotFoundException, and New-InvalidResultException (issue 257). For more information around these helper functions and localization in resources, see the Localization section in the Style Guideline.
    • Added common module DscResource.Common containing the helper function Test-DscParameterState. The goal is that all resource common functions are moved to this module (functions that are or can be used by more than one resource) (issue 257).
    • Added xADManagedServiceAccount resource to manage Managed Service Accounts (MSAs). Andrew Wickham (@awickham10) and @kungfu71186
    • Removed the Misc folder, as it is no longer required.
    • Added xADKDSKey resource to create KDS Root Keys for gMSAs. @kungfu71186
    • Combined DscResource.LocalizationHelper and DscResource.Common Modules into xActiveDirectory.Common
  • Changes to xADReplicationSiteLink
    • Make use of the new localization helper functions.
  • Changes to xAdDomainController
    • Added new parameter to disable or enable the Global Catalog (GC) (issue 75). Eric Foskett @Merto410
    • Fixed a bug with the parameter InstallationMediaPath that it would not be added if it was specified in a configuration. Now the parameter InstallationMediaPath is correctly passed to Install-ADDSDomainController.
    • Refactored the resource with major code cleanup and localization.
    • Updated unit tests to latest unit test template and refactored the tests for the function “Set-TargetResource”.
    • Improved test code coverage.
  • Changes to xADComputer
    • Restoring a computer account from the recycle bin no longer fails if there is more than one object with the same name in the recycle bin. Now it uses the object that was changed last using the property whenChanged (issue 271).
  • Changes to xADGroup
    • Restoring a group from the recycle bin no longer fails if there is more than one object with the same name in the recycle bin. Now it uses the object that was changed last using the property whenChanged (issue 271).
  • Changes to xADOrganizationalUnit
    • Restoring an organizational unit from the recycle bin no longer fails if there is more than one object with the same name in the recycle bin. Now it uses the object that was changed last using the property whenChanged (issue 271).
  • Changes to xADUser
    • Restoring a user from the recycle bin no longer fails if there is more than one object with the same name in the recycle bin. Now it uses the object that was changed last using the property whenChanged (issue 271).
xDnsServer 1.12.0.0
  • Update appveyor.yml to use the default template.
  • Added default template files .codecov.yml, .gitattributes, and .gitignore, and .vscode folder.
  • Added UseRootHint property to xDnsServerForwarder resource.
xFirefox 1.3.0.0
  • Update appveyor.yml to use the default template.
  • Added default template files .codecov.yml, .gitattributes, and .gitignore, and .vscode folder.
  • The module manifest now contains the correct PowerShell version.
  • Added xFirefoxPreference Resource to automate Firefox Preference Configuration
xPSDesiredStateConfiguration 8.7.0.0
  • MSFT_xWindowsProcess:
    • Fixes issue where a process will fail to be created if a $Path is passed that contains one or more spaces, and the resource is using $Credentials.
    • Fixes issue where a process will fail to be created if $Arguments are passed that contain one or more spaces (with or without credentials).
    • Fixed Integration tests failing if empty Arguments are passed – fixes issue 605.
    • Heavily refactors MSFT_xWindowsProcess.Integration.Tests.ps1 and adds more Path and Arguments related test cases.
    • Removes reliance on test file WindowsProcessTestProcess.
  • Fixes test failures in xWindowsOptionalFeatureSet.Integration.Tests.ps1 due to accessing the windowsOptionalFeatureName variable before it is assigned – fixes issue 612.
  • MSFT_xDSCWebService
    • Fixes issue 536 and starts the deprecation process for configuring a windows firewall (exception) rule using xDSCWebService
    • Fixes issue 463 and fixes some bugs introduced with the new firewall rule handling
xWebAdministration 2.6.0.0
  • Changed order of classes in schema.mof files to work around issue 423.
  • Fixed subject comparison for multiple entries in helper function Find-Certificate, which could not find the test helper function Install-NewSelfSignedCertificateExScript.
  • Updated unit test for helper function Find-Certificate to check for multiple subject names in different orders.

How to Find Released DSC Resource Modules

To see a list of all released DSC Resource Kit modules, go to the PowerShell Gallery and display all modules tagged as DSCResourceKit. You can also enter a module’s name in the search box in the upper right corner of the PowerShell Gallery to find a specific module.

Of course, you can also always use PowerShellGet (available starting in WMF 5.0) to find modules with DSC Resources:

# To list all modules tagged as DSCResourceKit
Find-Module -Tag DSCResourceKit
# To list all DSC resources from all sources
Find-DscResource

Please note only those modules released by the PowerShell Team are currently considered part of the ‘DSC Resource Kit’ regardless of the presence of the ‘DSC Resource Kit’ tag in the PowerShell Gallery.

To find a specific module, go directly to its URL on the PowerShell Gallery:
http://www.powershellgallery.com/packages/< module name >
For example:
http://www.powershellgallery.com/packages/xWebAdministration

How to Install DSC Resource Modules From the PowerShell Gallery

We recommend that you use PowerShellGet to install DSC resource modules:

Install-Module -Name < module name >

For example:

Install-Module -Name xWebAdministration

To update all previously installed modules at once, open an elevated PowerShell prompt and use this command:

Update-Module

After installing modules, you can discover all DSC resources available to your local system with this command:

Get-DscResource
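
To narrow the list to a single module, Get-DscResource also accepts a module name. For example:

# List only the DSC resources shipped in xWebAdministration
Get-DscResource -Module xWebAdministration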

How to Find DSC Resource Modules on GitHub

All resource modules in the DSC Resource Kit are available open-source on GitHub.
You can see the most recent state of a resource module by visiting its GitHub page at:
https://github.com/PowerShell/< module name >
For example, for the CertificateDsc module, go to:
https://github.com/PowerShell/CertificateDsc.

All DSC modules are also listed as submodules of the DscResources repository in the DscResources folder and the xDscResources folder.

How to Contribute

You are more than welcome to contribute to the development of the DSC Resource Kit! There are several different ways you can help. You can create new DSC resources or modules, add test automation, improve documentation, fix existing issues, or open new ones.
See our contributing guide for more info on how to become a DSC Resource Kit contributor.

If you would like to help, please take a look at the list of open issues for the DscResources repository.
You can also check issues for specific resource modules by going to:
https://github.com/PowerShell/< module name >/issues
For example:
https://github.com/PowerShell/xPSDesiredStateConfiguration/issues

Your help in developing the DSC Resource Kit is invaluable to us!

Questions, comments?

If you’re looking into using PowerShell DSC, have questions or issues with a current resource, or would like a new resource, let us know in the comments below, on Twitter (@PowerShell_Team), or by creating an issue on GitHub.

Katie Kragenbrink
Software Engineer
PowerShell DSC Team
@katiedsc (Twitter)
@kwirkykat (GitHub)

The post DSC Resource Kit Release May 2019 appeared first on PowerShell.


PowerShell 7 Roadmap


Last month we announced that PowerShell 7 will be the next release of PowerShell.

Here I will provide more details of areas we’ll be investing in for the PowerShell 7 release.

When will I get it?!

Today, we’re releasing our first preview of PowerShell 7. Keeping with our monthly cadence, expect new preview releases approximately every month.

This first preview contains some of the changes that didn’t make it in time for the 6.2 GA release, and marks our move to .NET Core 3.0. For more details on what’s new, check out our changelog on GitHub.

As mentioned in the PowerShell 7 announcement blog, we will be changing the support life-cycle to align with .NET Core. This means that we expect PowerShell 7 to be generally available (GA) about a month after .NET Core 3.0 GA.

.NET Core 3.0

The biggest immediate change is moving to .NET Core 3.0 (from .NET Core 2.1). Not only are there significant performance improvements, but many new APIs are available including WPF and WinForms (Windows only, though!).

This means that (eventually) we can bring back Out-GridView.

Windows Compatibility

A big focus of PowerShell 7 is making it a viable replacement for Windows PowerShell 5.1. This means it must have near parity with Windows PowerShell in terms of compatibility with modules that ship with Windows.

The PowerShell Team will be working with Windows teams to validate and update their modules to work with PowerShell 7. This also means that to use PowerShell 7 with the breadth of Windows PowerShell modules, you will need to be using the latest builds of Windows 10 (and equivalent Windows Server).

Feature Investigations

We are looking at investing in three specific feature areas. Expect RFCs (Request for Comments specifications) to be published on how we intend to implement these features and the scope of the problems we intend to solve. Feedback is greatly appreciated!

Simplify Secure Credentials Management

Whether you are using PowerShell to automate resources in the cloud, local resources on premises, or a hybrid of the two, you will generally need different credentials to access different resources.

The best practice is to never put credentials within your script, so we intend to introduce a way to securely use credentials from a local or remote credential store.
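
Until that feature ships, one common interim pattern (a sketch, not the planned feature; the file path and computer name are illustrative, and the encryption is Windows-only DPAPI) is to export a credential encrypted for the current user and re-import it at run time:

# Save a credential once; Export-Clixml encrypts the password for the current user (Windows only)
Get-Credential | Export-Clixml -Path "$HOME\deploy.cred.xml"

# Later, a script loads the credential instead of embedding a password
$credential = Import-Clixml -Path "$HOME\deploy.cred.xml"
Invoke-Command -ComputerName server01 -Credential $credential -ScriptBlock { hostname }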

Logging Off the Box

Part of the security of PowerShell is that it can log everything. However, logging today is purely local onto the machine. For each OS, there are ways to forward those events to a remote system, but it requires different configurations per OS.

We want to introduce a way to easily configure PowerShell through policy to automatically send the logs to a remote target regardless of the OS.

New Version Notification

Looking at our PowerBI dashboard, we see a large number of instances using older versions (some of which are no longer supported). It is important to inform our customers when a newer version is available that may have security fixes, so we need a polite way to notify the user that a newer version is available.

There is already an RFC published for this feature. Please take a look and provide us feedback!

GitHub Issues

We have a number of GitHub issues marked for consideration for PowerShell 7. Although we would like to fix everything, we do have limited resources, so not every issue marked for consideration will be fixed.

Popular Requested Features

There are a number of requested features we’d like to address in PowerShell 7. Some of these may show up as experimental features so that we can get feedback before we lock in the design.

This list is larger than what we will be able to fix, but these are the ones we’d like to investigate:

Other Investments

My team will also be involved in some other related investments during the time frame of working on PowerShell 7.

PowerShell in Azure Functions to general availability

A few weeks ago, Joey Aiello announced the Public Preview of PowerShell in Azure Functions 2.0. As we get feedback, my team will continue to work with the Azure Functions team to address that feedback and eventually move from public preview to being generally available.

PSReadLine 2.0

Jason Shirk has done a great job with PSReadLine. As part of Windows PowerShell 5, we decided to make it the default interactive shell experience. As a side project, Jason has made many improvements, but the project is bigger than one person. We’ve agreed to move the PSReadLine project to the PowerShell team where we can have some dedicated resources to get the 2.0.0 release to GA.

PowerShell Editor Services / Visual Studio Code PowerShell extension

We will continue to make progress on the PowerShell Editor Services 2.0 release, improving reliability, performance, and PSReadLine integration.

PSScriptAnalyzer

As part of improving performance for PSEditorServices, we need to make PSScriptAnalyzer 2.0 hostable so that PS Editor Services can simply call an API rather than invoking PSScriptAnalyzer through a PowerShell runspace.

Summary

As you can see, we have a ton of work ahead of us. Not everything will make it in the same time frame of PowerShell 7 and some of the popular requested features may have to wait for PowerShell 7.1, but the more feedback you give us, the more we can be sure we’re doing the right thing to help you succeed.

Thanks!

Steve Lee
Principal Software Engineering Manager
PowerShell Team
https://twitter.com/Steve_MSFT

The post PowerShell 7 Roadmap appeared first on PowerShell.

Using PowerShellGet with Azure Artifacts


We have improved the experience with PowerShellGet and private NuGet feeds by focusing on pain points with Azure Artifacts feeds.
We addressed these pain points by enabling and documenting the following features:

  • Non-PAT authentication for package management
  • Credential persistence in Register-PSRepository

These improvements will affect the following cmdlets:

  • Register-PSRepository
  • Set-PSRepository
  • Find-Module/Script
  • Install-Module/Script
  • Update-Module/Script
  • Save-Module/Script
  • Publish-Module/Script

What is Azure Artifacts and Why would I use it?

Azure Artifacts is an Azure DevOps service which introduces the concept of multiple feeds that you can use to organize and control access to your packages. In other words, it is a place for storing and sharing packages with controlled access through Azure DevOps. A common usage scenario for Azure Artifacts with PowerShellGet is organizations that need a controlled-access feed for sharing their private internal packages and vetted external packages within the organization. Package owners may also want to use Azure Artifacts as part of their CI/CD pipeline in Azure DevOps. For more information on Azure Artifacts, check out their documentation.

Getting started with Azure Artifacts with PowerShellGet

Since these fixes were introduced in PackageManagement, verify that you have at least version 1.4.2 of the PackageManagement module by running Get-InstalledModule PackageManagement. If you do not have version 1.4.2 or higher, run Update-Module PackageManagement and then refresh your PowerShell session. You should also ensure you have an up-to-date version of PowerShellGet.
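
Putting those checks together (the 1.4.2 version number comes from the paragraph above):

# Check which version of PackageManagement is installed
Get-InstalledModule PackageManagement

# If it is lower than 1.4.2, update it (and keep PowerShellGet current too), then restart the session
Update-Module PackageManagement
Update-Module PowerShellGet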

The next step is to create an Azure Artifacts feed, since Azure Artifacts is an Azure DevOps service you will need to create an Azure DevOps account if you don’t already have one.

Once you have an account, you can create an Azure Artifacts feed. To do this, follow these steps. Return here for instructions on how to connect to the feed and publish packages.

The other component you will need is the Azure Artifacts credential provider. The credential provider comes pre-installed with Visual Studio, so if you have VS 15.9, you don’t need to install anything. Otherwise, the steps for installing the credential provider, which are platform dependent, are provided here.

To register your feed as a PSRepository, you will need a name, a source location, and a publish location. The name is what you will call the PSRepository and can be anything you choose; in this example we call it “myAzArtifactsRepo”. Your source location and publish location will be the same URI, in the format: https://pkgs.dev.azure.com/'yourorganizationname'/_packaging/'yourfeedname'/nuget/v2.

You are now ready to register your Az DevOps feed as a PSRepository using the following command:

Register-PSRepository -Name myAzArtifactsRepo -SourceLocation "https://pkgs.dev.azure.com/'yourorganizationname'/_packaging/'yourfeedname'/nuget/v2" -PublishLocation "https://pkgs.dev.azure.com/'yourorganizationname'/_packaging/'yourfeedname'/nuget/v2"

When you run this command, you will be prompted with a device flow URL which will allow you to authenticate the repository. In general you have three main options for authentication:

    1. Register the repository without the -Credential parameter (see above) and use the device flow URL.
    2. Explicitly provide a credential. To do this, use the -Credential parameter when you register the PSRepository and provide a personal access token (PAT). For more information on how to get a PAT, check out the documentation. Note that if you choose this method the credentials will not be cached.
    3. Configure an environment variable with your credentials. To do this, set the VSS_NUGET_EXTERNAL_FEED_ENDPOINTS variable to {"endpointCredentials": [{"endpoint":"https://pkgs.dev.azure.com/'yourorganizationname'/_packaging/'yourfeedname'/nuget/v2", "username":"yourusername", "password":"accesstoken"}]} (see the PowerShell sketch after this list). Note that you may need to restart the agent service or the computer before the environment variables are available to the agent. For assistance with this step, check out the Artifacts Credential Provider GitHub.
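
For option 3, here is a PowerShell sketch of setting that variable; the organization name, feed name, username, and access token are placeholders you would replace:

# Build the endpoint credential JSON expected by the Azure Artifacts credential provider
$endpoints = @{
    endpointCredentials = @(
        @{
            endpoint = "https://pkgs.dev.azure.com/yourorganizationname/_packaging/yourfeedname/nuget/v2"
            username = 'yourusername'
            password = 'accesstoken'    # a personal access token, not your real password
        }
    )
}
$env:VSS_NUGET_EXTERNAL_FEED_ENDPOINTS = $endpoints | ConvertTo-Json -Depth 3 -Compress
# Note: this sets the variable for the current process only; for a build agent service,
# set it machine-wide (for example via System Properties) and restart the agent.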

Let’s publish a PowerShell Gallery package to our Azure DevOps feed.
To do this, first save the module, then publish it using the following commands:

Save-Module -Name SHiPS -Repository PSGallery -Path '.'
Publish-Module -Path 'path\to\module\SHiPS' -Repository myAzArtifactsRepo -NuGetApiKey '<any arbitrary string>'

Now that we have some packages let’s find and install them:

Find-Module -Name SHiPS -Repository myAzArtifactsRepo

Install-Module -Name SHiPS -Repository myAzArtifactsRepo

Now that you can manage packages on your feed, you may want to share it with other users. To manage access to your feed, use the feed settings in Azure Artifacts. For more information on this, check out the documentation.

Getting Feedback

If you encounter any issues we would love to hear about it on our GitHub page.
Please file an issue letting us know what we can do to make your experience better.

The post Using PowerShellGet with Azure Artifacts appeared first on PowerShell.

Azure Policy Guest Configuration – Client


This post builds upon the introduction published earlier to the PowerShell blog. In this post we are going to explore the Azure Policy Guest Configuration client and how configuration content is consumed. 

The full documentation for this service is available at the following short url. 

https://aka.ms/gcpol 

DSC service/daemon 

Inside Azure Policy Guest Configuration, you will find the new DSC engine as part of the extension for virtual machines. You can see this using the Run Command feature in Azure for any virtual machine that is being audited by Azure Policy using one of the Guest Configuration initiatives. 

The structure of the agent folders is the same for both operating systems. You will find a folder named DSC that contains the binaries for the engine, a folder named logs containing logs generated by the engine, and a subfolder named downloads that is used to support additional requirements. 

Here are some example commands you can use to take a look at the DSC engine yourself. 

Windows 

The Guest Configuration extension for Windows has not been published to GitHub yet.

List the DSC binaries in Guest Configuration in Windows 

Command (note the version is current at this time but could change): 

ls C:\Packages\Plugins\Microsoft.GuestConfiguration.ConfigurationforWindows\1.13.0.0\dsc\DSC\dsc_* 

Result: 

    Directory: C:\Packages\Plugins\Microsoft.GuestConfiguration.Configurationfo
    rWindows\1.13.0.0\dsc\DSC

Mode                LastWriteTime         Length Name                          
----                -------------         ------ ----                          
-a----         5/1/2019   1:16 PM         622592 dsc_client.exe                
-a----         5/1/2019   1:14 PM         532992 dsc_diagnostics.dll           
-a----         5/1/2019   1:14 PM         906752 dsc_infrastructure.dll        
-a----         5/1/2019   1:16 PM        1840640 dsc_pull_client.dll           
-a----         5/1/2019   1:17 PM         360960 dsc_reporting.dll             
-a----         5/1/2019   1:28 PM        1252864 dsc_rest_server.dll           
-a----         5/1/2019   1:28 PM         393216 dsc_service.exe               
-a----         5/1/2019   1:26 PM         956928 dsc_timer.dll   

You will also find an instance of pwsh.exe and supporting files in this same (version specific) folder. That is because Guest Configuration includes the portable installation of PowerShell Core so there is no need to manage PowerShell versions for the system. 

View the details of the DSC service in Guest Configuration in Windows 

Command: 

Get-Service DscService | fl * 

Result: 

Name                : DscService
RequiredServices    : {}
CanPauseAndContinue : False
CanShutdown         : False
CanStop             : True
DisplayName         : Guest Configuration Service
DependentServices   : {}
MachineName         : .
ServiceName         : DscService
ServicesDependedOn  : {}
ServiceHandle       : SafeServiceHandle
Status              : Running
ServiceType         : Win32OwnProcess
StartType           : Automatic
Site                :
Container           : 

Linux 

The Guest Configuration extension for Linux is published in GitHub here. 

List the DSC binaries in Guest Configuration in Linux 

Command (note the version is current at this time but could change): 

ls /var/lib/waagent/Microsoft.GuestConfiguration.ConfigurationforLinux-1.11.0/GCAgent/DSC/dsc_* 

Result: 

[stdout]
/var/lib/waagent/Microsoft.GuestConfiguration.ConfigurationforLinux-1.11.0/GCAgent/DSC/dsc_client
/var/lib/waagent/Microsoft.GuestConfiguration.ConfigurationforLinux-1.11.0/GCAgent/DSC/dsc_linux_service 

Currently, Guest Configuration in Linux only supports content in the format of Chef InSpec profiles. We expect this to soon open up to PowerShell-based resources and other tool formats. 

View the details of the DSC service in Guest Configuration in Linux 

Command: 

systemctl status dscd.service 

Result: 

● dscd.service - DSC Service
   Loaded: loaded (/lib/systemd/system/dscd.service; enabled; vendor preset: enabled)
   Active: active (running) since Tue 2019-06-04 18:13:28 UTC; 1h 48min ago
 Main PID: 22327 (dsc_linux_servi)
    Tasks: 42
   Memory: 11.9M
      CPU: 1.483s
   CGroup: /system.slice/dscd.service
           └─22327 /var/lib/waagent/Microsoft.GuestConfiguration.ConfigurationforLinux-1.11.0/GCAgent/DSC/dsc_linux_service

Jun 04 18:13:28 linux systemd[1]: Started DSC Service.
Jun 04 18:13:28 linux dsc_linux_service[22327]: Running DSC rest server...

Configurations 

Configurations in Guest Configuration are managed in a whole new way. There is no longer a need for partial configurations because many configurations can be managed independently. 

Please keep in mind that currently, configurations are used only for auditing settings and not enforcing the configuration. 

The model for Guest Configuration takes lessons learned from both the DSC Extension and State Configuration service. The Guest Configuration service does not require or support uploading and storing assets in the service, or a compilation service. Configurations are packaged in .zip format as they were for DSC Extension. A Guest Assignment in the Guest Configuration resource provider includes a reference to the location of the package, a hash value of the package file, and a table of parameters to be passed to the engine when the configuration is executed. 

For content provided by Microsoft, the configuration content is managed and replicated globally. When the package is downloaded to the machine, it is decompressed and extracted to a local folder. 

Each folder contains everything needed for DSC to manage the configuration including the mof, any resources required, and the metaconfiguration to use for that configuration. This means the mode, frequency, and other LCM settings can be unique per configuration. 
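
To see that layout for yourself on a Windows machine that is in scope of an audit policy, list the contents of one of those configuration folders (the folder name here matches one from the example in the next section; it will only exist if that particular policy is assigned):

ls C:\ProgramData\GuestConfig\Configuration\WindowsTimeZone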

View configuration content in Windows 

In this test case, the server is in scope of multiple audit policies. 

Command: 

ls c:\programdata\GuestConfig\Configuration\ 

Result: 

 Directory: C:\programdata\GuestConfig\Configuration

Mode                LastWriteTime         Length Name                          
----                -------------         ------ ----                          
d-----         6/5/2019   1:03 PM                WindowsDefenderExploitGuard   
d-----         6/5/2019   1:03 PM                WindowsDscConfiguration       
d-----         6/5/2019   1:03 PM                windowsfirewallenabled        
d-----         6/5/2019   1:03 PM                WindowsLogAnalyticsAgentConnection                          
d-----         6/5/2019   1:03 PM                WindowsPendingReboot          
d-----         6/5/2019   1:04 PM                WindowsPowerShellModules      
d-----         6/5/2019   1:04 PM                WindowsTimeZone                

View configuration content in Linux 

In this test case, the server is in scope of only one audit policy. 

Command: 

ls /var/lib/GuestConfig/Configuration 

Result: 

[stdout]
firewalldenabled 

Logs 

Log output is available on each node within the agent folder named Logs. This can also be returned using Run Command; however, the output is limited to the last 4096 bytes, so it is best to filter the logs for only what you are looking for. Example approaches are given below. 

View error messages in Windows logs 

Command (note the version is current at this time but could change): 

Select-String -Path 'C:\Packages\Plugins\Microsoft.GuestConfiguration.ConfigurationforWindows\1.13.0.0\dsc\logs\dsc.log' -pattern 'DSC*Engine' -CaseSensitive -Context 0,5 

Result (this is a short snippet of the actual output): 

 [INFO] [00000000-0000-0000-0000-000000000000] Job 
3af0538a-35f4-415f-b5b8-70ae3099e6a2 : Operation Get-DscConfiguration 
completed successfully. 

View error messages in Linux logs 

Command (note the version is current at this time but could change): 

grep -A 5 'DSC*Engine' '/var/lib/waagent/Microsoft.GuestConfiguration.ConfigurationforLinux-1.11.0/GCAgent/logs/dsc.log' 

Result (this is a short snippet of the actual output): 

 [2019-06-05 18:14:22.772] [PID 30775] [TID 30824] [DSCEngine] [INFO] [00000000-0000-0000-0000-000000000000] Job 6ae51953-24aa-44e8-8abb-4ec522cc5b1f : Method CU_TestConfiguration ended successfully 

Thank you!
Michael Greene
Principal Program Manager
Microsoft Azure
@migreene 

 

The post Azure Policy Guest Configuration – Client appeared first on PowerShell.

Azure Policy Guest Configuration – Service


This post builds upon the introduction published earlier to the PowerShell blog. In this post we are going to explore the Azure Policy Guest Configuration service. 

The full documentation for this service is available at the following short url. 

https://aka.ms/gcpol

Resource provider 

At a fundamental level, this solution includes a new resource provider in Azure named “Microsoft.GuestConfiguration” and new virtual machine extensions named “Microsoft.GuestConfiguration.GuestConfigurationforLinux” and “Microsoft.GuestConfiguration.GuestConfigurationforWindows”. 

The new engine is delivered to the virtual machine by the extension. When the service/daemon starts, it queries the service to see if there are any jobs for the virtual machine. If so, it downloads the content (DSC mof/modules, packaged in the same way as DSC extension), performs the work, and reports status. Behind the scenes, microservices and big data platforms ensure data remains geographically local. 

The following diagram demonstrates the flow of requests as a Guest Assignment is published: the list of assignments is requested by the VM, the VM downloads and runs the configuration, status is returned to a regional location, and summary information is presented to Azure Policy. 


You can think of the resource provider as surfacing VM scenarios into the Azure Resource Manager API. If you require that only one group of users should have administrative privilege inside a server, you can express that requirement as a Guest Assignment in Azure Resource Manager. This is just a reference to a configuration that checks (“Test”) whether only that group has the intended access and returns who currently has access (“Get”). The scenario is given as a property of the virtual machine through a provider resource, “Microsoft.GuestConfiguration”. 

Here is an example of that in code: 

{
  "apiVersion": "2018-11-20",
  "type": "Microsoft.Compute/virtualMachines/providers/guestConfigurationAssignments",
  "name": "[concat(parameters('vmName'), '/Microsoft.GuestConfiguration/', parameters('configurationName'))]",
  "location": "[parameters('location')]",
  "properties": {
    "guestConfiguration": {
      "name": "[parameters('configurationName')]",
      "version": "1.*",
      "configurationParameter": [
        {
          "name": "[LocalGroup]AdministratorsGroup;Members",
          "value": "[parameters('Members')]"
        }
      ]
    }
  }
}

Using built-in policies 

One of our learnings from DSC has been to reduce complexity. We are aiming for a solution that you can just enable and immediately see value. A good example of this approach was Active Directory Group Policy. While Group Policy presented challenges for developers looking to rapidly iterate between builds, the concept of just picking the settings you need and turning them on has been popular with large enterprises. 

Our current list of built-in content is below. It is growing nearly every week. You can view this list in the Azure Portal by opening Policy and clicking Definitions, then changing the ‘Type’ filter to ‘Initiative’ and the ‘Category’ filter to ‘Guest Configuration’. You can also run the following command to get a current list using PowerShell with the Az module. 

Get-AzPolicySetDefinition -Builtin | ? {$_.Properties.Metadata.Category -eq "Guest Configuration"} | % {$_.Properties.DisplayName} 

NOTE: when assigning these policies, it is important to use the Initiative. The DeployIfNotExists policy loads the VM extension, which is required for the Audit/AuditIfNotExists policies in Guest Configuration to work properly. 

Policy initiative display names:

  • Audit Windows VMs in which the Administrators group does not contain only the specified members
  • [Preview]: Audit Windows VMs on which the Log Analytics agent is not connected as expected
  • Audit Windows VMs in which the Administrators group does not contain all of the specified members
  • Audit Windows VMs that do not have the specified applications installed
  • [Preview]: Audit VMs with insecure password security settings
  • Audit Windows VMs that are not set to the specified time zone
  • Audit Windows VMs that are not joined to the specified domain
  • Audit Windows web servers that are not using secure communication protocols
  • [Preview]: Audit Windows VMs on which Windows Defender Exploit Guard is not enabled
  • Audit Windows Server VMs on which Windows Serial Console is not enabled
  • Audit Windows VMs in which the Administrators group contains any of the specified members
  • [Preview]: Audit Windows VMs that contain certificates expiring within the specified number of days
  • [Preview]: Audit Windows VMs that have not restarted within the specified number of days
  • [Preview]: Audit Windows VMs on which the DSC configuration is not compliant
  • Audit Linux VMs that do not have the specified applications installed
  • Audit Windows VMs with a pending reboot
  • Audit Windows VMs that do not have the specified Windows PowerShell modules installed
  • [Preview]: Audit Windows VMs that do not contain the specified certificates in Trusted Root
  • Audit Windows VMs that have the specified applications installed
  • Audit Linux VMs that have the specified applications installed

Viewing results 

You can view the results of the policy in the Azure Portal (as described here) and you can also get results using the cmdlets provided by a new module named Az.GuestConfiguration. A step-by-step guide for using these cmdlets is available here. 

The key scenario for the cmdlets is to use the Get-AzVmGuestPolicyStatusHistory cmdlet with the -ShowOnlyChange parameter. This will tell you every time a VM was out of compliance over the reporting period and why. 
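
A sketch of that call; the resource group and VM names are placeholders, and the parameter names follow the post above (check the module documentation linked below for the exact signature):

# List only the compliance-state changes for a VM over the reporting period
Get-AzVmGuestPolicyStatusHistory -ResourceGroupName 'myResourceGroup' -VMName 'myVM' -ShowOnlyChange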

The Az Guest Configuration cmdlets are documented here. 

You can also directly query the Azure Policy Guest Configuration REST API to see the results of your audits. This includes the “Compliance Reasons” data, which returns the raw information from the tool used to perform the audit. For Windows, this data includes the name of the DSC resource used to run Test and Get, and the data returned. For Linux, this includes the fully formatted output from InSpec. For both platforms, we intend to be open and extensible going forward. 

The API for getting compliance details is documented here. 
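
To illustrate the shape of such a REST call, here is a sketch using Invoke-RestMethod; the subscription, resource group, VM, and assignment names are placeholders, $token is a bearer token you have already acquired, and the api-version is taken from the ARM example above (confirm the exact path and version in the linked documentation):

# Query the compliance reports for one guest configuration assignment
$uri = 'https://management.azure.com/subscriptions/<subscriptionId>/resourceGroups/<resourceGroup>' +
       '/providers/Microsoft.Compute/virtualMachines/<vmName>' +
       '/providers/Microsoft.GuestConfiguration/guestConfigurationAssignments/<assignmentName>' +
       '/reports?api-version=2018-11-20'
Invoke-RestMethod -Uri $uri -Headers @{ Authorization = "Bearer $token" }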

Thank you!
Michael Greene
Principal Program Manager
Microsoft Azure
@migreene 

The post Azure Policy Guest Configuration – Service appeared first on PowerShell.

DSC Planning Update – June 2019


It has been almost a year since the last DSC Planning update. There has been a lot going on, many decisions being made, and it just didn’t make sense to post earlier in this calendar year. In this post we will review what has been shipped and the high-level direction we are heading. 

I am accompanying this post with write-ups that are for the more technical audience. In two parts, I would like to explain the implementation of the Guest Configuration client/service and exactly how the new DSC engine functions. 

If you take nothing else away, here are the top-level items: 

  • The new implementation of DSC is Azure Policy Guest Configuration 
  • The solution is GA for built-in content and is moving towards a preview for custom content 
  • Your skill set and your DSC scripts/modules can be used in a new way 

Azure Policy Guest Configuration 

Previously we have referred to the new DSC codebase under different names: DSC Core and the new LCM. We also disclosed that the platform would be used in Azure Policy Guest Configuration. 

What have we shipped? 

The DSC codebase we have been working on is now fully GA as Azure Policy Guest Configuration, but this is not the DSC you have known up to this point. It is best to think of Azure Policy Guest Configuration as based on the DSC syntax but functionally a new platform. 

The intention for this service is to build confidence so application developers/owners are free to deploy servers when they need them without putting the organization at risk. Building this platform on a tool that was designed with operations in mind helps us to look beyond the types of settings that we thought about in platforms such as Group Policy. We can include operational requirements such as making sure all servers have a healthy monitoring agent, logging configuration, and the correct certificates in place to function in an enterprise environment. 

DSC has been the basis for other Azure solutions such as the Azure DSC Extension and Azure Automation State Configuration, that help you to configure virtual machines. Azure Policy Guest Configuration currently provides an audit platform to validate settings inside virtual machines. 

The full documentation for this service is available at the following short url. 

https://aka.ms/gcpol 

If you would like to continue reading about how this service is technically implemented, the two technical write-ups are published to accompany this post. 

Azure Policy Guest Configuration – Service

Azure Policy Guest Configuration – Client

High level direction forward 

For the next semester (the second half of the 2019 calendar year) we are focused on iterating on our first release of this solution, introducing the ability for you to use your own content for auditing machines, and enabling you to also enforce settings inside virtual machines using Azure Policy. 

It is important for many people to understand what the options will be to use DSC in disconnected scenarios going forward.  We are considering our options in this area and taking the feedback seriously.  I hope to have more to share on this area in the future. 

Iterating upon our first release of the solution includes multiple areas where we believe we can make life easier for customers. One of the patterns we have observed is customers assigning an audit policy but forgetting to assign the policy that handles automatically onboarding servers.  In the future we believe we can make this simpler.  We have also heard from customers that they would like to have the option to bulk export data about virtual machine compliance so it can be used in other tools, and that they would like to use the solution to audit servers running outside of Azure. 

We hope to enable customers to use their own content, and the tools of their choice, when auditing settings inside virtual machines. As an example, we have heard from Chef customers that they would like to be able to use InSpec to audit Windows Servers. As a result, we announced in our session at Chef Conference that we will be co-maintaining a Guest Configuration provider for InSpec as a collaborative open source project that customers can use in Azure Policy Guest Configuration. More information can be found here. 

We are investing in getting the user experience right for developing custom content, cross platform for the developer workstation, and having a validation and troubleshooting experience that improves on lessons we learned with DSC. We will soon be moving into a public preview of custom content. In the meantime, you are welcome to give us feedback in our request for comments public GitHub repo here. 

Finally, we are investigating the right approaches for enforcing settings inside virtual machines using Azure Policy. With this scope in mind, I would like to invite you to respond to an (anonymous) survey to provide feedback on your top requirements. 

Survey link 

Thank you!
Michael Greene
Principal Program Manager
Microsoft Azure
@migreene 

The post DSC Planning Update – June 2019 appeared first on PowerShell.

Release of PowerShell Script Analyzer 1.18.1


Overview

PSScriptAnalyzer (PSSA) 1.18.1 is now available on the PSGallery; it fixes many of the issues reported for 1.18.0 and is also roughly twice as fast as 1.18.0. Additionally, the -SaveDscDependency switch on Invoke-ScriptAnalyzer has been improved to be platform agnostic and should now also work on Linux systems if DSC has been set up. A long-standing concurrency bug related to analyzing module manifests has also been fixed. Analysis showed that Test-ModuleManifest is not thread-safe due to a bug either in the cmdlet or in the PowerShell engine itself; we resolved it by placing a lock around calls to this cmdlet.

Formatter Fixes

This applies especially to its usage within the VS Code PowerShell extension:

  • The new PSUseCorrectCasing formatting rule had to be adjusted to not expand/change paths and to treat wildcard characters correctly. Under the hood the rule calls Get-Command, and because Get-Command ? returns all commands that have a name of length 1, it returned ForEach-Object first, which made PSSA incorrectly change the ? alias for Where-Object to ForEach-Object. The PowerShell VS Code extension has the powershell.codeFormatting.useCorrectCasing setting that wraps around this configuration; the setting currently defaults to false due to those issues. With PSSA 1.18.1, we’d encourage you to enable the setting again, as we think we have fixed all the issues; pending feedback, we plan to enable the setting by default. Although the VS Code extension ships a backup version of PSSA (currently 1.18.0), you can always install PSScriptAnalyzer locally and the extension will pick it up. You can install the newer PSScriptAnalyzer version and start using it without having to wait for the extension to release an update.
  • The new PipelineIndentation configuration setting of the PSUseConsistentIndentation formatting rule had a bug when it was set to IncreaseIndentationForFirstPipeline or IncreaseIndentationAfterEveryPipeline: in certain cases, code following the pipeline could be incorrectly indented. Currently the VS Code setting powershell.codeFormatting.pipelineIndentationStyle is set to NoIndentation to avoid this bug. We encourage you here as well to try out the options again (see the sketch after this list) so that we can get feedback before we set the default of the VS Code setting to IncreaseIndentationForFirstPipeline (which is the default when calling Invoke-Formatter without parameters). This desired default was voted for by the community here.
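
If you want to experiment with these indentation options without touching your VS Code settings, you can call the formatter directly; a minimal sketch:

$scriptText = @'
Get-Process |
Where-Object CPU -gt 100 |
Sort-Object CPU -Descending
'@

# Apply only the indentation rule, using the proposed default pipeline style
Invoke-Formatter -ScriptDefinition $scriptText -Settings @{
    IncludeRules = @('PSUseConsistentIndentation')
    Rules        = @{
        PSUseConsistentIndentation = @{
            Enable              = $true
            PipelineIndentation = 'IncreaseIndentationForFirstPipeline'
        }
    }
}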

Conclusion and Future Outlook

Please try out this new patch. If you install it using Install-Module, the VS Code extension will automatically use it after a restart of the integrated terminal session or just by re-opening VS Code. Getting feedback in this period is very important so that the PowerShell team can make a decision on when to include 1.18.1 by default in one of the next updates of the PowerShell extension. After feedback from this phased rollout, we will consider changing the default settings in the extension as mentioned above. It is hard to anticipate all the use cases, so we chose to make features configurable behind new flags and roll out the changes to a smaller user group first.

The Changelog has more details if you want to dig further.

On behalf of the Script Analyzer team,

Christoph Bergmeister, Project Maintainer from the community
Jim Truher, Senior Software Engineer, Microsoft

The post Release of PowerShell Script Analyzer 1.18.1 appeared first on PowerShell.

Announcing General Availability of the Windows Compatibility Module 1.0.0


The Windows Compatibility module (WindowsCompatibility) is a PowerShell module that lets PowerShell Core 6 scripts access Windows PowerShell modules that are not yet natively available on PowerShell Core. (Note: the list of unavailable commands is getting smaller with each new release of PowerShell Core. This module is just for things that aren’t natively supported yet.)

You can install the module from the PowerShell Gallery using the command

Install-Module WindowsCompatibility

and the source code is available on GitHub. (This is where you should open issues or make suggestions.)

Once you have WindowsCompatibility installed, you can start using it. The first thing you might want to run is Get-WinModule, which will show you the list of available modules. From that list, choose a module, say PKI, and load it. To do this, run the following command:

Import-WinModule PKI

and you’ll have the commands exported by the PKI module in your local session. You can run them just like any other command. For example:

New-SelfSignedCertificate -DnsName localhost

As always, you can see what a module exports by running:

Get-Command -module PKI

just like any other module.

These are the most important commands, but the WindowsCompatibility module provides some others:

  • Invoke-WinCommand allows you to invoke a one-time command in the compatibility session.
  • Add-WinFunction allows you to define new functions that operate implicitly in the compatibility session.
  • Compare-WinModule lets you compare what you have against what’s available.
  • Copy-WinModule will let you copy Windows PowerShell modules that are known to work in PowerShell 6 to the PowerShell 6 command path.
  • Initialize-WinSession gives you more control over where and how the compatibility session is created. For example, it will allow you to place the compatibility session on another machine.

(See the module’s command help for more details and examples on how to use the WindowsCompatibility functions.)

How It Works

The WindowsCompatibility module takes advantage of the ‘Implicit Remoting‘ feature that has been available in PowerShell since version 2. Implicit remoting works by retrieving command metadata from a remote session and synthesizing proxy functions in the local session. When you call one of these proxy functions, it takes all of the parameters passed to it and forwards them to the real command in the “remote” session. Wait a minute, you may be thinking – what does remoting have to do with the WindowsCompatibility module? WindowsCompatibility automatically creates and manages a ‘local remote’ session, called the ‘compatibility session’, that runs with Windows PowerShell on the local machine. It imports the specified module and then creates local proxy functions for all of the commands defined in that module.
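
You can see the compatibility session at work without importing anything; for example, Invoke-WinCommand runs a script block in that hidden Windows PowerShell session:

# Runs in the Windows PowerShell compatibility session rather than the local PowerShell Core session
Invoke-WinCommand { $PSVersionTable.PSEdition }   # returns 'Desktop'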

OK – what about modules that exist in both Windows PowerShell and PowerShell Core? Yes – you can import them. After all, there are still a fair number of base cmdlets that aren’t available in PowerShell Core yet.

So how does this work? WindowsCompatibility is very careful not to overwrite native PowerShell Core commands. It only imports the ones that are available with Windows PowerShell but not with PowerShell Core. For example, the following will import the PowerShell default management module:

Import-WinModule Microsoft.PowerShell.Management

which contains, among others, the Get-EventLog cmdlet. None of the native PowerShell Core cmdlets get overwritten, but now you have Get-EventLog available in your session.
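
For example, the proxied cmdlet is used exactly as it would be in Windows PowerShell:

# Get-EventLog is not native to PowerShell Core 6; the proxy forwards this call transparently
Get-EventLog -LogName System -Newest 3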

At this point, if you call Get-Module, you will see something a bit strange:

Get-Module | ForEach-Object Name

results in output that looks like:

Microsoft.PowerShell.Management
Microsoft.PowerShell.Management.WinModule
Microsoft.PowerShell.Utility
NetTCPIP

Import-WinModule renames the compatibility module at load time to prevent collisions with identically named modules. This is so that module-qualified commands will resolve against the current module. In fact, if you want to see what additional commands were imported, you can run:

Get-Command -Module Microsoft.PowerShell.Management.WinModule

Limitations

Because WindowsCompatibility is based on implicit remoting, there are a number of significant limitations on the cmdlets imported by the module. First, because everything is done using the remoting protocol, the imported cmdlets will return deserialized objects that only contain properties. Much of the time, this won’t matter because the parameter binder binds by property name rather than by object type. As long as the required properties are present on the object, it doesn’t matter what type the object actually is. There are, however, cases where the cmdlet actually requires that the object be of a specific type or that it have methods. WindowsCompatibility won’t work for these cmdlets.
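
One way to observe this is to inspect the type name of a returned object (a sketch, reusing the Microsoft.PowerShell.Management example from above):

Import-WinModule Microsoft.PowerShell.Management
$entry = Get-EventLog -LogName System -Newest 1
# The proxy returns a deserialized object: the properties survive,
# but the original type and its methods do not
$entry.PSTypeNames[0]    # e.g. Deserialized.System.Diagnostics.EventLogEntry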

Windows Forms and other graphical tools

The remoting session is considered non-interactive, so graphical tools such as Notepad or WinForms scripts will either fail or, worse, hang.

Linux and Mac support

This module depends on WinRM, and the WinRM client libraries on these platforms are known to be unstable and limited. So for this release, only PowerShell Core running on Windows is supported. (This may change in the future. But you’ll still need a Windows machine with Windows PowerShell to host the compatibility session.)

PowerShell 6.1 Dependency

WindowsCompatibility depends on a feature introduced in PowerShell Core 6.1 that keeps the current working directories of the local and compatibility sessions synchronized. Earlier versions of PowerShell will work with WindowsCompatibility but won’t have this directory synchronization feature. So if you’re running PowerShell Core 6.0 and you import a command that writes to files, do Set-Location to a new directory, and then use that command to write to a file with an unqualified path, it will use the original path from when the module was imported rather than your session’s current working directory. On PowerShell Core 6.1, it will correctly use the current working directory.
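
As a hedged illustration of the 6.0 behavior (MyLoggingModule and Write-WinLog are made-up names standing in for any imported module and command that writes to a file):

Import-WinModule MyLoggingModule     # hypothetical Windows PowerShell module
Set-Location C:\Temp
Write-WinLog -Path .\out.log         # 6.0: '.\' resolves against the directory that was current
                                     #      when the module was imported
                                     # 6.1+: resolves against C:\Temp, as expected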

Summary

To sum it all up, the WindowsCompatibility module provides a set of commands that allow you to access Windows PowerShell modules from PowerShell Core 6. There are, however, some limitations that make it unsuitable for some scenarios. Over time, as more and more modules are ported to .NET Core/PowerShell 6 natively, there will be less need for this module.

Cheers!
Bruce Payette,
PowerShell Team.


PowerShell Constrained Language mode and the Dot-Source Operator


PowerShell works with application control systems, such as AppLocker and Windows Defender Application Control (WDAC), by automatically running in ConstrainedLanguage mode. ConstrainedLanguage mode restricts some exploitable aspects of PowerShell while still giving you a rich shell to run commands and scripts in. This is different from the usual application whitelisting rules, where an application is either allowed to run or not.

But there are times when the full power of PowerShell is needed, so we allow script files to run in FullLanguage mode when they are trusted by the policy. Trust can be indicated through file signing or other policy mechanisms such as file hash. However, script typed into the interactive shell is always run constrained.

Since PowerShell can run script in both Full and Constrained language modes, we need to protect the boundary between them. We don’t want to leak variables or functions between sessions running in different language modes.

The PowerShell dot-source operator brings script files into the current session scope. It is a way to reuse script. All script functions and variables defined in the script file become part of the script it is dot sourced into. It is like copying and pasting text from the script file directly into your script.

# HelperFn1, HelperFn2 are defined in HelperFunctions.ps1
# Dot-source the file here to get access to them (no need to copy/paste)
. c:\Scripts\HelperFunctions.ps1
HelperFn1
HelperFn2

This presents a problem when language modes are in effect with system application control. If an untrusted script is dot-sourced into a script with full trust then it has access to all those functions that run in FullLanguage mode, which can result in application control bypass through arbitrary code execution or privilege escalation. Consequently, PowerShell prevents this by throwing an error when dot-sourcing is attempted across language modes.

Example 1:

System is in WDAC policy lock down. To start with, neither script is trusted, so both run in ConstrainedLanguage mode. But the HelperFn1 function uses method invocation, which isn’t allowed in that mode.

PS> type c:\MyScript.ps1
Write-Output "Dot sourcing MyHelper.ps1 script file"
. c:\MyHelper.ps1
HelperFn1
PS> type c:\MyHelper.ps1
function HelperFn1
{
    "Language mode: $($ExecutionContext.SessionState.LanguageMode)"
    [System.Console]::WriteLine("This can only run in FullLanguage mode!")
}
PS> c:\MyScript.ps1
Dot sourcing MyHelper.ps1 script file
Language mode: ConstrainedLanguage
Cannot invoke method. Method invocation is supported only on core types in this language mode.
At C:\MyHelper.ps1:4 char:5
+     [System.Console]::WriteLine("This cannot run in ConstrainedLangua ...
+     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: (:) [], RuntimeException
    + FullyQualifiedErrorId : MethodInvocationNotSupportedInConstrainedLanguage

Both scripts are untrusted and run in ConstrainedLanguage mode, so dot-sourcing the MyHelper.ps1 file works. However, the HelperFn1 function performs method invocation that is not allowed in ConstrainedLanguage and fails when run. MyHelper.ps1 needs to be signed as trusted so it can run at FullLanguage.

Next we have mixed language modes. MyHelper.ps1 is signed and trusted, but MyScript.ps1 is not.

PS> c:\MyScript.ps1
Dot sourcing MyHelper.ps1 script file
C:\MyHelper.ps1 : Cannot dot-source this command because it was defined in a different language mode. To invoke this command without importing its contents, omit the '.' operator.
At C:\MyScript.ps1:2 char:1
+ . 'c:\MyHelper.ps1'
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: (:) [MyHelper.ps1], NotSupportedException
    + FullyQualifiedErrorId : DotSourceNotSupported,MyHelper.ps1
...

And we get a dot-source error because we are trying to dot-source script that has a different language mode than the session it is being dot-sourced into.

Finally, we sign as trusted both script files and everything works.

PS> c:\MyScript.ps1
Dot sourcing MyHelper.ps1 script file
Language mode: FullLanguage
This can only run in FullLanguage mode!

The lesson here is to ensure all script components run in the same language mode on policy locked down systems. If one component must run in FullLanguage mode, then all components should run in FullLanguage mode. This means validating that each component is safe to run in FullLanguage and indicating they are trusted to the application control policy.

So this solves all language mode problems, right? If FullLanguage is not needed, then just ensure all script components run untrusted, which is the default condition. If they require FullLanguage, then carefully validate all components and mark them as trusted. Unfortunately, there is one case where this best practice doesn’t work.

PowerShell Profile File

The PowerShell profile file (profile.ps1) is loaded and run at PowerShell start up. If that script requires FullLanguage mode on policy lock down systems, you just validate and sign the file as trusted, right?

Example 2:

PS> type c:\users\<user>\Documents\WindowsPowerShell\profile.ps1
Write-Output "Running Profile"
[System.Console]::WriteLine("This can only run in FullLanguage!")
# Sign file so it is trusted and will run in FullLanguage mode
PS> Set-AuthenticodeSignature -FilePath .\Profile.ps1 -Certificate $myPolicyCert
# Start a new PowerShell session and run the profile script
PS> powershell.exe
Windows PowerShell
Copyright (C) Microsoft Corporation. All rights reserved.
C:\Users\<user>\Documents\WindowsPowerShell\profile.ps1 : Cannot dot-source this command because it was defined in a different language mode. To invoke this command without importing its contents, omit the '.' operator.
At line:1 char:1
+ . 'C:\Users\<user>\Documents\WindowsPowerShell\profile.ps1'
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: (:) [profile.ps1], NotSupportedException
    + FullyQualifiedErrorId : DotSourceNotSupported,profile.ps1

What gives? The profile.ps1 file was signed and is policy trusted. Why the error?
Well, the issue is that PowerShell dot-sources the profile.ps1 file into the default PowerShell session, which must run in ConstrainedLanguage mode because of the policy. So we are attempting to dot-source a FullLanguage script into a ConstrainedLanguage session, and that is not allowed. This is a catch-22: if profile.ps1 is not signed, it may not run if it needs FullLanguage privileges (e.g., to invoke methods); but if you sign it, it still won’t run because of how it is dot-sourced into the current ConstrainedLanguage interactive session.

Unfortunately, the only solution is to keep the profile.ps1 file fairly simple so that it does not need FullLanguage, and refrain from making it trusted. Keep in mind that this is only an issue when running with application control policy. Otherwise, language modes do not come into play and PowerShell profile files run normally.
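
One pragmatic pattern (my suggestion, not from the examples above) is to guard any optional FullLanguage-only code in the profile, so the profile still loads cleanly when it runs constrained:

if ($ExecutionContext.SessionState.LanguageMode -eq 'FullLanguage')
{
    # Optional extras that need method invocation, arbitrary .NET types, etc.
}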

Paul Higinbotham
Senior Software Engineer
PowerShell Team

DSC Resource Kit Release November 2018


We just released the DSC Resource Kit!

This release includes updates to 9 DSC resource modules. In the past 6 weeks, 61 pull requests have been merged and 67 issues have been closed, all thanks to our amazing community!

The modules updated in this release are:

  • AuditPolicyDsc
  • DFSDsc
  • NetworkingDsc
  • SecurityPolicyDsc
  • SharePointDsc
  • StorageDsc
  • xBitlocker
  • xExchange
  • xHyper-V

For a detailed list of the resource modules and fixes in this release, see the Included in this Release section below.

Our latest community call for the DSC Resource Kit was supposed to be today, November 28, but the public link to the call expired, so the call was cancelled. I will update the link for next time. If there is interest in rescheduling this call, the new call time will be announced on Twitter (@katiedsc or @migreene). The call for the next release cycle is also getting moved a week later than usual, to January 9 at 12PM (Pacific standard time). Join us to ask questions and give feedback about your experience with the DSC Resource Kit.

The next DSC Resource Kit release will be on Wednesday, January 9.

We strongly encourage you to update to the newest version of all modules using the PowerShell Gallery, and don’t forget to give us your feedback in the comments below, on GitHub, or on Twitter (@PowerShell_Team)!

Please see our documentation here for information on the support of these resource modules.

Included in this Release

You can see a detailed summary of all changes included in this release in the table below. For past release notes, go to the README.md or CHANGELOG.md file on the GitHub repository page for a specific module (see the How to Find DSC Resource Modules on GitHub section below for details on finding the GitHub page for a specific module).

Module Name Version Release Notes
AuditPolicyDsc 1.3.0.0
  • Update LICENSE file to match the Microsoft Open Source Team standard.
  • Added the AuditPolicyGuid resource.
DFSDsc 4.2.0.0
  • Add support for modifying staging quota size in MSFT_DFSReplicationGroupMembership – fixes Issue 77.
  • Refactored module folder structure to move resource to root folder of repository and remove test harness – fixes Issue 74.
  • Updated Examples to support deployment to PowerShell Gallery scripts.
  • Remove exclusion of all tags in appveyor.yml, so all common tests can be run if opt-in.
  • Added .VSCode settings for applying DSC PSSA rules – fixes Issue 75.
  • Updated LICENSE file to match the Microsoft Open Source Team standard – fixes Issue 79
NetworkingDsc 6.2.0.0
  • Added .VSCode settings for applying DSC PSSA rules – fixes Issue 357.
  • Updated LICENSE file to match the Microsoft Open Source Team standard – fixes Issue 363
  • MSFT_NetIPInterface:
    • Added a new resource for configuring the IP interface settings for a network interface.
SecurityPolicyDsc 2.6.0.0
  • Added SecurityOption – Network_access_Restrict_clients_allowed_to_make_remote_calls_to_SAM
  • Bug fix – Issue 105 – Spelling error in SecurityOption “User_Account_Control_Behavior_of_the_elevation_prompt_for_standard_users”
  • Bug fix – Issue 90 – Corrected value for Microsoft_network_server_Server_SPN_target_name_validation_level policy
SharePointDsc 3.0.0.0
  • Changes to SharePointDsc
    • Added support for SharePoint 2019
    • Added CredSSP requirement to the Readme files
    • Added VSCode Support for running SharePoint 2019 unit tests
    • Removed the deprecated resources SPCreateFarm and SPJoinFarm (replaced in v2.0 by SPFarm)
  • SPBlobCacheSettings
    • Updated the Service Instance retrieval to be language independent
  • SPConfigWizard
    • Fixed check for Ensure=Absent in the Set method
  • SPInstallPrereqs
    • Added support for detecting updated installation of Microsoft Visual C++ 2015/2017 Redistributable (x64) for SharePoint 2016 and SharePoint 2019.
  • SPSearchContentSource
    • Added support for Business Content Source Type
  • SPSearchMetadataCategory
    • New resource added
  • SPSearchServiceApp
    • Updated resource to make sure the presence of the service app proxy is checked and created if it does not exist
  • SPSecurityTokenServiceConfig
    • The resource only tested for the Ensure parameter. Added more parameters
  • SPServiceAppSecurity
    • Added support for specifying array of access levels.
    • Changed implementation to use Grant-SPObjectSecurity with Replace switch instead of using a combination of Revoke-SPObjectSecurity and Grant-SPObjectSecurity
    • Added all supported access levels as available values.
    • Removed unknown access levels: Change Permissions, Write, and Read
  • SPUserProfileProperty
    • Removed obsolete parameters (MappingConnectionName, MappingPropertyName, MappingDirection) and introduced new parameter PropertyMappings
  • SPUserProfileServiceApp
    • Updated the check for successful creation of the service app to throw an error if this is not done correctly.
The following changes will break v2.x and earlier configurations that use these resources:
  • Implemented IsSingleInstance parameter to force that the resource can only be used once in a configuration for the following resources:
    • SPAntivirusSettings
    • SPConfigWizard
    • SPDiagnosticLoggingSettings
    • SPFarm
    • SPFarmAdministrators
    • SPInfoPathFormsServiceConfig
    • SPInstall
    • SPInstallPrereqs
    • SPIrmSettings
    • SPMinRoleCompliance
    • SPPasswordChangeSettings
    • SPProjectServerLicense
    • SPSecurityTokenServiceConfig
    • SPShellAdmin
  • Standardized Url/WebApplication parameter to default WebAppUrl parameter for the following resources:
    • SPDesignerSettings
    • SPFarmSolution
    • SPSelfServiceSiteCreation
    • SPWebAppBlockedFileTypes
    • SPWebAppClientCallableSettings
    • SPWebAppGeneralSettings
    • SPWebApplication
    • SPWebApplicationAppDomain
    • SPWebAppSiteUseAndDeletion
    • SPWebAppThrottlingSettings
    • SPWebAppWorkflowSettings
  • Introduced new mandatory parameters
    • SPSearchResultSource: Added option to create Result Sources at different scopes.
    • SPServiceAppSecurity: Changed parameter AccessLevel to AccessLevels in MSFT_SPServiceAppSecurityEntry to support array of access levels.
    • SPUserProfileProperty: New parameter PropertyMappings
SharePointDsc 3.1.0.0
  • Changes to SharePointDsc
    • Updated LICENSE file to match the Microsoft Open Source Team standard.
  • ProjectServerConnector
    • Added a file hash validation check to prevent the ability to load custom code into the module.
  • SPFarm
    • Fixed localization issue where TypeName was in the local language.
  • SPInstallPrereqs
    • Updated links in the Readme.md file to docs.microsoft.com.
    • Fixed required prereqs for SharePoint 2019, added MSVCRT11.
  • SPManagedMetadataServiceApp
    • Fixed issue where Get-TargetResource method throws an error when the service app proxy does not exist.
  • SPSearchContentSource
    • Corrected issue where the New-SPEnterpriseSearchCrawlContentSource cmdlet was called twice.
  • SPSearchServiceApp
    • Fixed issue where Get-TargetResource method throws an error when the service application pool does not exist.
    • Implemented check to make sure cmdlets are only executed when it actually has something to update.
    • Deprecated WindowsServiceAccount parameter and moved functionality to new resource (SPSearchServiceSettings).
  • SPSearchServiceSettings
    • Added new resource to configure search service settings.
  • SPServiceAppSecurity
    • Fixed unavailable utility method (ExpandAccessLevel).
    • Updated the schema to no longer specify username as key for the sub class.
  • SPUserProfileServiceApp
    • Fixed issue where localized versions of Windows and SharePoint would throw an error.
  • SPUserProfileSyncConnection
    • Corrected implementation of Ensure parameter.
StorageDsc 4.3.0.0
  • WaitForDisk:
    • Added readonly-property isAvailable which shows the current state of the disk as a boolean – fixes Issue 158.
xBitlocker 1.3.0.0
  • Update appveyor.yml to use the default template.
  • Added default template files .gitattributes, and .vscode settings.
  • Fixes most PSScriptAnalyzer issues.
  • Fix issue where AutoUnlock is not set if requested, if the disk was originally encrypted and AutoUnlock was not used.
  • Add remaining Unit Tests for xBitlockerCommon.
  • Add Unit tests for MSFT_xBLTpm
  • Add remaining Unit Tests for xBLAutoBitlocker
  • Add Unit tests for MSFT_xBLBitlocker
  • Moved change log to CHANGELOG.md file
  • Fixed Markdown validation warnings in README.md
  • Added .MetaTestOptIn.json file to root of module
  • Add Integration Tests for module resources
  • Rename functions with improper Verb-Noun constructs
  • Add comment based help to any functions without it
  • Update Schema.mof Description fields
  • Fixes issue where Switch parameters are passed to Enable-Bitlocker even if the corresponding DSC resource parameter was set to False (Issue 12)
xExchange 1.25.0.0
  • Opt-in for the common test flagged Script Analyzer rules (issue 234).
  • Opt-in for the common test testing for relative path length.
  • Removed the property PSDscAllowPlainTextPassword from all examples so the examples are secure by default. The property PSDscAllowPlainTextPassword was previously needed to (test) compile the examples in the CI pipeline, but now the CI pipeline is using a certificate to compile the examples.
  • Opt-in for the common test that validates the markdown links.
  • Fix typo of the word “Certificate” in several example files.
  • Add spaces between array members.
  • Add initial set of Unit Tests (mostly Get-TargetResource tests) for all remaining resource files.
  • Add WaitForComputerObject parameter to xExchWaitForDAG
  • Add spaces between comment hashtags and comments.
  • Add space between variable types and variables.
  • Fixes issue where xExchMailboxDatabase fails to test for a Journal Recipient because the module did not load the Get-Recipient cmdlet (335).
  • Fixes broken Integration tests in MSFT_xExchMaintenanceMode.Integration.Tests.ps1 (336).
  • Fix issue where Get-ReceiveConnector against an Absent connector causes an error to be logged in the MSExchange Management log.
  • Rename poorly named functions in xExchangeDiskPart.psm1 and MSFT_xExchAutoMountPoint.psm1, and add comment based help.
xHyper-V 3.14.0.0
  • MSFT_xVMHost:
    • Added support to Enable / Disable VM Live Migration. Fixes Issue 155.

How to Find Released DSC Resource Modules

To see a list of all released DSC Resource Kit modules, go to the PowerShell Gallery and display all modules tagged as DSCResourceKit. You can also enter a module’s name in the search box in the upper right corner of the PowerShell Gallery to find a specific module.

Of course, you can also always use PowerShellGet (available starting in WMF 5.0) to find modules with DSC Resources:

# To list all modules that tagged as DSCResourceKit
Find-Module -Tag DSCResourceKit 
# To list all DSC resources from all sources 
Find-DscResource

Please note only those modules released by the PowerShell Team are currently considered part of the ‘DSC Resource Kit’ regardless of the presence of the ‘DSC Resource Kit’ tag in the PowerShell Gallery.

To find a specific module, go directly to its URL on the PowerShell Gallery:
http://www.powershellgallery.com/packages/< module name >
For example:
http://www.powershellgallery.com/packages/xWebAdministration

How to Install DSC Resource Modules From the PowerShell Gallery

We recommend that you use PowerShellGet to install DSC resource modules:

Install-Module -Name < module name >

For example:

Install-Module -Name xWebAdministration

To update all previously installed modules at once, open an elevated PowerShell prompt and use this command:

Update-Module

After installing modules, you can discover all DSC resources available to your local system with this command:

Get-DscResource

How to Find DSC Resource Modules on GitHub

All resource modules in the DSC Resource Kit are available open-source on GitHub.
You can see the most recent state of a resource module by visiting its GitHub page at:
https://github.com/PowerShell/< module name >
For example, for the CertificateDsc module, go to:
https://github.com/PowerShell/CertificateDsc.

All DSC modules are also listed as submodules of the DscResources repository in the DscResources folder and the xDscResources folder.

How to Contribute

You are more than welcome to contribute to the development of the DSC Resource Kit! There are several different ways you can help. You can create new DSC resources or modules, add test automation, improve documentation, fix existing issues, or open new ones.
See our contributing guide for more info on how to become a DSC Resource Kit contributor.

If you would like to help, please take a look at the list of open issues for the DscResources repository.
You can also check issues for specific resource modules by going to:
https://github.com/PowerShell/< module name >/issues
For example:
https://github.com/PowerShell/xPSDesiredStateConfiguration/issues

Your help in developing the DSC Resource Kit is invaluable to us!

Questions, comments?

If you’re looking into using PowerShell DSC, have questions or issues with a current resource, or would like a new resource, let us know in the comments below, on Twitter (@PowerShell_Team), or by creating an issue on GitHub.

Katie Kragenbrink
Software Engineer
PowerShell DSC Team
@katiedsc (Twitter)
@kwirkykat (GitHub)

DSC Resource Kit Release January 2019


We just released the DSC Resource Kit!

This release includes updates to 14 DSC resource modules. In the past 6 weeks, 41 pull requests have been merged and 54 issues have been closed, all thanks to our amazing community!

The modules updated in this release are:

  • ActiveDirectoryCSDsc
  • AuditPolicyDsc
  • CertificateDsc
  • ComputerManagementDsc
  • NetworkingDsc
  • SecurityPolicyDsc
  • SqlServerDsc
  • StorageDsc
  • xActiveDirectory
  • xBitlocker
  • xExchange
  • xFailOverCluster
  • xHyper-V
  • xWebAdministration

Several of these modules were released to remove the hidden files/folders described in this issue. This issue should now be fixed for all modules except DFSDsc, which is waiting for some fixes to its tests.

For a detailed list of the resource modules and fixes in this release, see the Included in this Release section below.

Our latest community call for the DSC Resource Kit was today, January 9. A recording is available on YouTube here. Join us for the next call at 12PM (Pacific time) on February 13 to ask questions and give feedback about your experience with the DSC Resource Kit.

The next DSC Resource Kit release will be on Wednesday, February 20.

We strongly encourage you to update to the newest version of all modules using the PowerShell Gallery, and don’t forget to give us your feedback in the comments below, on GitHub, or on Twitter (@PowerShell_Team)!

Please see our documentation here for information on the support of these resource modules.

Included in this Release

You can see a detailed summary of all changes included in this release in the table below. For past release notes, go to the README.md or CHANGELOG.md file on the GitHub repository page for a specific module (see the How to Find DSC Resource Modules on GitHub section below for details on finding the GitHub page for a specific module).

Module Name Version Release Notes
ActiveDirectoryCSDsc 3.1.0.0
  • Updated LICENSE file to match the Microsoft Open Source Team standard.
  • Added .VSCode settings for applying DSC PSSA rules – fixes Issue 60.
  • Added fix for two tier PKI deployment failing on initial deployment without error – fixes Issue 57.
AuditPolicyDsc 1.4.0.0
  • Explicitly removed extra hidden files from release package
CertificateDsc 4.3.0.0
  • Updated certificate import to only use Import-CertificateEx – fixes Issue 161
  • Update LICENSE file to match the Microsoft Open Source Team standard -fixes Issue 164.
  • Opted into Common Tests – fixes Issue 168:
    • Required Script Analyzer Rules
    • Flagged Script Analyzer Rules
    • New Error-Level Script Analyzer Rules
    • Custom Script Analyzer Rules
    • Validate Example Files To Be Published
    • Validate Markdown Links
    • Relative Path Length
  • CertificateExport:
    • Fixed bug causing PFX export with matchsource enabled to fail – fixes Issue 117
ComputerManagementDsc 6.1.0.0
  • Updated LICENSE file to match the Microsoft Open Source Team standard. Fixes Issue 197.
  • Explicitly removed extra hidden files from release package
NetworkingDsc 6.3.0.0
  • MSFT_IPAddress:
    • Updated to allow retaining existing addresses in order to support cluster configurations as well
SecurityPolicyDsc 2.7.0.0
  • Bug fix – Issue 83 – Network_access_Remotely_accessible_registry_paths_and_subpaths correctly applies multiple paths
  • Update LICENSE file to match the Microsoft Open Source Team standard
SqlServerDsc 12.2.0.0
  • Changes to SqlServerDsc
    • During testing in AppVeyor, the Build Worker is restarted in the install step to make sure there are no residual changes left from a previous SQL Server install on the Build Worker done by the AppVeyor Team (issue 1260).
    • Code cleanup: Change parameter names of Connect-SQL to align with resources.
    • Updated README.md in the Examples folder.
      • Added a link to the new xADObjectPermissionEntry examples in ActiveDirectory, fixed a broken link and a typo. Adam Rush (@adamrushuk)
    • Change to SqlServerLogin so it doesn’t check properties for absent logins.
StorageDsc 4.4.0.0
  • Refactored module folder structure to move resource to root folder of repository and remove test harness – fixes Issue 169.
  • Updated Examples to support deployment to PowerShell Gallery scripts.
  • Removed limitation on using Pester 4.0.8 during AppVeyor CI.
  • Moved the Code of Conduct text out of the README.md and into a CODE_OF_CONDUCT.md file.
  • Explicitly removed extra hidden files from release package
xActiveDirectory 2.23.0.0
  • Explicitly removed extra hidden files from release package
xBitlocker 1.4.0.0
  • Change double quoted string literals to single quotes
  • Add spaces between array members
  • Add spaces between variable types and variable names
  • Add spaces between comment hashtag and comments
  • Explicitly removed extra hidden files from release package
xExchange 1.26.0.0
  • Add support for Exchange Server 2019
  • Added additional parameters to the MSFT_xExchUMService resource
  • Rename improperly named functions, and add comment based help in MSFT_xExchClientAccessServer, MSFT_xExchDatabaseAvailabilityGroupNetwork, MSFT_xExchEcpVirtualDirectory, MSFT_xExchExchangeCertificate, MSFT_xExchImapSettings.
  • Added additional parameters to the MSFT_xExchUMCallRouterSettings resource
  • Rename improper function names in MSFT_xExchDatabaseAvailabilityGroup, MSFT_xExchJetstress, MSFT_xExchJetstressCleanup, MSFT_xExchMailboxDatabase, MSFT_xExchMailboxDatabaseCopy, MSFT_xExchMailboxServer, MSFT_xExchMaintenanceMode, MSFT_xExchMapiVirtualDirectory, MSFT_xExchOabVirtualDirectory, MSFT_xExchOutlookAnywhere, MSFT_xExchOwaVirtualDirectory, MSFT_xExchPopSettings, MSFT_xExchPowershellVirtualDirectory, MSFT_xExchReceiveConnector, MSFT_xExchWaitForMailboxDatabase, and MSFT_xExchWebServicesVirtualDirectory.
  • Add remaining unit and integration tests for MSFT_xExchExchangeServer.
xFailOverCluster 1.12.0.0
  • Explicitly removed extra hidden files from release package
xHyper-V 3.15.0.0
  • Explicitly removed extra hidden files from release package
xWebAdministration 2.4.0.0
  • Explicitly removed extra hidden files from release package

How to Find Released DSC Resource Modules

To see a list of all released DSC Resource Kit modules, go to the PowerShell Gallery and display all modules tagged as DSCResourceKit. You can also enter a module’s name in the search box in the upper right corner of the PowerShell Gallery to find a specific module.

Of course, you can also always use PowerShellGet (available starting in WMF 5.0) to find modules with DSC Resources:

# To list all modules that tagged as DSCResourceKit
Find-Module -Tag DSCResourceKit 
# To list all DSC resources from all sources 
Find-DscResource

Please note only those modules released by the PowerShell Team are currently considered part of the ‘DSC Resource Kit’ regardless of the presence of the ‘DSC Resource Kit’ tag in the PowerShell Gallery.

To find a specific module, go directly to its URL on the PowerShell Gallery:
http://www.powershellgallery.com/packages/< module name >
For example:
http://www.powershellgallery.com/packages/xWebAdministration

How to Install DSC Resource Modules From the PowerShell Gallery

We recommend that you use PowerShellGet to install DSC resource modules:

Install-Module -Name < module name >

For example:

Install-Module -Name xWebAdministration

To update all previously installed modules at once, open an elevated PowerShell prompt and use this command:

Update-Module

After installing modules, you can discover all DSC resources available to your local system with this command:

Get-DscResource

How to Find DSC Resource Modules on GitHub

All resource modules in the DSC Resource Kit are available open-source on GitHub.
You can see the most recent state of a resource module by visiting its GitHub page at:
https://github.com/PowerShell/< module name >
For example, for the CertificateDsc module, go to:
https://github.com/PowerShell/CertificateDsc.

All DSC modules are also listed as submodules of the DscResources repository in the DscResources folder and the xDscResources folder.

How to Contribute

You are more than welcome to contribute to the development of the DSC Resource Kit! There are several different ways you can help. You can create new DSC resources or modules, add test automation, improve documentation, fix existing issues, or open new ones.
See our contributing guide for more info on how to become a DSC Resource Kit contributor.

If you would like to help, please take a look at the list of open issues for the DscResources repository.
You can also check issues for specific resource modules by going to:
https://github.com/PowerShell/< module name >/issues
For example:
https://github.com/PowerShell/xPSDesiredStateConfiguration/issues

Your help in developing the DSC Resource Kit is invaluable to us!

Questions, comments?

If you’re looking into using PowerShell DSC, have questions or issues with a current resource, or would like a new resource, let us know in the comments below, on Twitter (@PowerShell_Team), or by creating an issue on GitHub.

Katie Kragenbrink
Software Engineer
PowerShell DSC Team
@katiedsc (Twitter)
@kwirkykat (GitHub)

Windows Security change affecting PowerShell


January 9, 2019

The recent (1/8/2019) Windows security patch CVE-2019-0543 has introduced a breaking change for a PowerShell remoting scenario. It is a narrowly scoped scenario that should have low impact for most users.

The breaking change only affects local loopback remoting, which is a PowerShell remote connection made back to the same machine, while using non-Administrator credentials.

PowerShell remoting endpoints do not allow access to non-Administrator accounts by default. However, it is possible to modify endpoint configurations, or create new custom endpoint configurations, that do allow non-Administrator account access. So you would not be affected by this change, unless you explicitly set up loopback endpoints on your machine to allow non-Administrator account access.

Example of broken loopback scenario

# Create endpoint that allows Users group access
PS > Register-PSSessionConfiguration -Name MyNonAdmin -SecurityDescriptorSddl 'O:NSG:BAD:P(A;;GA;;;BA)(A;;GA;;;BU)S:P(AU;FA;GA;;;WD)(AU;SA;GXGW;;;WD)' -Force

# Create non-Admin credential
PS > $nonAdminCred = Get-Credential ~\NonAdminUser

# Create a loopback remote session to custom endpoint using non-Admin credential
PS > $session = New-PSSession -ComputerName localhost -ConfigurationName MyNonAdmin -Credential $nonAdminCred

New-PSSession : [localhost] Connecting to remote server localhost failed with the following error message : The WSMan
service could not launch a host process to process the given request.  Make sure the WSMan provider host server and
proxy are properly registered. For more information, see the about_Remote_Troubleshooting Help topic.
At line:1 char:1
+ New-PSSession -ComputerName localhost -ConfigurationName MyNonAdmin - ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : OpenError: (System.Manageme....RemoteRunspace:RemoteRunspace) [New-PSSession], PSRemotin
   gTransportException
    + FullyQualifiedErrorId : -2146959355,PSSessionOpenFailed

The above example fails only when using non-Administrator credentials, and the connection is made back to the same machine (localhost). Administrator credentials still work. And the above scenario will work when remoting off-box to another machine.

Example of working loopback scenario

# Create Admin credential
PS > $adminCred = Get-Credential ~\AdminUser

# Create a loopback remote session to custom endpoint using Admin credential
PS > $session = New-PSSession -ComputerName localhost -ConfigurationName MyNonAdmin -Credential $adminCred
PS > $session

 Id Name            ComputerName    ComputerType    State         ConfigurationName     Availability
 -- ----            ------------    ------------    -----         -----------------     ------------
  1 WinRM1          localhost       RemoteMachine   Opened        MyNonAdmin               Available

The above example uses Administrator credentials to the same MyNonAdmin custom endpoint, and the connection is made back to the same machine (localhost). The session is created successfully using Administrator credentials.

The breaking change is not in PowerShell but in a system security fix that restricts process creation between Windows sessions. This fix is preventing WinRM (which PowerShell uses as a remoting transport and host) from successfully creating the remote session host, for this particular scenario. There are no plans to update WinRM.

This affects Windows PowerShell and PowerShell Core 6 (PSCore6) WinRM based remoting.

This does not affect SSH remoting with PSCore6.

This does not affect JEA (Just Enough Administration) sessions.

A workaround for a loopback connection is to always use Administrator credentials.

Another option is to use PSCore6 with SSH remoting.
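
A sketch of that SSH alternative (it assumes PSCore6 with SSH remoting already configured on the machine):

# SSH-based remoting does not use WinRM, so the loopback restriction does not apply
$session = New-PSSession -HostName localhost -UserName nonAdminUser
Invoke-Command -Session $session -ScriptBlock { whoami }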

Paul Higinbotham
Senior Software Engineer
PowerShell Team

Parsing Text with PowerShell (1/3)


This is the first post in a three-part series.

  • Part 1:
    • Useful methods on the String class
    • Introduction to Regular Expressions
    • The Select-String cmdlet
  • Part 2:
    • The -split operator
    • The -match operator
    • The switch statement
    • The Regex class
  • Part 3:
    • A real world, complete and slightly bigger, example of a switch-based parser

A task that appears regularly in my workflow is text parsing. It may be about getting a token from a single line of text or about turning the text output of native tools into structured objects so I can leverage the power of PowerShell.

I always strive to create structure as early as I can in the pipeline, so that later on I can reason about the content as properties on objects instead of as text at some offset in a string. This also helps with sorting, since the properties can have their correct type, so that numbers, dates etc. are sorted as such and not as text.

There are a number of options available to a PowerShell user, and I’m giving an overview here of the most common ones.

This is not a text about how to create a high performance parser for a language with a structured EBNF grammar. There are better tools available for that, for example ANTLR.

.Net methods on the string class

Any treatment of string parsing in PowerShell would be incomplete if it didn’t mention the methods on the string class.
There are a few methods that I’m using more often than others when parsing strings:

Name Description
Substring(int startIndex) Retrieves a substring from this instance. The substring starts at a specified character position and continues to the end of the string.
Substring(int startIndex, int length) Retrieves a substring from this instance. The substring starts at a specified character position and has a specified length.
IndexOf(string value) Reports the zero-based index of the first occurrence of the specified string in this instance.
IndexOf(string value, int startIndex) Reports the zero-based index of the first occurrence of the specified string in this instance. The search starts at a specified character position.
LastIndexOf(string value) Reports the zero-based index of the last occurrence of the specified string in this instance. Often used together with Substring.
LastIndexOf(string value, int startIndex) Reports the zero-based index position of the last occurrence of a specified string within this instance. The search starts at a specified character position and proceeds backward toward the beginning of the string.

This is a minor subset of the available functions. It may be well worth your time to read up on the string class since it is so fundamental in PowerShell.
Docs are found here.

As an example, this can be useful when we have very large input data of comma-separated input with 15 columns and we are only interested in the third column from the end. If we were to use the -split ',' operator, we would create 15 new strings and an array for each line. On the other hand, using LastIndexOf on the input string a few times and then SubString to get the value of interest is faster and results in just one new string.

function parseThirdFromEnd([string]$line){
    $i = $line.LastIndexOf(",")             # get the last separator
    $i = $line.LastIndexOf(",", $i - 1)     # get the second to last separator, also the end of the column we are interested in
    $j = $line.LastIndexOf(",", $i - 1)     # get the separator before the column we want
    $j++                                    # move forward past the separator
    $line.SubString($j,$i-$j)               # get the text of the column we are looking for
}
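
For example (a made-up input line):

parseThirdFromEnd "a,b,c,d,e,f"   # returns 'd', the third column from the end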

In this sample, I ignore that IndexOf and LastIndexOf return -1 if they cannot find the text to search for. From experience, I also know that it is easy to mess up the index arithmetic.
So while using these methods can improve performance, they are also more error prone and a lot more to type. I would only resort to this when I know the input data is very large and performance is an issue. So this is not a recommendation, or a starting point, but something to resort to.
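
If you do go down this route, a slightly more defensive variant (my sketch, not from the original function) validates the return values before slicing:

function parseThirdFromEndSafe([string]$line){
    $i = $line.LastIndexOf(",")
    if ($i -lt 0) { return $null }          # no separator at all
    $i = $line.LastIndexOf(",", $i - 1)
    if ($i -le 0) { return $null }          # no complete third-from-end column
    $j = $line.LastIndexOf(",", $i - 1)     # -1 is fine here: the column is then the first one
    $line.Substring($j + 1, $i - $j - 1)
}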

On rare occasions, I write the whole parser in C#. An example of this is in a module wrapping the Perforce version control system, where the command line tool can output python dictionaries. It is a binary format, and the use case was complicated enough that I was more comfortable with a compiler checked implementation language.

Regular Expressions

Almost all of the parsing options in PowerShell make use of regular expressions, so I will start with a short intro of some regular expression concepts that are used later in these posts.

Regular expressions are very useful to know when writing simple parsers since they allow us to express patterns of interest and to capture text that matches those patterns.

It is a very rich language, but you can get quite a long way by learning a few key parts. I’ve found regular-expressions.info to be a good online resource for more information. It is not written directly for the .NET regex implementation, but most of the information is valid across the different implementations.

Regex Description
* Zero or more of the preceding character. a* matches the empty string, a, aa, etc, but not b.
+ One or more of the preceding character. a+ matches a, aa, etc, but not the empty string or b.
. Matches any character
[ax1] Any of a,x,1
[a-d] Any of a,b,c,d (a character range)
\w The \w meta character is used to find a word character. A word character is a character from a-z, A-Z, 0-9, including the _ (underscore) character. It also matches Unicode variants of these characters, such as accented letters.
\W The inversion of \w. Matches any non-word character
\s The \s meta character is used to find white space
\S The inversion of \s. Matches any non-whitespace character
\d Matches digits
\D The inversion of \d. Matches non-digits
\b Matches a word boundary, that is, the position between a word and a space.
\B The inversion of \b. er\B matches the er in verb but not the er in never.
^ The beginning of a line
$ The end of a line
(<expr>) Capture groups

Combining these, we can create a pattern like the one below to match text like this:

Text Pattern
" 42,Answer" ^\s+\d+,.+

The above pattern can be written like this using the x (ignore pattern whitespace) modifier.

Starting the regex with (?x) ignores whitespace in the pattern (it has to be specified explicitly, with \s) and also enables the comment character #.

(?x)  # this regex ignores whitespace in the pattern. Makes it possible to document a regex with comments.
^     # the start of the line
\s+   # one or more whitespace character
\d+   # one or more digits
,     # a comma
.+    # one or more characters of any kind
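
A quick way to verify the pattern is to use the .NET Regex class directly (the -match operator is covered in part 2):

[regex]::IsMatch(' 42,Answer', '(?x) ^ \s+ \d+ , .+')   # True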

By using capture groups, we make it possible to refer back to specific parts of a matched expression.

Text Pattern
" 42,Answer" ^\s+(\d+),(.+)
(?x)  # this regex ignores whitespace in the pattern. Makes it possible to document a regex with comments.
^     # the start of the line
\s+   # one or more whitespace character
(\d+) # capture one or more digits in the first group (index 1)
,     # a comma
(.+)  # capture one or more characters of any kind in the second group (index 2)

Naming regular expression groups

There is a construct called named capturing groups, (?<group_name>pattern), that will create a capture group with a designated name.

The regex above can be rewritten like this, which allows us to refer to the capture groups by name instead of by index.

^\s+(?<num>\d+),(?<text>.+)

Different languages have implementation-specific ways of accessing the values of the captured groups. We will see later on in this series how it is done in PowerShell.
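
For reference, the named groups can be read with the .NET Regex class directly (the PowerShell-native ways follow later in the series):

$m = [regex]::Match(' 42,Answer', '^\s+(?<num>\d+),(?<text>.+)')
$m.Groups['num'].Value    # 42
$m.Groups['text'].Value   # Answer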

The Select-String cmdlet

The Select-String command is a work horse, and is very powerful when you understand the output it produces.
I use it mainly when searching for text in files, but occasionally also when looking for something in command output and similar.

The key to being efficient with Select-String is to know how to get to the matched patterns in the output. In its internals, it uses the same regex class as the -match and -split operator, but instead of populating a global variable with the resulting groups, as -match does, it writes an object to the pipeline, with a Matches property that contains the results of the match.

Set-Content twitterData.txt -value @"
Lee, Steve-@Steve_MSFT,2992
Lee Holmes-13000 @Lee_Holmes
Staffan Gustafsson-463 @StaffanGson
Tribbiani, Joey-@Matt_LeBlanc,463400
"@

# extracting captured groups
Get-ChildItem twitterData.txt |
    Select-String -Pattern "^(\w+) ([^-]+)-(\d+) (@\w+)" |
    Foreach-Object {
        $first, $last, $followers, $handle = $_.Matches[0].Groups[1..4].Value   # this is a common way of getting the groups of a call to select-string
        [PSCustomObject] @{
            FirstName = $first
            LastName = $last
            Handle = $handle
            TwitterFollowers = [int] $followers
        }
    }
FirstName LastName   Handle       TwitterFollowers
--------- --------   ------       ----------------
Lee       Holmes     @Lee_Holmes             13000
Staffan   Gustafsson @StaffanGson              463

Support for Multiple Patterns

As we can see above, only half of the data matched the pattern given to Select-String.

A technique that I find useful is to take advantage of the fact that Select-String supports the use of multiple patterns.

The lines of input data in twitterData.txt contain the same type of information, but they’re formatted in slightly different ways.
Using multiple patterns in combination with named capture groups makes it a breeze to extract the groups even when the positions of the groups differ.

$firstLastPattern = "^(?<first>\w+) (?<last>[^-]+)-(?<followers>\d+) (?<handle>@.+)"
$lastFirstPattern = "^(?<last>[^\s,]+),\s+(?<first>[^-]+)-(?<handle>@[^,]+),(?<followers>\d+)"
Get-ChildItem twitterData.txt |
     Select-String -Pattern $firstLastPattern, $lastFirstPattern |
    Foreach-Object {
        # here we access the groups by name instead of by index
        $first, $last, $followers, $handle = $_.Matches[0].Groups['first', 'last', 'followers', 'handle'].Value
        [PSCustomObject] @{
            FirstName = $first
            LastName = $last
            Handle = $handle
            TwitterFollowers = [int] $followers
        }
    }
FirstName LastName   Handle        TwitterFollowers
--------- --------   ------        ----------------
Steve     Lee        @Steve_MSFT               2992
Lee       Holmes     @Lee_Holmes              13000
Staffan   Gustafsson @StaffanGson               463
Joey      Tribbiani  @Matt_LeBlanc           463400

Breaking down the $firstLastPattern gives us

(?x)                # this regex ignores whitespace in the pattern. Makes it possible to document a regex with comments.
^                   # the start of the line
(?<first>\w+)       # capture one or more of any word characters into a group named 'first'
\s                  # a space
(?<last>[^-]+)      # capture one of more of any characters but `-` into a group named 'last'
-                   # a '-'
(?<followers>\d+)   # capture 1 or more digits into a group named 'followers'
\s                  # a space
(?<handle>@.+)      # capture a '@' followed by one or more characters into a group named 'handle'

The second regex is similar, but with the groups in different order. But since we retrieve the groups by name, we don’t have to care about the positions of the capture groups, and multiple assignment works fine.

Context around Matches

Select-String also has a Context parameter which accepts an array of one or two numbers specifying the number of lines before and after a match that should be captured. All text parsing techniques in this post can be used to parse information from the context lines.
The result object has a Context property, that returns an object with PreContext and PostContext properties, both of the type string[].

This can be used to get the second line before a match:

# using the context property
Get-ChildItem twitterData.txt |
    Select-String -Pattern "Staffan" -Context 2,1 |
    Foreach-Object { $_.Context.PreContext[1], $_.Context.PostContext[0] }
Lee Holmes-13000 @Lee_Holmes
Tribbiani, Joey-@Matt_LeBlanc,463400

To understand the indexing of the Pre- and PostContext arrays, consider the following:

Lee, Steve-@Steve_MSFT,2992                  <- PreContext[0]
Lee Holmes-13000 @Lee_Holmes                 <- PreContext[1]
Staffan Gustafsson-463 @StaffanGson          <- Pattern matched this line
Tribbiani, Joey-@Matt_LeBlanc,463400         <- PostContext[0]

The pipeline support of Select-String makes it different from the other parsing tools available in PowerShell, and makes it the undisputed king of one-liners.

I would like to stress how much more useful Select-String becomes once you understand how to get to the parts of the matches.

Summary

We have looked at useful methods of the string class, especially how to use Substring to get to text at a specific offset. We also looked at regular expression, a language used to describe patterns in text, and on the Select-String cmdlet, which makes heavy use of regular expression.

Next time, we will look at the operators -split and -match, the switch statement (which is surprisingly useful for text parsing), and the regex class.

Staffan Gustafsson, @StaffanGson, github

Thanks to Jason Shirk, Mathias Jessen and Steve Lee for reviews and feedback.

Announcing the PowerShell Preview Extension in VSCode


Preview builds of the PowerShell extension are now available in VSCode

We are excited to announce the PowerShell Preview extension in the VSCode marketplace!
The PowerShell Preview extension allows users on Windows PowerShell 5.1, PowerShell 6.0, and all newer versions to get and test the latest updates to the PowerShell extension, and it comes with some exciting features. The PowerShell Preview extension is a substitute for the stable PowerShell extension, so the two extensions should not be enabled at the same time.

Features of the PowerShell Preview extension

The PowerShell Preview extension is built on .NET Standard, which simplifies our code and dependency structure.

The PowerShell Preview extension also contains PSReadLine support in the integrated console for Windows behind a
feature flag. PSReadLine provides a consistent and rich interactive experience, including syntax coloring and
multi-line editing and history, in the PowerShell console, in Cloud Shell, and now in VSCode terminal.
For more information on the benefits of PSReadLine, check out their documentation.

To enable PSReadLine support in the Preview version on Windows, please add the following to your user settings:

"powershell.developer.featureFlags": [ "PSReadLine" ]

HUGE thanks to @SeeminglyScience for all his amazing work getting PSReadLine working in PowerShell Editor Services!

Why we created the PowerShell Preview extension

By having a preview channel, which supports Windows PowerShell 5.1 and PowerShell Core 6, in addition to our existing stable channel, we can get new features out faster. PSReadLine support for the VSCode integrated console is a great
example of a feature that the preview build makes possible. Having a preview channel also allows us to get more feedback
on new features and to iterate on changes before they arrive in our stable channel.

How to Get/Use the PowerShell Preview extension

If you don’t already have VSCode, start here.

Once you have VSCode open, press Ctrl+Shift+X to open the extensions marketplace.
Next, type PowerShell Preview in the search bar.
Click Install on the PowerShell Preview page.
Finally, click Reload to refresh VSCode.
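
Alternatively, you can install it from a shell (a sketch; it assumes code is on your PATH and that the marketplace ID is ms-vscode.powershell-preview):

code --install-extension ms-vscode.powershell-preview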

If you already have the PowerShell extension, please disable it to use the PowerShell Preview extension.
To disable the PowerShell extension, find it in the Extensions sidebar view, specifically under the list of Enabled extensions, right-click on it, and select Disable. Please note that it is important to have only one of the PowerShell extension and the PowerShell Preview extension enabled at a time.

Breaking Changes

As stated above, this version of the PowerShell extension only works with Windows PowerShell versions 5.1 and
PowerShell Core 6.

Reporting Feedback

An important benefit of a preview extension is getting feedback from users.
To report issues with the extension use our GitHub repo.
When reporting issues be sure to specify the version of the extension you are using.

Sydney Smith
Program Manager
PowerShell Team

The PowerShell-Docs repositories have been moved


The PowerShell-Docs repositories have been moved from the PowerShell organization to the MicrosoftDocs organization in GitHub.

The tools we use to build the documentation are designed to work in the MicrosoftDocs org. Moving the repository lets us build the foundation for future improvements in our documentation experience.

Impact of the move

During the move there may be some downtime. The affected repositories will be inaccessible during the move process, and the documentation processes will be paused. After the move, we need to test access permissions and automation scripts. After these tasks are complete, access and operations will return to normal. GitHub automatically redirects requests for the old repo URL to the new location. For more information about transferring repositories in GitHub, see About repository transfers.

If the transferred repository has any forks, then those forks will remain associated with the
repository after the transfer is complete.
  • All Git information about commits, including contributions, are preserved.
  • All of the issues and pull requests remain intact when transferring a repository.
  • All links to the previous repository location are automatically redirected to the new location.


When you use git clone, git fetch, or git push on a transferred repository, these commands will redirect to the new repository location or URL.

However, to avoid confusion, we strongly recommend updating any existing local clones to point to
the new repository URL.
For more information, see Changing a remote’s URL.

The following example shows how to change the “upstream” remote to point to the new location:

[Wed 06:08PM] [staging =]
PS C:\Git\PS-Docs\PowerShell-Docs> git remote -v
origin  https://github.com/sdwheeler/PowerShell-Docs.git (fetch)
origin  https://github.com/sdwheeler/PowerShell-Docs.git (push)
upstream        https://github.com/PowerShell/PowerShell-Docs.git (fetch)
upstream        https://github.com/PowerShell/PowerShell-Docs.git (push)

[Wed 06:09PM] [staging =]
PS C:\Git\PS-Docs\PowerShell-Docs> git remote set-url upstream https://github.com/MicrosoftDocs/PowerShell-Docs.git

[Wed 06:10PM] [staging =]
PS C:\Git\PS-Docs\PowerShell-Docs> git remote -v
origin  https://github.com/sdwheeler/PowerShell-Docs.git (fetch)
origin  https://github.com/sdwheeler/PowerShell-Docs.git (push)
upstream        https://github.com/MicrosoftDocs/PowerShell-Docs.git (fetch)
upstream        https://github.com/MicrosoftDocs/PowerShell-Docs.git (push)


Which repositories were moved?

The following repositories were transferred:

  • PowerShell/PowerShell-Docs
  • PowerShell/powerShell-Docs.cs-cz
  • PowerShell/powerShell-Docs.de-de
  • PowerShell/powerShell-Docs.es-es
  • PowerShell/powerShell-Docs.fr-fr
  • PowerShell/powerShell-Docs.hu-hu
  • PowerShell/powerShell-Docs.it-it
  • PowerShell/powerShell-Docs.ja-jp
  • PowerShell/powerShell-Docs.ko-kr
  • PowerShell/powerShell-Docs.nl-nl
  • PowerShell/powerShell-Docs.pl-pl
  • PowerShell/powerShell-Docs.pt-br
  • PowerShell/powerShell-Docs.pt-pt
  • PowerShell/powerShell-Docs.ru-ru
  • PowerShell/powerShell-Docs.sv-se
  • PowerShell/powerShell-Docs.tr-tr
  • PowerShell/powerShell-Docs.zh-cn
  • PowerShell/powerShell-Docs.zh-tw

Call to action

If you have a fork that you cloned, change your remote configuration to point to the new upstream URL.
Help us make the documentation better.
  • Submit issues when you find a problem in the docs.
  • Suggest fixes to documentation by submitting changes through the PR process.
Sean Wheeler
Senior Content Developer for PowerShell
https://github.com/sdwheeler

Parsing Text with PowerShell (2/3)

This is the second post in a three-part series.

  • Part 1:
    • Useful methods on the String class
    • Introduction to Regular Expressions
    • The Select-String cmdlet
  • Part 2:
    • the -split operator
    • the -match operator
    • the switch statement
    • the Regex class
  • Part 3:
    • a real world, complete and slightly bigger, example of a switch-based parser

The -split operator

The -split operator splits one or more strings into substrings.

The first example is a name-value pattern, which is a common parsing task. Note the use of the Max-substrings argument to the -split operator.
We want to ensure that it doesn’t matter if the value contains the character to split on.

$text = "Description=The '=' character is used for assigning values to a variable"
$name, $value = $text -split "=", 2

@"
Name  =  $name
Value =  $value
"@
Name  =  Description
Value =  The '=' character is used for assigning values to a variable

When the line to parse contains fields separated by a well-known separator that is never part of the field values, we can use the -split operator in combination with multiple assignment to get the fields into variables.

$name, $location, $occupation = "Spider Man,New York,Super Hero" -split ','

If only the location is of interest, the unwanted items can be assigned to $null.

$null, $location, $null = "Spider Man,New York,Super Hero" -split ','

$location
New York

If there are many fields, assigning to null doesn’t scale well. Indexing can be used instead, to get the fields of interest.

$inputText = "x,Staffan,x,x,x,x,x,x,x,x,x,x,Stockholm,x,x,x,x,x,x,x,x,11,x,x,x,x"
$name, $location, $age = ($inputText -split ',')[1,12,21]

$name
$location
$age
Staffan
Stockholm
11

It is almost always a good idea to create an object that gives context to the different parts.

$inputText = "x,Steve,x,x,x,x,x,x,x,x,x,x,Seattle,x,x,x,x,x,x,x,x,22,x,x,x,x"
$name, $location, $age = ($inputText -split ',')[1,12,21]
[PSCustomObject] @{
    Name = $name
    Location = $location
    Age = [int] $age
}
Name  Location Age
----  -------- ---
Steve Seattle   22

Instead of creating a PSCustomObject, we can create a class. It’s a bit more to type, but we can get more help from the engine, for example with tab completion.

The example below also shows a case of type conversion where the default string-to-number conversion doesn’t work.
The Age field is handled by PowerShell’s built-in type conversion: it is of type [int], and PowerShell handles the conversion from string to int.
But in some cases we need to help out a bit. The ShoeSize field is also an [int], but the data is hexadecimal,
and without the hex specifier (‘0x’), this conversion fails for some values and provides incorrect results for the others.

class PowerSheller {
    [string] $Name
    [string] $Location
    [int] $Age
    [int] $ShoeSize
}

$inputText = "x,Staffan,x,x,x,x,x,x,x,x,x,x,Stockholm,x,x,x,x,x,x,x,x,33,x,11d,x,x"
$name, $location, $age, $shoeSize = ($inputText -split ',')[1,12,21,23]
[PowerSheller] @{
    Name = $name
    Location = $location
    Age = $age
    # ShoeSize is expressed in hex, with no '0x' because reasons :)
    # And yes, it's in millimeters.
    ShoeSize = [Convert]::ToInt32($shoeSize, 16)
}
Name    Location  Age ShoeSize
----    --------  --- --------
Staffan Stockholm  33      285

The -split operator’s first argument is actually a regex by default (this can be changed with options).
I use this on long command lines in log files (like those given to compilers), where there can be hundreds of options specified. That makes it hard to see whether a certain option is specified or not, but when the options are split onto their own lines, it becomes trivial.
The pattern below uses a positive lookahead assertion.
It can be very useful to make patterns match only in a given context, such as whether they are, or are not, preceded or followed by another pattern.

$cmdline = "cl.exe /D Bar=1 /I SomePath /D Foo  /O2 /I SomeOtherPath /Debug a1.cpp a3.cpp a2.cpp"

$cmdline -split "\s+(?=[-/])"
cl.exe
/D Bar=1
/I SomePath
/D Foo
/O2
/I SomeOtherPath
/Debug a1.cpp a3.cpp a2.cpp

Breaking down the regex by rewriting it with the x option:

(?x)      # ignore whitespace in the pattern, and enable comments after '#'
\s+       # one or more spaces
(?=[-/])  # only match the previous spaces if they are followed by any of '-' or '/'.

Splitting with a scriptblock

The -split operator also comes in another form, where you can pass it a scriptblock instead of a regular expression.
This allows for more complicated logic, that can be hard or impossible to express as a regular expression.

The scriptblock accepts two parameters, the text to split and the current index. $_ is bound to the character at the current index.

function SplitWhitespaceInMiddleOfText {
    param(
        [string]$Text,
        [int] $Index
    )
    if ($Index -lt 10 -or $Index -gt 40){
        return $false
    }
    $_ -match '\s'
}

$inputText = "Some text that only needs splitting in the middle of the text"
$inputText -split $function:SplitWhitespaceInMiddleOfText
Some text that
only
needs
splitting
in
the middle of the text

The $function:SplitWhitespaceInMiddleOfText syntax is a way to get the content (the scriptblock that implements it) of the function, just as $env:UserName gets the content of an item in the env: drive.
It provides a way to document and/or reuse the scriptblock.
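
When reuse isn’t needed, the scriptblock can also be written inline. A minimal sketch, equivalent to the function above:

$inputText -split { param($Text, $Index) $Index -ge 10 -and $Index -le 40 -and $_ -match '\s' }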

The -match operator

The -match operator works in conjunction with the $matches automatic variable. Each time a -match or a -notmatch succeeds, the $matches variable is populated so that each capture group gets its own entry. If the capture group is named, the key will be the name of the group, otherwise it will be the index.

As an example:

if ('a b c' -match '(\w) (?<named>\w) (\w)'){
    $matches
}
Name                           Value
----                           -----
named                          b
2                              c
1                              a
0                              a b c

Notice that numeric indices are only assigned to the unnamed groups; that is, naming a group shifts the indices of the unnamed groups that follow it.

Armed with the regex knowledge from the earlier post, we can write the following:

PS> "    10,Some text" -match '^\s+(\d+),(.+)'
True
PS> $matches
Name                           Value
----                           -----
2                              Some text
1                              10
0                              10,Some text

or with named groups

PS> "    10,Some text" -match '^\s+(?<num>\d+),(?<text>.+)'
True
PS> $matches
Name                           Value
----                           -----
num                            10
text                           Some text
0                              10,Some text

The important thing here is to put parentheses around the parts of the pattern that we want to extract. That is what creates the capture groups that allow us to reference those parts of the matching text, either by name or by index.

Combining this into a function makes it easy to use:

function ParseMyString($text){
    if ($text -match '^\s+(\d+),(.+)') {
        [PSCustomObject] @{
            Number = [int] $matches[1]
            Text    = $matches[2]
        }
    }
    else {
        Write-Warning "ParseMyString: Input `$text` doesn't match pattern"
    }
}

ParseMyString "    10,Some text"
Number  Text
------- ----
     10 Some text

Notice the type conversion when assigning the Number property. As long as the number is in range of an integer, this will always succeed, since we have made a successful match in the if statement above. ([long] or [bigint] could be used. In this case I provide the input, and I have promised myself to stick to a range that fits in a 32-bit integer.)
Now we will be able to sort or do numerical operations on the Number property, and it will behave like we want it to – as a number, not as a string.
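
To see the difference, compare sorting the same values as strings and as numbers:

'10', '9', '100' | Sort-Object            # string sort: 10, 100, 9
[int[]] ('10', '9', '100') | Sort-Object  # numeric sort: 9, 10, 100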

The switch statement

Now we’re at the big guns 🙂

The switch statement in PowerShell has been given special functionality for parsing text.
It has two flags that are useful for parsing text and files containing text: -regex and -file.

When specifying -regex, the match clauses that are strings are treated as regular expressions. The switch statement also sets the $matches automatic variable.

When specifying -file, PowerShell treats the input as a file name to read input from, rather than as a value expression.
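
As a minimal sketch of the -file flag (assuming a file .\input.txt exists, with lines like the ones in the example below), the file is read line by line and each line is run through the clauses:

switch -regex -file .\input.txt {
    '^\s+(?<num>\d+),(?<text>.+)' { "number: $($matches.num), text: $($matches.text)"; continue }
    default { Write-Warning "Unexpected input: '$_'" }
}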

In the example below, note the use of a ScriptBlock instead of a string as the match clause to determine if we should skip preamble lines.

class ParsedOutput {
    [int] $Number
    [string] $Text

    [string] ToString() { return "{0} ({1})" -f $this.Text, $this.Number }
}

$inputData =
    "Preamble line",
    "LastLineOfPreamble",
    "    10,Some Text",
    "    Some other text,20"

$inPreamble = $true
switch -regex ($inputData) {

    {$inPreamble -and $_ -eq 'LastLineOfPreamble'} { $inPreamble = $false; continue }

    "^\s+(?<num>\d+),(?<text>.+)" {  # this matches the first line of non-preamble input
        [ParsedOutput] @{
            Number = $matches.num
            Text = $matches.text
        }
        continue
    }

    "^\s+(?<text>[^,]+),(?<num>\d+)" { # this matches the second line of non-preamble input
        [ParsedOutput] @{
            Number = $matches.num
            Text = $matches.text
        }
        continue
    }
}
Number  Text
------ ----
    10 Some Text
    20 Some other text

The pattern [^,]+ in the text group in the code above is useful. It means match anything that is not a comma ,. We are using the any-of construct [], and within those brackets, ^ changes meaning from the beginning of the line to anything but.

That is useful when we are matching delimited fields. A requirement is that the delimiter cannot be part of the set of allowed field values.
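
A minimal illustration: the match consumes the first field and stops at the comma, without the comma becoming part of the capture:

if ('Some other text,20' -match '^(?<text>[^,]+)') { $matches.text }
Some other text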

The regex class

regex is a type accelerator for System.Text.RegularExpressions.Regex. It can be useful when porting code from C#, and sometimes when we want more control in situations where we have many matches of a capture group. It also allows us to pre-create the regular expressions, which can matter in performance-sensitive scenarios, and to specify a timeout.
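
For example, a pre-created regex with the Compiled option and a match timeout might look like this (a minimal sketch; the pattern and timeout are arbitrary, and ::new() requires PowerShell 5 or later):

$pattern = [regex]::new('(\w,)+', [System.Text.RegularExpressions.RegexOptions]::Compiled, [timespan]::FromSeconds(1))
$pattern.IsMatch('a,b,c,')
True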

One instance where the regex class is needed is when you have multiple captures of a group.

Consider the following:

Text Pattern
a,b,c, (\w,)+

If the match operator is used, $matches will contain

Name                           Value
----                           -----
1                              c,
0                              a,b,c,

The pattern matched three times, for a,, b, and c,. However, only the last match is preserved in the $matches dictionary.
The following, on the other hand, allows us to get at all the captures of the group:

[regex]::match('a,b,c,', '(\w,)+').Groups[1].Captures
Index Length Value
----- ------ -----
    0      2 a,
    2      2 b,
    4      2 c,

Below is an example that uses the members of the Regex class to parse input data:

class ParsedOutput {
    [int] $Number
    [string] $Text

    [string] ToString() { return "{0} ({1})" -f $this.Text, $this.Number }
}

$inputData =
    "    10,Some Text",
    "    Some other text,20"  # this text will not match

[regex] $pattern = "^\s+(\d+),(.+)"

foreach($d in $inputData){
    $match = $pattern.Match($d)
    if ($match.Success){
        $number, $text = $match.Groups[1,2].Value
        [ParsedOutput] @{
            Number = $number
            Text = $text
        }
    }
    else {
        Write-Warning "regex: '$d' did not match pattern '$pattern'"
    }
}
WARNING: regex: '    Some other text,20' did not match pattern '^\s+(\d+),(.+)'
Number Text
------ ----
    10 Some Text

It may surprise you that the warning appears before the output. PowerShell has a quite complex formatting system at the end of the pipeline, which treats pipeline output differently from other streams. Among other things, it buffers output at the beginning of a pipeline to calculate sensible column widths. This works well in practice, but sometimes gives strange reordering of output on different streams.

Summary

In this post we have looked at how the -split operator can be used to split a string in parts, how the -match operator can be used to extract different patterns from some text, and how the powerful switch statement can be used to match against multiple patterns.

We ended by looking at the regex class, which in some cases provides a bit more control, but at the expense of ease of use. This concludes the second part of this series. Next time, we will look at a complete, real world example of a switch-based parser.

Thanks to Jason Shirk, Mathias Jessen and Steve Lee for reviews and feedback.

Staffan Gustafsson, @StaffanGson, powercode@github

Staffan works at DICE in Stockholm, Sweden, as a Software Engineer and has been using PowerShell since the first public beta.
He was most seriously pleased when PowerShell was open sourced, and has since contributed bug fixes, new features and performance improvements.
Staffan is a speaker at PSConfEU and is always happy to talk PowerShell.

Parsing Text with PowerShell (3/3)

This is the third and final post in a three-part series.

  • Part 1:
    • Useful methods on the String class
    • Introduction to Regular Expressions
    • The Select-String cmdlet
  • Part 2:
    • the -split operator
    • the -match operator
    • the switch statement
    • the Regex class
  • Part 3:
    • a real world, complete and slightly bigger, example of a switch-based parser
      • General structure of a switch-based parser
      • The real world example

In the previous posts, we looked at the different operators that are available to us in PowerShell.

When analyzing crashes at DICE, I noticed that some of the C++ runtime binaries were missing debug symbols. They should be available for download from Microsoft’s public symbol server, and most versions were there. However, due to some process errors at DevDiv, some builds were released publicly without available debug symbols.
In some cases, those missing symbols prevented us from debugging those crashes, and in all cases, they triggered my developer OCD.

So, to give actionable feedback to Microsoft, I scripted a debugger (cdb.exe in this case) to give a verbose list of the loaded modules, and parsed the output with PowerShell, which was also later used to group and filter the resulting data set. I sent this data to Microsoft, and 5 days later, the missing symbols were available for download. Mission accomplished!

This post will describe the parser I wrote for this task (it turned out that I had good use for it for other tasks later), and the general structure is applicable to most parsing tasks.

The example will show how a switch-based parser would look when the input data isn’t as tidy as it normally is in examples, but messy – as the real world data often is.

General Structure of a switch Based Parser

Depending on the structure of our input, the code must be organized in slightly different ways.

Input may have a record start that differs by indentation or some distinct token, like:

Foo                    <- Record start - No whitespace at the beginning of the line
    Prop1=Staffan      <- Properties for the record - starts with whitespace
    Prop3 =ValueN
Bar
    Prop1=Steve
    Prop2=ValueBar2

If the data to be parsed has an explicit start record, it is a bit easier than if it doesn’t have one.
We create a new data object when we get a record start, after writing any previously created object to the pipeline.
At the end, we need to check if we have parsed a record that hasn’t been written to the pipeline.

The general structure of such a switch-based parser can be as follows:

$inputData = @"
Foo
    Prop1=Value1
    Prop3=Value3
Bar
    Prop1=ValueBar1
    Prop2=ValueBar2
"@ -split '\r?\n'   # This regex is useful to split at line endings, with or without carriage return

class SomeDataClass {
    $ID
    $Name
    $Property2
    $Property3
}

# map to project input property names to the properties on our data class
$propertyNameMap = @{
    Prop1 = "Name"
    Prop2 = "Property2"
    Prop3 = "Property3"
}

$currentObject = $null
switch -regex ($inputData) {

    '^(\S.*)' {
        # record start pattern, in this case line that doesn't start with a whitespace.
        if ($null -ne $currentObject) {
            $currentObject                   # output to pipeline if we have a previous data object
        }
        $currentObject = [SomeDataClass] @{  # create new object for this record
            Id = $matches.1                  # with Id like Foo or Bar
        }
        continue
    }

    # set the properties on the data object
    '^\s+([^=]+)=(.*)' {
        $name, $value = $matches[1, 2]
        # project property names
        $propName = $propertyNameMap[$name]
        if ($null -eq $propName) {
            $propName = $name
        }
        # assign the parsed value to the projected property name
        $currentObject.$propName = $value
        continue
    }
}

if ($currentObject) {
    # Handle the last object if any
    $currentObject # output to pipeline
}
ID  Name      Property2 Property3
--  ----      --------- ---------
Foo Value1              Value3
Bar ValueBar1 ValueBar2

Alternatively, we may have input where the records are separated by a blank line, but without any obvious record start.

commitId=1234                         <- In this case, a commitId is first in a record
description=Update readme.md
                                      <- the blank line separates records
user=Staffan                          <- For this record, a user property comes first
commitId=1235
description=Fix bug.md

In this case the structure of the code looks a bit different. We create an object at the beginning, but keep track of whether it’s dirty or not.
If we get to the end with a dirty object, we must output it.

$inputData = @"

commit=1234
desc=Update readme.md

user=Staffan
commit=1235
desc=Bug fix

"@ -split "\r?\n"

class SomeDataClass {
    [int] $CommitId
    [string] $Description
    [string] $User
}

# map to project input property names to the properties on our data class
# we only need to provide the ones that are different. 'User' works fine as it is.
$propertyNameMap = @{
    commit = "CommitId"
    desc   = "Description"
}

$currentObject = [SomeDataClass]::new()
$objectDirty = $false
switch -regex ($inputData) {
    # set the properties on the data object
    '^([^=]+)=(.*)$' {
        # parse a name/value
        $name, $value = $matches[1, 2]
        # project property names
        $propName = $propertyNameMap[$name]
        if ($null -eq $propName) {
            $propName = $name
        }
        # assign the projected property
        $currentObject.$propName = $value
        $objectDirty = $true
        continue
    }

    '^\s*$' {
        # separator pattern, in this case any blank line
        if ($objectDirty) {
            $currentObject                           # output to pipeline
            $currentObject = [SomeDataClass]::new()  # create new object
            $objectDirty = $false                    # and mark it as not dirty
        }
    }
    default {
        Write-Warning "Unexpected input: '$_'"
    }
}

if ($objectDirty) {
    # Handle the last object if any
    $currentObject # output to pipeline
}
CommitId Description      User
-------- -----------      ----
    1234 Update readme.md
    1235 Bug fix          Staffan

The Real World Example

I have adapted this sample slightly so that I get the loaded modules from a running process instead of from my crash dumps. The format of the output from the debugger is the same.
The following command launches a command line debugger on notepad, with a script that gives a verbose listing of the loaded modules, and quits:

# we need to muck around with the console output encoding to handle the trademark chars
# imagine no encodings
# it's easy if you try
# no code pages below us
# above us only sky
[Console]::OutputEncoding = [System.Text.Encoding]::GetEncoding("iso-8859-1")

$proc = Start-Process notepad -PassThru
Start-Sleep -Seconds 1
$cdbOutput = cdb -y 'srv*c:\symbols*http://msdl.microsoft.com/download/symbols' -c ".reload -f;lmv;q" -p $proc.Id

The output of the command above is here for those who want to follow along but who aren’t running Windows or don’t have cdb.exe installed.

The (abbreviated) output looks like this:

Microsoft (R) Windows Debugger Version 10.0.16299.15 AMD64
Copyright (c) Microsoft Corporation. All rights reserved.

*** wait with pending attach

************* Path validation summary **************
Response                         Time (ms)     Location
Deferred                                       srv*c:\symbols*http://msdl.microsoft.com/download/symbols
Symbol search path is: srv*c:\symbols*http://msdl.microsoft.com/download/symbols
Executable search path is:
ModLoad: 00007ff6`e9da0000 00007ff6`e9de3000   C:\Windows\system32\notepad.exe
...
ModLoad: 00007ffe`97d80000 00007ffe`97db1000   C:\WINDOWS\SYSTEM32\ntmarta.dll
(98bc.40a0): Break instruction exception - code 80000003 (first chance)
ntdll!DbgBreakPoint:
00007ffe`9cd53050 cc              int     3
0:007> cdb: Reading initial command '.reload -f;lmv;q'
Reloading current modules
.....................................................
start             end                 module name
00007ff6`e9da0000 00007ff6`e9de3000   notepad    (pdb symbols)          c:\symbols\notepad.pdb\2352C62CDF448257FDBDDA4081A8F9081\notepad.pdb
    Loaded symbol image file: C:\Windows\system32\notepad.exe
    Image path: C:\Windows\system32\notepad.exe
    Image name: notepad.exe
    Image was built with /Brepro flag.
    Timestamp:        329A7791 (This is a reproducible build file hash, not a timestamp)
    CheckSum:         0004D15F
    ImageSize:        00043000
    File version:     10.0.17763.1
    Product version:  10.0.17763.1
    File flags:       0 (Mask 3F)
    File OS:          40004 NT Win32
    File type:        1.0 App
    File date:        00000000.00000000
    Translations:     0409.04b0
    CompanyName:      Microsoft Corporation
    ProductName:      Microsoft® Windows® Operating System
    InternalName:     Notepad
    OriginalFilename: NOTEPAD.EXE
    ProductVersion:   10.0.17763.1
    FileVersion:      10.0.17763.1 (WinBuild.160101.0800)
    FileDescription:  Notepad
    LegalCopyright:   © Microsoft Corporation. All rights reserved.
...
00007ffe`9ccb0000 00007ffe`9ce9d000   ntdll      (pdb symbols)          c:\symbols\ntdll.pdb\B8AD79538F2730FD9BACE36C9F9316A01\ntdll.pdb
    Loaded symbol image file: C:\WINDOWS\SYSTEM32\ntdll.dll
    Image path: C:\WINDOWS\SYSTEM32\ntdll.dll
    Image name: ntdll.dll
    Image was built with /Brepro flag.
    Timestamp:        E8B54827 (This is a reproducible build file hash, not a timestamp)
    CheckSum:         001F20D1
    ImageSize:        001ED000
    File version:     10.0.17763.194
    Product version:  10.0.17763.194
    File flags:       0 (Mask 3F)
    File OS:          40004 NT Win32
    File type:        2.0 Dll
    File date:        00000000.00000000
    Translations:     0409.04b0
    CompanyName:      Microsoft Corporation
    ProductName:      Microsoft® Windows® Operating System
    InternalName:     ntdll.dll
    OriginalFilename: ntdll.dll
    ProductVersion:   10.0.17763.194
    FileVersion:      10.0.17763.194 (WinBuild.160101.0800)
    FileDescription:  NT Layer DLL
    LegalCopyright:   © Microsoft Corporation. All rights reserved.
quit:

The output starts with info that I’m not interested in here. I only want to get the detailed information about the loaded modules. It is not until the line

start             end                 module name

that I care about the output.

Also, at the end there is a line that we need to be aware of:

quit:

that is not part of the module output.

To skip the parts of the debugger output that we don’t care about, we have a boolean flag initially set to true.
If that flag is set, we check if the current line, $_, is the module header in which case we flip the flag.

    $inPreamble = $true
    switch -regex ($cdbOutput) {

        { $inPreamble -and $_ -eq "start             end                 module name" } { $inPreamble = $false; continue }

I have made the parser a separate function that reads its input from the pipeline. This way, I can use the same function to parse module data, regardless of how I got the module data. Maybe it was saved on a file. Or came from a dump, or a live process. It doesn’t matter, since the parser is decoupled from the data retrieval.

After the sample, there is a breakdown of the more complicated regular expressions used, so don’t despair if you don’t understand them at first.
Regular Expressions are notoriously hard to read, so much so that they make Perl look readable in comparison.

# define a class to store the data
class ExecutableModule {
    [string]   $Name
    [string]   $Start
    [string]   $End
    [string]   $SymbolStatus
    [string]   $PdbPath
    [bool]     $Reproducible
    [string]   $ImagePath
    [string]   $ImageName
    [DateTime] $TimeStamp
    [uint32]   $FileHash
    [uint32]   $CheckSum
    [uint32]   $ImageSize
    [version]  $FileVersion
    [version]  $ProductVersion
    [string]   $FileFlags
    [string]   $FileOS
    [string]   $FileType
    [string]   $FileDate
    [string[]] $Translations
    [string]   $CompanyName
    [string]   $ProductName
    [string]   $InternalName
    [string]   $OriginalFilename
    [string]   $ProductVersionStr
    [string]   $FileVersionStr
    [string]   $FileDescription
    [string]   $LegalCopyright
    [string]   $LegalTrademarks
    [string]   $LoadedImageFile
    [string]   $PrivateBuild
    [string]   $Comments
}

<#
.SYNOPSIS
Runs a debugger on a program to dump its loaded modules
#>
function Get-ExecutableModuleRawData {
    param ([string] $Program)
    $consoleEncoding = [Console]::OutputEncoding
    [Console]::OutputEncoding = [System.Text.Encoding]::GetEncoding("iso-8859-1")
    try {
        $proc = Start-Process $program -PassThru
        Start-Sleep -Seconds 1  # sleep for a while so modules are loaded
        cdb -y srv*c:\symbols*http://msdl.microsoft.com/download/symbols -c ".reload -f;lmv;q" -p $proc.Id
        $proc.Close()
    }
    finally {
        [Console]::OutputEncoding = $consoleEncoding
    }
}

<#
.SYNOPSIS
Converts verbose module data from Windows debuggers into ExecutableModule objects.
#>
function ConvertTo-ExecutableModule {
    [OutputType([ExecutableModule])]
    param (
        [Parameter(ValueFromPipeline)]
        [string[]] $ModuleRawData
    )
    begin {
        $currentObject = $null
        $preamble = $true
        $propertyNameMap = @{
            'File flags'      = 'FileFlags'
            'File OS'         = 'FileOS'
            'File type'       = 'FileType'
            'File date'       = 'FileDate'
            'File version'    = 'FileVersion'
            'Product version' = 'ProductVersion'
            'Image path'      = 'ImagePath'
            'Image name'      = 'ImageName'
            'FileVersion'     = 'FileVersionStr'
            'ProductVersion'  = 'ProductVersionStr'
        }
    }
    process {
        switch -regex ($ModuleRawData) {

            # skip lines until we get to our sentinel line
            { $preamble -and $_ -eq "start             end                 module name" } { $preamble = $false; continue }

            #00007ff6`e9da0000 00007ff6`e9de3000   notepad    (deferred)
            #00007ffe`9ccb0000 00007ffe`9ce9d000   ntdll      (pdb symbols)          c:\symbols\ntdll.pdb\B8AD79538F2730FD9BACE36C9F9316A01\ntdll.pdb
            '^([0-9a-f`]{17})\s([0-9a-f`]{17})\s+(\S+)\s+\(([^\)]+)\)\s*(.+)?' {
                # see breakdown of the expression later in the post
                # on record start, output the currentObject, if any is set
                if ($null -ne $currentObject) {
                    $currentObject
                }
                $start, $end, $module, $pdbKind, $pdbPath = $matches[1..5]
                # create an instance of the object that we are adding info from the current record into.
                $currentObject = [ExecutableModule] @{
                    Start        = $start
                    End          = $end
                    Name         = $module
                    SymbolStatus = $pdbKind
                    PdbPath      = $pdbPath
                }
                continue
            }
            '^\s+Image was built with /Brepro flag.' {
                $currentObject.Reproducible = $true
                continue
            }
            '^\s+Timestamp:\s+[^\(]+\((?<timestamp>.{8})\)' {
                # see breakdown of the regular  expression later in the post
                # Timestamp:        Mon Jan  7 23:42:30 2019 (5C33D5D6)
                $intValue = [Convert]::ToInt32($matches.timestamp, 16)
                $currentObject.TimeStamp = [DateTime]::new(1970, 01, 01, 0, 0, 0, [DateTimeKind]::Utc).AddSeconds($intValue)
                continue
            }
            '^\s+TimeStamp:\s+(?<value>.{8}) \(This' {
                # Timestamp:        E78937AC (This is a reproducible build file hash, not a timestamp)
                $currentObject.FileHash = [Convert]::ToUInt32($matches.value, 16)
                continue
            }
            '^\s+Loaded symbol image file: (?<imageFile>[^\)]+)' {
                $currentObject.LoadedImageFile = $matches.imageFile
                continue
            }
            '^\s+Checksum:\s+(?<checksum>\S+)' {
                $currentObject.Checksum = [Convert]::ToUInt32($matches.checksum, 16)
                continue
            }
            '^\s+Translations:\s+(?<value>\S+)' {
                $currentObject.Translations = $matches.value.Split(".")
                continue
            }
            '^\s+ImageSize:\s+(?<imageSize>.{8})' {
                $currentObject.ImageSize = [Convert]::ToUInt32($matches.imageSize, 16)
                continue
            }
            '^\s{4}(?<name>[^:]+):\s+(?<value>.+)' {
                # see breakdown of the regular expression later in the post
                # This part is any 'name: value' pattern
                $name, $value = $matches['name', 'value']

                # project the property name
                $propName = $propertyNameMap[$name]
                $propName = if ($null -eq $propName) { $name } else { $propName }

                # note the dynamic property name in the assignment
                # this will fail if the property doesn't have a member with the specified name
                $currentObject.$propName = $value
                continue
            }
            'quit:' {
                # ignore and exit
                break
            }
            default {
                # When writing the parser, it can be useful to include a line like the one below to see the cases that are not handled by the parser
                # Write-Warning "missing case for '$_'. Unexpected output format from cdb.exe"

                continue # skip lines that don't match the patterns we are interested in, like the start/end/modulename header and the quit: output
            }
        }
    }
    end {
        # this is needed to output the last object
        if ($null -ne $currentObject) {
            $currentObject
        }
    }
}


Get-ExecutableModuleRawData Notepad |
    ConvertTo-ExecutableModule |
    Sort-Object ProductVersion, Name |
    Format-Table -Property Name, FileVersionStr, ProductVersion, FileDescription
Name               FileVersionStr                             ProductVersion FileDescription
----               --------------                             -------------- ---------------
PROPSYS            7.0.17763.1 (WinBuild.160101.0800)         7.0.17763.1    Microsoft Property System
ADVAPI32           10.0.17763.1 (WinBuild.160101.0800)        10.0.17763.1   Advanced Windows 32 Base API
bcrypt             10.0.17763.1 (WinBuild.160101.0800)        10.0.17763.1   Windows Cryptographic Primitives Library
...
uxtheme            10.0.17763.1 (WinBuild.160101.0800)        10.0.17763.1   Microsoft UxTheme Library
win32u             10.0.17763.1 (WinBuild.160101.0800)        10.0.17763.1   Win32u
WINSPOOL           10.0.17763.1 (WinBuild.160101.0800)        10.0.17763.1   Windows Spooler Driver
KERNELBASE         10.0.17763.134 (WinBuild.160101.0800)      10.0.17763.134 Windows NT BASE API Client DLL
wintypes           10.0.17763.134 (WinBuild.160101.0800)      10.0.17763.134 Windows Base Types DLL
SHELL32            10.0.17763.168 (WinBuild.160101.0800)      10.0.17763.168 Windows Shell Common Dll
...
windows_storage    10.0.17763.168 (WinBuild.160101.0800)      10.0.17763.168 Microsoft WinRT Storage API
CoreMessaging      10.0.17763.194                             10.0.17763.194 Microsoft CoreMessaging Dll
gdi32full          10.0.17763.194 (WinBuild.160101.0800)      10.0.17763.194 GDI Client DLL
ntdll              10.0.17763.194 (WinBuild.160101.0800)      10.0.17763.194 NT Layer DLL
RMCLIENT           10.0.17763.194 (WinBuild.160101.0800)      10.0.17763.194 Resource Manager Client
RPCRT4             10.0.17763.194 (WinBuild.160101.0800)      10.0.17763.194 Remote Procedure Call Runtime
combase            10.0.17763.253 (WinBuild.160101.0800)      10.0.17763.253 Microsoft COM for Windows
COMCTL32           6.10 (WinBuild.160101.0800)                10.0.17763.253 User Experience Controls Library
urlmon             11.00.17763.168 (WinBuild.160101.0800)     11.0.17763.168 OLE32 Extensions for Win32
iertutil           11.00.17763.253 (WinBuild.160101.0800)     11.0.17763.253 Run time utility for Internet Explorer

Regex pattern breakdown

Here is a breakdown of the more complicated patterns, using the ignore pattern whitespace modifier x:

([0-9a-f`]{17})\s([0-9a-f`]{17})\s+(\S+)\s+\(([^\)]+)\)\s*(.+)?

# example input: 00007ffe`9ccb0000 00007ffe`9ce9d000   ntdll      (pdb symbols)          c:\symbols\ntdll.pdb\B8AD79538F2730FD9BACE36C9F9316A01\ntdll.pdb

(?x)                # ignore pattern whitespace
^                   # the beginning of the line
([0-9a-f`]{17})     # capture expression like 00007ff6`e9da0000 - any hex number or backtick, and exactly 17 of them
\s                  # a space
([0-9a-f`]{17})     # capture expression like 00007ff6`e9da0000 - any hex number or backtick, and exactly 17 of them
\s+                 # skip any number of spaces
(\S+)               # capture until we get a space - this would match the 'ntdll' part
\s+                 # skip one or more spaces
\(                  # start parenthesis
    ([^\)]+)        # capture one or more of anything but an end parenthesis
\)                  # end parenthesis
\s*                 # skip zero or more spaces
(.+)?               # optionally capture any symbol file path

Breakdown of the name-value pattern:

^\s{4}(?<name>[^:]+):\s+(?<value>.+)

# example input:  File flags:       0 (Mask 3F)

(?x)                # ignore pattern whitespace
^                   # the beginning of the line
\s{4}               # exactly four spaces
(?<name>[^:]+)      # capture anything that is not a ':' into the named group "name"
:                   # require a colon
\s+                 # require one or more spaces
(?<value>.+)        # capture everything until the end into the named group "value"

Breakdown of the timestamp pattern:

^\s+Timestamp:\s+[^\(]+\((?<timestamp>.{8})\)

#example input:     Timestamp:        Mon Jan  7 23:42:30 2019 (5C33D5D6)

(?x)                # ignore pattern whitespace
^                   # the beginning of the line
\s+                 # require one or more spaces
Timestamp:          # The literal text 'Timestamp:'
\s+                 # require one or more spaces
[^\(]+              # one or more of anything but an open parenthesis
\(                  # a literal '('
(?<timestamp>.{8})  # 8 characters of anything, captured into the group 'timestamp'
\)                  # a literal ')'

Gotchas – the Regex Cache

Something that can happen if you are writing a more complicated parser is the following:
The parser works well. You have 15 regular expressions in your switch statement and then you get some input you haven’t seen before, so you add a 16th regex.
All of a sudden, the performance of your parser tanks. WTF?

The .NET regex implementation has a cache of recently used regexes. You can check the size of it like this:

PS> [regex]::CacheSize
15

# bump it
[regex]::CacheSize = 20

And now your parser is fast(er) again.

Bonus tip

I frequently use PowerShell to write (generate) my code:

Get-ExecutableModuleRawData pwsh |
    Select-String '^\s+([^:]+):' |       # this pattern matches the module detail fields
    Foreach-Object {$_.matches.groups[1].value} |
    Select-Object -Unique |
    Foreach-Object -Begin   { "class ExecutableModuleData {" }`
                   -Process { "    [string] $" + ($_ -replace "\s.", {[char]::ToUpperInvariant($_.Groups[0].Value[1])}) }`
                   -End     { "}" }

The output is

class ExecutableModuleData {
    [string] $LoadedSymbolImageFile
    [string] $ImagePath
    [string] $ImageName
    [string] $Timestamp
    [string] $CheckSum
    [string] $ImageSize
    [string] $FileVersion
    [string] $ProductVersion
    [string] $FileFlags
    [string] $FileOS
    [string] $FileType
    [string] $FileDate
    [string] $Translations
    [string] $CompanyName
    [string] $ProductName
    [string] $InternalName
    [string] $OriginalFilename
    [string] $ProductVersion
    [string] $FileVersion
    [string] $FileDescription
    [string] $LegalCopyright
    [string] $Comments
    [string] $LegalTrademarks
    [string] $PrivateBuild
}

It is not complete – I don’t have the fields from the record start, some types are incorrect, and when run against other executables a few more fields may appear.
But it is a very good starting point. And way more fun than typing it 🙂

Note that this example is using a new feature of the -replace operator – to use a ScriptBlock to determine what to replace with – that was added in PowerShell Core 6.1.
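
In isolation, the ScriptBlock form looks like this; inside the block, $_ is bound to the current Match object:

'hello world' -replace '\b\w', { $_.Value.ToUpper() }
Hello World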

Bonus tip #2

A regular expression construct that I often find useful is non-greedy matching.
The example below shows the effect of the ? modifier, which can be used after * (zero or more) and + (one or more):

# greedy matching - match to the last occurrence of the following character (>)
if("<Tag>Text</Tag>" -match '<(.+)>') { $matches }
Name                           Value
----                           -----
1                              Tag>Text</Tag
0                              <Tag>Text</Tag>
# non-greedy matching - match to the first occurrence of the following character (>)
if("<Tag>Text</Tag>" -match '<(.+?)>') { $matches }
Name                           Value
----                           -----
1                              Tag
0                              <Tag>

See Regex Repeat for more info on how to control pattern repetition.

Summary

In this post, we have looked at how the structure of a switch-based parser could look, and how it can be written so that it works as a part of the pipeline.
We have also looked at a few slightly more complicated regular expressions in some detail.

As we have seen, PowerShell has a plethora of options for parsing text, and most of them revolve around regular expressions.
My personal experience has been that the time I’ve invested in understanding the regex language was well invested.

Hopefully, this gives you a good start with the parsing tasks you have at hand.

Thanks to Jason Shirk, Mathias Jessen and Steve Lee for reviews and feedback.

Staffan Gustafsson, @StaffanGson, github

Staffan works at DICE in Stockholm, Sweden, as a Software Engineer and has been using PowerShell since the first public beta.
He was most seriously pleased when PowerShell was open sourced, and has since contributed bug fixes, new features and performance improvements.
Staffan is a speaker at PSConfEU and is always happy to talk PowerShell.

PowerShell Core Release Improvements

Overview

For PowerShell Core, we basically had to build a new engineering system to build and release it. How we build it has evolved over time as we learn and our other teams have implemented features that make some tasks easier. We are finally at a state that we believe we can engineer a system that builds PowerShell Core for release with as little human interaction as necessary.

Current state

Before the changes described here, we had one build per platform. After the binaries were built, they had to be tested and then packaged into the various packages for release, all in a private Azure DevOps Pipelines instance. In this state, a PowerShell Core release took a good deal of people’s time: about a week for 3-4 people, who averaged roughly 50% of their time focused on the release.

Goals

  1. Remain compliant with Microsoft and external standards we are required to follow.
  2. Automate as much of the build, test, and release process as possible.
    • This should significantly reduce the amount of human toil needed in each release.
  3. Hopefully, provide some tools or practices others can follow.

What we have done so far

  1. We ported our CI tests to Azure DevOps Pipelines.
    • We have used this in a release, and we see that this allowed us to run at least those tests in our private Azure DevOps Pipelines instance.
    • This saves us 2-4 man hours per release and a day or more of calendar time if all goes well.
  2. We have moved our release build definitions to YAML.
    • We have used this in a release and we see that this allows us to treat the release build as code and iterate more quickly.
    • This saves us 1-2 man hours per release, when we have done everything correctly.
  3. I have begun to merge the different platform builds into one combined build.
    • We have not yet used this in a release but we believe this should allow us to have a single button that gets us ready to test.
    • This has not been in use long enough to determine how much time it will save.
  4. We have begun to automate our release testing. Our release testing is very similar to our CI testing, just across more distributions and versions of Windows. We plan to be able to run this through Azure DevOps Pipelines as well.
    • This has not been in use long enough to determine how much time it will save.
  5. We have automated generating the draft of the change log, categorizing the entries based on labels the maintainers apply to the PRs. After generation, the maintainers still need to review the change descriptions to make sure they make sense in the change log.
    • This saves us 2-4 man hours per release.

Summary of improvements

After all these changes, we can now release with 2-3 people in 2 to 3 days, with each person averaging about 25% of their time focused on the release.

Details of the combined build

Azure DevOps Pipelines allows us to define complex build pipelines. The build will be complex, but features like templates in Azure DevOps make it possible to break it into manageable pieces.

Although this design does not technically reduce the number of parts, one significant thing it does for us is put all of our artifacts in one place. Having the artifacts in one place reduces the inputs to the remaining steps of the build, such as test and release.

I’m not going to discuss it much, but in order to coordinate this work we are keeping a diagram of the build. I’ll include it here. If you want me to post another blog on the details, please leave a comment.

diagram

What is left to do

  1. We still have to add the other various NuGet package build steps to the coordinated build.
  2. We need to automate running functionality (CI) tests across a representative sample of supported platforms.
  3. It would be nice if we could enforce in GitHub the process that helps us automate the change log generation.
  4. We need to automate the release process including:
    • Automating package testing. For example, MSI, Zip, Deb, RPM, and Snap.
    • Automating the actual release to GitHub, mcr.microsoft.com, packages.microsoft.com and the Snap store.

Travis Plunk
Senior Software Engineer
PowerShell Team

The post PowerShell Core Release Improvements appeared first on PowerShell.

Using PSScriptAnalyzer to check PowerShell version compatibility

PSScriptAnalyzer version 1.18 was released recently, and ships with powerful new rules that can check PowerShell scripts for incompatibilities with other PowerShell versions and environments.

In this blog post, the first in a series, we’ll see how to use these new rules to check a script for problems running on PowerShell 3, 5.1 and 6.

Wait, what’s PSScriptAnalyzer?

PSScriptAnalyzer is a module providing static analysis (or linting) and some dynamic analysis (based on the state of your environment) for PowerShell. It’s able to find problems and fix bad habits in PowerShell scripts as you create them, similar to the way the C# compiler will give you warnings and find errors in C# code before it’s executed.

If you use the VSCode PowerShell extension, you might have seen the “green squigglies” and problem reports that PSScriptAnalyzer generates for scripts you author:

Image of PSScriptAnalyzer linting in VSCode with green squigglies

You can install PSScriptAnalyzer to use on your own scripts with:

Install-Module PSScriptAnalyzer -Scope CurrentUser

PSScriptAnalyzer works by running a series of rules on your scripts, each of which independently assesses some issue. For example, AvoidUsingCmdletAliases checks that aliases aren’t used in scripts, and MisleadingBackticks checks that backticks at the ends of lines aren’t followed by whitespace.
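
You can also run a single rule on its own with the -IncludeRule parameter, for example (assuming a script saved at .\myScript.ps1):

Invoke-ScriptAnalyzer -Path .\myScript.ps1 -IncludeRule PSAvoidUsingCmdletAliases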

For more information, see the PSScriptAnalyzer deep dive blog series.

Introducing the compatibility check rules

The new compatibility checking functionality is provided by three new rules:

  • PSUseCompatibleSyntax, which checks whether a syntax used in a script will work in other PowerShell versions.
  • PSUseCompatibleCommands, which checks whether commands used in a script are available in other PowerShell environments.
  • PSUseCompatibleTypes, which checks whether .NET types and static methods/properties are available in other PowerShell environments.

The syntax check rule simply requires a list of PowerShell versions you want to target, and will tell you if a syntax used in your script won’t work in any of those versions.

The command and type checking rules are more sophisticated and rely on profiles (catalogs of commands and types available) from commonly used PowerShell platforms. They require configuration to use these profiles, which we’ll go over below.

For this post, we’ll look at configuring and using PSUseCompatibleSyntax and PSUseCompatibleCommands to check that a script works with different versions of PowerShell. We’ll look at PSUseCompatibleTypes in a later post, although it’s configured very similarly to PSUseCompatibleCommands.

Working example: a small PowerShell script

Imagine we have a small (and contrived) archival script saved to .\archiveScript.ps1:

# Import helper module to get folders to archive
Import-Module -FullyQualifiedName @{ ModuleName = 'ArchiveHelper'; ModuleVersion = '1.1' }

$paths = Get-FoldersToArchive -RootPath 'C:\Documents\DocumentsToArchive\'
$archiveBasePath = '\\ArchiveServer\DocumentArchive\'

# Dictionary to collect hashes
$hashes = [System.Collections.Generic.Dictionary[string, string]]::new()
foreach ($path in $paths)
{
    # Get the hash of the file and turn it into a base64 string
    $hash = (Get-FileHash -LiteralPath $path).Hash

    # Add this file to the hash catalog
    $hashes[$hash] = $path

    # Now give the archive a unique name and zip it up
    $name = Split-Path -LeafBase $path
    Compress-Archive -LiteralPath $path -DestinationPath (Join-Path $archiveBasePath "$name-$hash.zip")
}

# Add the hash catalog to the archive directory
ConvertTo-Json $hashes | Out-File -LiteralPath (Join-Path $archiveBasePath "catalog.json") -NoNewline

This script was written in PowerShell 6.2, and we’ve tested that it works there. But we also want to run it on other machines, some of which run PowerShell 5.1 and some of which run PowerShell 3.0.

Ideally we will test it on those other platforms, but it would be nice if we could try to iron out as many bugs as possible ahead of time.

Checking syntax with PSUseCompatibleSyntax

The first and easiest rule to apply is PSUseCompatibleSyntax. We’re going to create some settings for PSScriptAnalyzer to enable the rule, and then run analysis on our script to get back any diagnostics about compatibility.

Running PSScriptAnalyzer is straightforward. It comes as a PowerShell module, so once it’s installed on your module path you just invoke it on your file with Invoke-ScriptAnalyzer, like this:

Invoke-ScriptAnalyzer -Path '.\archiveScript.ps1'

A very simple invocation like this one will run PSScriptAnalyzer using its default rules and configurations on the script you point it to.

However, because they require more targeted configuration, the compatibility rules are not enabled by default. Instead, we need to supply some settings to run the syntax check rule. In particular, PSUseCompatibleSyntax requires a list of the PowerShell versions you are targeting with your script.

$settings = @{
    Rules = @{
        PSUseCompatibleSyntax = @{
            # This turns the rule on (setting it to false will turn it off)
            Enable = $true

            # List the targeted versions of PowerShell here
            TargetVersions = @(
                '3.0',
                '5.1',
                '6.2'
            )
        }
    }
}

Invoke-ScriptAnalyzer -Path .\archiveScript.ps1 -Settings $settings

Running this will present us with the following output:

RuleName                            Severity     ScriptName Line  Message
--------                            --------     ---------- ----  -------
PSUseCompatibleSyntax               Warning      archiveScr 8     The constructor syntax
                                                 ipt.ps1          '[System.Collections.Generic.Dictionary[string,
                                                                  string]]::new()' is not available by default in
                                                                  PowerShell versions 3,4

This is telling us that the [dictionary[string, string]]::new() syntax we used won’t work in PowerShell 3. Better than that, in this case the rule has actually proposed a fix:

$diagnostics = Invoke-ScriptAnalyzer -Path .\archiveScript.ps1 -Settings $settings
$diagnostics[0].SuggestedCorrections

File              : C:\Users\roholt\Documents\Dev\sandbox\VersionedScript\archiveScript.ps1
Description       : Use the 'New-Object @($arg1, $arg2, ...)' syntax instead for compatibility with PowerShell versions 3,4
StartLineNumber   : 8
StartColumnNumber : 11
EndLineNumber     : 8
EndColumnNumber   : 73
Text              : New-Object 'System.Collections.Generic.Dictionary[string,string]'
Lines             : {New-Object 'System.Collections.Generic.Dictionary[string,string]'}
Start             : Microsoft.Windows.PowerShell.ScriptAnalyzer.Position
End               : Microsoft.Windows.PowerShell.ScriptAnalyzer.Position

The suggested correction is to use New-Object instead. The way this is suggested might seem slightly unhelpful here with all the position information, but we’ll see later why this is useful.

This dictionary example is a bit artificial of course (since a hashtable would come more naturally), but having a spanner thrown into the works in PowerShell 3 or 4 because of a ::new() is not uncommon. The PSUseCompatibleSyntax rule will also warn you about classes, workflows and using statements depending on the versions of PowerShell you’re authoring for.

We’re not going to make the suggested change just yet, since there’s more to show you first.

Checking command usage with PSUseCompatibleCommands

We now want to check the commands. Because command compatibility is a bit more complicated than syntax (since the availability of commands depends on more than what version of PowerShell is being run), we have to target profiles instead.

Profiles are catalogs of information taken from stock machines running common PowerShell environments. The ones shipped in PSScriptAnalyzer can’t always match your working environment perfectly, but they come pretty close (there’s also a way to generate your own profile, detailed in a later blog post). In our case, we’re trying to target PowerShell 3.0, PowerShell 5.1 and PowerShell 6.2 on Windows. We have the first two profiles, but in the last case we’ll need to target 6.1 instead. These targets are very close, so warnings will still be pertinent to using PowerShell 6.2. Later when a 6.2 profile is made available, we’ll be able to switch over to that.

We need to look under the PSUseCompatibleCommands documentation for a list of profiles available by default. For our desired targets we pick:

  • PowerShell 6.1 on Windows Server 2019 (win-8_x64_10.0.17763.0_6.1.3_x64_4.0.30319.42000_core)
  • PowerShell 5.1 on Windows Server 2019 (win-8_x64_10.0.17763.0_5.1.17763.316_x64_4.0.30319.42000_framework)
  • PowerShell 3.0 on Windows Server 2012 (win-8_x64_6.2.9200.0_3.0_x64_4.0.30319.42000_framework)

The long names on the right are canonical profile identifiers, which we use in the settings:

$settings = @{
    Rules = @{
        PSUseCompatibleCommands = @{
            # Turns the rule on
            Enable = $true

            # Lists the PowerShell platforms we want to check compatibility with
            TargetProfiles = @(
                'win-8_x64_10.0.17763.0_6.1.3_x64_4.0.30319.42000_core',
                'win-8_x64_10.0.17763.0_5.1.17763.316_x64_4.0.30319.42000_framework',
                'win-8_x64_6.2.9200.0_3.0_x64_4.0.30319.42000_framework'
            )
        }
    }
}

Invoke-ScriptAnalyzer -Path ./archiveScript.ps1 -Settings $settings

There might be a delay the first time you execute this because the rules have to load the catalogs into a cache. Each catalog of a PowerShell platform contains details of all the modules and .NET assemblies available to PowerShell on that platform, which can be as many as 1700 commands with 15,000 parameters and 100 assemblies with 10,000 types. But once it’s loaded, further compatibility analysis will be fast. We get output like this:

RuleName                            Severity     ScriptName Line  Message
--------                            --------     ---------- ----  -------
PSUseCompatibleCommands             Warning      archiveScr 2     The parameter 'FullyQualifiedName' is not available for
                                                 ipt.ps1          command 'Import-Module' by default in PowerShell version
                                                                  '3.0' on platform 'Microsoft Windows Server 2012
                                                                  Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 12    The command 'Get-FileHash' is not available by default in
                                                 ipt.ps1          PowerShell version '3.0' on platform 'Microsoft Windows
                                                                  Server 2012 Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 18    The parameter 'LeafBase' is not available for command
                                                 ipt.ps1          'Split-Path' by default in PowerShell version
                                                                  '5.1.17763.316' on platform 'Microsoft Windows Server
                                                                  2019 Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 18    The parameter 'LeafBase' is not available for command
                                                 ipt.ps1          'Split-Path' by default in PowerShell version '3.0' on
                                                                  platform 'Microsoft Windows Server 2012 Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 19    The command 'Compress-Archive' is not available by
                                                 ipt.ps1          default in PowerShell version '3.0' on platform
                                                                  'Microsoft Windows Server 2012 Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 23    The parameter 'NoNewline' is not available for command
                                                 ipt.ps1          'Out-File' by default in PowerShell version '3.0' on
                                                                  platform 'Microsoft Windows Server 2012 Datacenter'

This is telling us that:

  • Import-Module doesn’t support -FullyQualifiedName in PowerShell 3.0;
  • Get-FileHash doesn’t exist in PowerShell 3.0;
  • Split-Path doesn’t have -LeafBase in PowerShell 5.1 or PowerShell 3.0;
  • Compress-Archive isn’t available in PowerShell 3.0; and
  • Out-File doesn’t support -NoNewline in PowerShell 3.0.

One thing you’ll notice is that the Get-FoldersToArchive function is not being warned about. This is because the compatibility rules are designed to ignore user-provided commands; a command will only be marked as incompatible if it’s present in some profile and not in one of your targets.
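To illustrate with a hypothetical helper: in the snippet below, Invoke-MyReport appears in no profile, so the rule says nothing about it, while Get-FileHash is in the newer profiles but missing from the PowerShell 3.0 one, so it’s flagged against that target.

# 'Invoke-MyReport' is hypothetical and unknown to every profile: no warning.
function Invoke-MyReport { param([string]$Path) Get-ChildItem -Path $Path }
Invoke-MyReport -Path 'C:\Reports'

# 'Get-FileHash' exists in the 5.1 and 6.1 profiles but not the 3.0 one,
# so it is warned about only for the PowerShell 3.0 target.
Get-FileHash -LiteralPath 'C:\Reports\summary.csv'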

Again, we can change the script to fix these warnings, but before we do, I want to show you how to make this a more continuous experience; as you change your script, you want to know if the changes you make break compatibility, and that’s easy to do with the steps below.

Using a settings file for repeated invocation

The first thing we want is to make the PSScriptAnalyzer invocation more automated and reproducible. A nice step toward this is taking the settings hashtable we made and turning it into a declarative data file, separating out the “what” from the “how”.

PSScriptAnalyzer will accept a path to a PSD1 in the -Settings parameter, so all we need to do is turn our hashtable into a PSD1 file, which we’ll make ./PSScriptAnalyzerSettings.psd1. Notice we can merge the settings for both PSUseCompatibleSyntax and PSUseCompatibleCommands:

# PSScriptAnalyzerSettings.psd1
# Settings for PSScriptAnalyzer invocation.
@{
    Rules = @{
        PSUseCompatibleCommands = @{
            # Turns the rule on
            Enable = $true

            # Lists the PowerShell platforms we want to check compatibility with
            TargetProfiles = @(
                'win-8_x64_10.0.17763.0_6.1.3_x64_4.0.30319.42000_core',
                'win-8_x64_10.0.17763.0_5.1.17763.316_x64_4.0.30319.42000_framework',
                'win-8_x64_6.2.9200.0_3.0_x64_4.0.30319.42000_framework'
            )
        }
        PSUseCompatibleSyntax = @{
            # This turns the rule on (setting it to false will turn it off)
            Enable = $true

            # Simply list the targeted versions of PowerShell here
            TargetVersions = @(
                '3.0',
                '5.1',
                '6.2'
            )
        }
    }
}

Now we can run the PSScriptAnalyzer again on the script using the settings file:

Invoke-ScriptAnalyzer -Path ./archiveScript.ps1 -Settings ./PSScriptAnalyzerSettings.psd1

This gives the output:

RuleName                            Severity     ScriptName Line  Message
--------                            --------     ---------- ----  -------
PSUseCompatibleCommands             Warning      archiveScr 1     The parameter 'FullyQualifiedName' is not available for
                                                 ipt.ps1          command 'Import-Module' by default in PowerShell version
                                                                  '3.0' on platform 'Microsoft Windows Server 2012
                                                                  Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 9     The command 'Get-FileHash' is not available by default in
                                                 ipt.ps1          PowerShell version '3.0' on platform 'Microsoft Windows
                                                                  Server 2012 Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 12    The parameter 'LeafBase' is not available for command
                                                 ipt.ps1          'Split-Path' by default in PowerShell version '3.0' on
                                                                  platform 'Microsoft Windows Server 2012 Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 12    The parameter 'LeafBase' is not available for command
                                                 ipt.ps1          'Split-Path' by default in PowerShell version
                                                                  '5.1.17763.316' on platform 'Microsoft Windows Server
                                                                  2019 Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 13    The command 'Compress-Archive' is not available by
                                                 ipt.ps1          default in PowerShell version '3.0' on platform
                                                                  'Microsoft Windows Server 2012 Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 16    The parameter 'NoNewline' is not available for command
                                                 ipt.ps1          'Out-File' by default in PowerShell version '3.0' on
                                                                  platform 'Microsoft Windows Server 2012 Datacenter'
PSUseCompatibleSyntax               Warning      archiveScr 6     The constructor syntax
                                                 ipt.ps1          '[System.Collections.Generic.Dictionary[string,
                                                                  string]]::new()' is not available by default in
                                                                  PowerShell versions 3,4

Now we don’t depend on any variables anymore, and have a separate specification of the analysis we want. Using this, you could put the analysis into a continuous integration environment, for example, to check that changes to scripts don’t break compatibility.
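For example, a minimal CI gate might look like this (a sketch; the script path is illustrative):

# Fail the build if the analysis produces any compatibility diagnostics.
$diagnostics = Invoke-ScriptAnalyzer -Path ./archiveScript.ps1 -Settings ./PSScriptAnalyzerSettings.psd1
if ($diagnostics)
{
    $diagnostics | Out-String | Write-Host
    throw "Found $($diagnostics.Count) compatibility issue(s)."
}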

But what we really want is to know that our PowerShell scripts stay compatible as we edit them. That’s what the settings file is building toward, and it’s also where it’s easiest to make the changes needed to keep a script compatible. For that, we want to integrate with the VSCode PowerShell extension.

Integrating with VSCode for on-the-fly compatibility checking

As explained at the start of this post, the VSCode PowerShell extension has built-in support for PSScriptAnalyzer. In fact, as of version 1.12.0, the PowerShell extension ships with PSScriptAnalyzer 1.18, meaning you don’t need to do anything other than create a settings file to get compatibility analysis.

We already have our settings file ready to go from the last step, so all we have to do is point the PowerShell extension to the file in the VSCode settings.

You can open the settings with Ctrl+, (use Cmd instead of Ctrl on macOS). In the Settings view, we want PowerShell > Script Analysis: Settings Path. In the settings.json view this is "powershell.scriptAnalysis.settingsPath". Entering a relative path here will find a settings file in our workspace, so we just put ./PSScriptAnalyzerSettings.psd1:

VSCode settings GUI with PSScriptAnalyzer settings path configured to "./PSScriptAnalyzerSettings.psd1"

In the settings.json view this will look like:

"powershell.scriptAnalysis.settingsPath": "./PSScriptAnalyzerSettings.psd1"

Now, opening the script in VSCode we see “green squigglies” for compatibility warnings:

VSCode window containing script, with green squigglies underneath incompatible code

In the problems pane, you’ll get a full description of all the incompatibilities:

VSCode problems pane, listing and describing identified compatibility issues

Let’s fix the syntax problem first. If you remember, PSScriptAnalyzer supplies a suggested correction for this problem. VSCode integrates with PSScriptAnalyzer’s suggested corrections and can apply them if you click the lightbulb, or with Ctrl+Space when the cursor is on the region:

VSCode suggesting New-Object instead of ::new() syntax

Applying this change, the script is now:

Import-Module -FullyQualifiedName @{ ModuleName = 'ArchiveHelper'; ModuleVersion = '1.1' }

$paths = Get-FoldersToArchive -RootPath 'C:\Documents\DocumentsToArchive\'
$archivePath = '\\ArchiveServer\DocumentArchive\'

$hashes = New-Object 'System.Collections.Generic.Dictionary[string,string]'
foreach ($path in $paths)
{
    $hash = (Get-FileHash -LiteralPath $path).Hash
    $hashes[$hash] = $path
    $name = Split-Path -LeafBase $path
    Compress-Archive -LiteralPath $path -DestinationPath (Join-Path $archivePath "$name-$hash.zip")
}

ConvertTo-Json $hashes | Out-File -LiteralPath (Join-Path $archivePath "catalog.json") -NoNewline

The other incompatibilities don’t have corrections; for now PSUseCompatibleCommands knows what commands are available on each platform, but not what to substitute with when a command isn’t available. So we just need to apply some PowerShell knowledge:

  • Instead of Import-Module -FullyQualifiedName @{...} we use Import-Module -Name ... -Version ...;
  • Instead of Get-FileHash, we’re going to need to use .NET directly and write a function;
  • Instead of Split-Path -LeafBase, we can use [System.IO.Path]::GetFileNameWithoutExtension();
  • Instead of Compress-Archive we’ll need to use more .NET methods in a function; and
  • Instead of Out-File -NoNewline we can use New-Item -Value.

We end up with something like this (the specific implementation is unimportant, but we have something that will work in all versions):

Import-Module -Name ArchiveHelper -Version '1.1'

function CompatibleGetFileHash
{
    param(
        [string]
        $LiteralPath
    )

    try
    {
        $hashAlg = [System.Security.Cryptography.SHA256]::Create()
        $file = [System.IO.File]::Open($LiteralPath, 'Open', 'Read')
        $file.Position = 0
        $hashBytes = $hashAlg.ComputeHash($file)
        return [System.BitConverter]::ToString($hashBytes).Replace('-', '')
    }
    finally
    {
        # Guard the disposals in case Create() or Open() threw above.
        if ($file) { $file.Dispose() }
        if ($hashAlg) { $hashAlg.Dispose() }
    }
}

function CompatibleCompressArchive
{
    param(
        [string]
        $LiteralPath,

        [string]
        $DestinationPath
    )

    if ($PSVersionTable.PSVersion.Major -le 3)
    {
        # PSUseCompatibleTypes identifies that [System.IO.Compression.ZipFile]
        # isn't available by default in PowerShell 3 and we have to do this.
        # We'll cover that rule in the next blog post.
        Add-Type -AssemblyName System.IO.Compression.FileSystem -ErrorAction Ignore
    }

    [System.IO.Compression.ZipFile]::CreateFromDirectory(
        $LiteralPath,
        $DestinationPath,
        'Optimal',
        <# includeBaseDirectory #> $true)
}

$paths = Get-FoldersToArchive -RootPath 'C:\Documents\DocumentsToArchive\'
$archivePath = '\\ArchiveServer\DocumentArchive\'

$hashes = New-Object 'System.Collections.Generic.Dictionary[string,string]'
foreach ($path in $paths)
{
    $hash = CompatibleGetFileHash -LiteralPath $path
    $hashes[$hash] = $path
    $name = [System.IO.Path]::GetFileNameWithoutExtension($path)
    CompatibleCompressArchive -LiteralPath $path -DestinationPath (Join-Path $archivePath "$name-$hash.zip")
}

$jsonStr = ConvertTo-Json $hashes
New-Item -Path (Join-Path $archivePath "catalog.json") -Value $jsonStr
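
At this point, re-running the analyzer from the command line should come back clean; no output means no diagnostics were found:

Invoke-ScriptAnalyzer -Path ./archiveScript.ps1 -Settings ./PSScriptAnalyzerSettings.psd1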

You should notice that as you type, VSCode displays new analysis of what you’re writing and the green squigglies drop away. When we’re done we get a clean bill of health for script compatibility:

VSCode window with script and problems pane, with no green squigglies and no problems

This means you’ll now be able to use this script across all the PowerShell versions you need to target. Better, you now have a configuration in your workspace so as you write more scripts, there is continual checking for compatibility. And if your compatibility targets change, all you need to do is change your configuration file in one place to point to your desired targets, at which point you’ll get analysis for your updated target platforms.
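For instance, to start checking against PowerShell on Linux as well, only the TargetProfiles list in PSScriptAnalyzerSettings.psd1 needs to change. The Ubuntu identifier below is illustrative, so take the exact name from the PSUseCompatibleCommands documentation:

# Excerpt from PSScriptAnalyzerSettings.psd1: swap or add target profiles here.
PSUseCompatibleCommands = @{
    Enable = $true
    TargetProfiles = @(
        'win-8_x64_10.0.17763.0_5.1.17763.316_x64_4.0.30319.42000_framework',
        # Illustrative Linux target; confirm the identifier in the rule docs.
        'ubuntu_x64_18.04_6.1.3_x64_4.0.30319.42000_core'
    )
}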

Summary

Hopefully in this blog post you got some idea of the new compatibility rules that come with PSScriptAnalyzer 1.18.

We’ve covered how to set up and use the syntax compatibility checking rule, PSUseCompatibleSyntax, and the command checking rule, PSUseCompatibleCommands, both using a hashtable configuration and a settings PSD1 file.

We’ve also looked at using the compatibility rules with the PowerShell extension for VSCode, where they come by default as of version 1.12.0.

If you’ve got the latest release of the PowerShell extension for VSCode (1.12.1), you’ll be able to set your configuration file and instantly get compatibility checking.

In the next blog post, we’ll look at how these rules and PSUseCompatibleTypes (which checks whether .NET types and static methods are available on target platforms) can be used to help you write scripts that work across Windows and Linux, using both Windows PowerShell and PowerShell Core.


Rob Holt

Software Engineer

PowerShell Team


Public Preview of PowerShell in Azure Functions 2.x


Over the last six months, we’ve been hard at work integrating PowerShell Core with Azure Functions 2.x. Today, I’m happy to announce that we’re releasing a public preview of PowerShell support in Azure Functions 2.x for Windows (Consumption, Premium, and App Service pricing plans).

I already know I want this, give me the good stuff!

Learn how to create your first PowerShell function in Azure Functions, or dive right into the Azure Functions PowerShell developer guide.

What’s Azure Functions?

For those who haven’t experienced the joy of using Azure Functions yet, it’s an event-based serverless platform that enables you to write code in a variety of languages without having to worry about managing infrastructure like VMs or containers. You simply create a function, give it an event trigger, write some PowerShell code, and point your return data at an output binding. That’s it!

Azure Functions supports a multitude of triggers and bindings. Commonly, folks use the HTTP (webhook) triggers to easily create REST API endpoints, but there’s a huge set that supports integration with Microsoft services like Azure and Office 365, as well as 3rd party services like GitHub, SendGrid, and Twilio.
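
To give a taste of the programming model, here’s a minimal sketch of an HTTP-triggered function’s run.ps1. The Response binding name is whatever the function’s function.json declares; 'Response' is the conventional default from the templates:

using namespace System.Net

# run.ps1: the trigger payload arrives as a parameter, and output is written
# to the output binding named 'Response'. The HttpResponseContext type is
# provided by the Functions PowerShell worker at runtime.
param($Request, $TriggerMetadata)

$name = $Request.Query.Name
if (-not $name) { $name = 'world' }

Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [HttpStatusCode]::OK
    Body       = "Hello, $name!"
})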

So how does PowerShell fit in?

Today, Azure Functions supports a number of development languages like C#, JavaScript/Node, Java, and Python. While these languages are great for developing serverless applications, PowerShell is geared towards cloud and in-guest management, and as a scripting language can be much simpler to use for certain tasks, including integration with Azure and Office 365.

We built the new PowerShell worker in Azure Functions to take advantage of PowerShell’s ability to natively manage objects. In fact, we’ve already built a bot on top of Azure Functions that helps out with PRs and issues in the PowerShell GitHub repository. Look out for a future blog post that talks more about what it does and how we built it.

What’s included today?

  • PowerShell Core 6.2 worker in Azure Functions 2.x
    • Windows Consumption, Dedicated and Premium pricing models
  • Creation of PowerShell function apps and functions
  • Support and templates for all 2.x triggers and input/output bindings
  • Support for the Az modules as a managed dependency which will be kept up to date for you (see the sketch after this list)
  • Ability to execute a profile.ps1 on the first cold start of any Functions worker
  • Support for Visual Studio Code, including integration between the Azure Functions and PowerShell extensions
  • Ability to run the PowerShell Core worker within the func CLI on Windows, macOS, and Linux
  • Local debugging of function apps in Visual Studio Code and PowerShell Core
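
Managed dependencies are declared in the function app’s requirements.psd1, which sits alongside host.json; a minimal sketch (the '1.*' version range is illustrative):

# requirements.psd1: modules listed here are installed from the PowerShell
# Gallery and kept up to date within the given version range.
@{
    'Az' = '1.*'
}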

What’s not included today?

  • PowerShell support for Linux in Azure Functions
  • Support for Durable Functions
  • “Live objects” for trigger objects
    • Today, all trigger objects except HTTP are more generic data types
  • Generic PowerShell Gallery support for managed dependencies
  • PowerShell cmdlets for managing Azure Functions (e.g. Az.Functions)

Awesome, how do I get started?

Start with our Azure Functions PowerShell quickstart to learn how to use Visual Studio Code and the Azure Functions Core Tools to create, test, debug, and deploy your first PowerShell function app. Then watch the Azure Friday video to learn more about using PowerShell in Functions.

I’ve got a problem / Where do I give feedback?

We encourage you to give us feedback via our GitHub repository. Feel free to file and participate in issues for:

  • missing functionality (and why you value that functionality)
  • problems with the programming model (and how it affects your scenarios)
  • bugs in the worker
  • anything else you believe needs improving or fixing

When will this be reaching General Availability (GA)?

We don’t currently have an ETA for when PowerShell in Azure Functions will reach GA, but your feedback will help us drive our prioritization, so make sure to voice your opinions! Even if it’s simply to tell us “works great for me!”, that will help us reach our next milestones more quickly.

That’s it!

Thanks for being an early adopter! We’re really looking forward to seeing what kinds of automation you all build with serverless PowerShell!

Thanks,
Joey Aiello
PM, PowerShell Core

