
Public Preview of PowerShell in Azure Functions 2.x


Over the last six months, we’ve been hard at work integrating PowerShell Core with Azure Functions 2.x. Today, I’m happy to announce that we’re releasing the public preview of PowerShell support for Azure Functions 2.x for Windows (Consumption, Premium, and App Service pricing plans).

I already know I want this, give me the good stuff!

Learn how to create your first PowerShell function in Azure Functions, or dive right into the Azure Functions PowerShell developer guide.

What’s Azure Functions?

For those that haven’t experienced the joy of using Azure Functions yet, it’s an event-based serverless platform that enables you to write code in a variety of languages without having to worry about managing infrastructure like VMs or containers. You simply create a function, give it an event trigger, write some PowerShell code, and point your return data at an output binding. That’s it!

Azure Functions supports a multitude of triggers and bindings. Commonly, folks use the HTTP (webhook) triggers to easily create REST API endpoints, but there’s a huge set that supports integration with Microsoft services like Azure and Office 365, as well as 3rd party services like GitHub, SendGrid, and Twilio.
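
To make the programming model concrete, here’s a minimal sketch of the run.ps1 for an HTTP-triggered PowerShell function, based on the default HTTP template. The Request parameter and the Response output binding name come from the function’s function.json, so treat them as illustrative rather than fixed:

# run.ps1 - invoked by the Functions host for each HTTP request
using namespace System.Net
param($Request, $TriggerMetadata)
# Pull a value from the query string, falling back to the request body
$name = $Request.Query.Name
if (-not $name) { $name = $Request.Body.Name }
# Write the HTTP response to the output binding named "Response" in function.json
Push-OutputBinding -Name Response -Value @{
    StatusCode = [HttpStatusCode]::OK
    Body       = "Hello, $name! Served by PowerShell $($PSVersionTable.PSVersion)."
}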

So how does PowerShell fit in?

Today, Azure Functions supports a number of development languages like C#, JavaScript/Node, Java, and Python. While these languages are great for developing serverless applications, PowerShell is geared towards cloud and in-guest management, and as a scripting language can be much simpler to use for certain tasks, including integration with Azure and Office 365.

We built the new PowerShell worker in Azure Functions to take advantage of PowerShell’s ability to natively manage objects. In fact, we’ve already built a bot on top of Azure Functions that helps out in PRs and issues for the PowerShell GitHub repository. Look out for a future blog post that talks more about what it does and how we built it.

What’s included today?

  • PowerShell Core 6.2 worker in Azure Functions 2.x
    • Windows Consumption, Dedicated and Premium pricing models
  • Creation of PowerShell function apps and functions
  • Support and templates for all 2.x triggers and input/output bindings
  • Support for the Az modules as a managed dependency, which will be kept up to date for you (a short sketch of the relevant files follows this list)
  • Ability to execute a profile.ps1 on the first cold start of any Functions worker
  • Support for Visual Studio Code, including integration between the Azure Functions and PowerShell extensions
  • Ability to run the PowerShell Core worker within the func CLI on Windows, macOS, and Linux
  • Local debugging of function apps in Visual Studio Code and PowerShell Core
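
As a rough illustration of the managed-dependency and profile items above, a function app’s requirements.psd1 and profile.ps1 might look like the following. The Az version range and the managed-identity check mirror the preview’s default templates and should be treated as illustrative:

# requirements.psd1 - managed dependencies the platform installs and keeps up to date
@{
    'Az' = '1.*'   # the version range shown here is illustrative
}
# profile.ps1 - runs once on the first cold start of a worker; keep it fast
if ($env:MSI_SECRET -and (Get-Module -ListAvailable Az.Accounts)) {
    # Sign in with the function app's managed identity, if one is configured
    Connect-AzAccount -Identity
}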

What’s not included today?

  • PowerShell support for Linux in Azure Functions
  • Support for Durable Functions
  • “Live objects” for trigger objects
    • Today, all trigger objects except HTTP are more generic data types
  • Generic PowerShell Gallery support for managed dependencies
  • PowerShell cmdlets for managing Azure Functions (e.g. Az.Functions)

Awesome, how do I get started?

Start with our Azure Functions PowerShell quickstart to learn how to use Visual Studio Code and the Azure Functions Core Tools to create, test, debug, and deploy your first PowerShell function app. Watch the Azure Friday video to learn more about using PowerShell in Functions.

I’ve got a problem / Where do I give feedback?

We encourage you to give us feedback via our GitHub repository. Feel free to file and participate in issues for:

  • missing functionality (and why you value that functionality)
  • problems with the programming model (and how it affects your scenarios)
  • bugs in the worker
  • anything else you believe needs improving or fixing

When will this be reaching General Availability (GA)?

We don’t currently have an ETA for when PowerShell in Azure Functions will reach GA, but your feedback will help us drive our prioritization, so make sure to voice your opinions! Even if it’s simply to tell us “works great for me!”, that will help us reach our next milestones more quickly.

That’s it!

Thanks for being an early adopter! We’re really looking forward to seeing what kinds of automation you all build with serverless PowerShell!

Thanks,
Joey Aiello
PM, PowerShell Core



DSC Resource Kit Release May 2019


We just released the DSC Resource Kit! This release includes updates to 14 DSC resource modules. In the past 6 weeks, 87 pull requests have been merged and 36 issues have been closed, all thanks to our amazing community!

The modules updated in this release are:

  • ActiveDirectoryCSDsc
  • CertificateDsc
  • ComputerManagementDsc
  • NetworkingDsc
  • OfficeOnlineServerDsc
  • PSDscResources
  • SharePointDsc
  • SqlServerDsc
  • StorageDsc
  • xActiveDirectory
  • xDnsServer
  • xFirefox
  • xPSDesiredStateConfiguration
  • xWebAdministration

For a detailed list of the resource modules and fixes in this release, see the Included in this Release section below.

Our latest community call for the DSC Resource Kit was last Wednesday, May 8. A recording of the call is available here. You can join us for the next call at 12PM (Pacific time) on June 19 to ask questions and give feedback about your experience with the DSC Resource Kit.

The next DSC Resource Kit release will be on Wednesday, June 26.

We strongly encourage you to update to the newest version of all modules using the PowerShell Gallery, and don’t forget to give us your feedback in the comments below, on GitHub, or on Twitter (@PowerShell_Team)!

Please see our documentation here for information on the support of these resource modules.

Included in this Release

You can see a detailed summary of all changes included in this release in the table below. For past release notes, go to the README.md or CHANGELOG.md file on the GitHub repository page for a specific module (see the How to Find DSC Resource Modules on GitHub section below for details on finding the GitHub page for a specific module).

Module Name Version Release Notes
ActiveDirectoryCSDsc 3.3.0.0
  • Remove reference to StorageDsc in README.md – fixes Issue 76.
  • Combined all ActiveDirectoryCSDsc.ResourceHelper module functions into ActiveDirectoryCSDsc.Common module and renamed to ActiveDirectoryCSDsc.CommonHelper module.
  • Opted into Common Tests “Common Tests – Validate Localization” – fixes Issue 82.
CertificateDsc 4.6.0.0
  • CertReq:
    • Added Compare-CertificateIssuer function to checks if the Certificate Issuer matches the CA Root Name.
    • Changed Compare-CertificateSubject function to return false if ReferenceSubject is null.
    • Fixed exception when Certificate with empty Subject exists in Certificate Store – fixes Issue 190.
    • Fixed bug matching existing certificate when Subject Alternate Name is specified and machine language is not en-US – fixes Issue 193.
    • Fixed bug matching existing certificate when Template Name is specified and machine language is not en-US – fixes Issue 193.
    • Changed Import-CertificateEx function to use X509Certificate2Collection instead of X509Certificate2 to support importing certificate chains
ComputerManagementDsc 6.4.0.0
  • ScheduledTask:
    • IdleWaitTimeout returned from Get-TargetResource always null – Fixes Issue 186.
    • Added BuiltInAccount property to allow running the task as one of the built-in service accounts – Fixes Issue 130.
  • Refactored module folder structure to move resource to root folder of repository and remove test harness – fixes Issue 188.
  • Added a CODE_OF_CONDUCT.md with the same content as in the README.md and linked to it from README.MD instead.
  • Updated test header for all unit tests to version 1.2.4.
  • Updated test header for all integration tests to version 1.3.3.
  • Enabled example publish to PowerShell Gallery by adding gallery_api environment variable to AppVeyor.yml.
NetworkingDsc 7.2.0.0
  • NetAdapterAdvancedProperty:
    • Added support for RegistryKeyword MaxRxRing1Length and NumRxBuffersSmall – fixes Issue 387.
  • Firewall:
    • Prevent “Parameter set cannot be resolved using the specified named parameters” error when updating rule when group name is specified – fixes Issue 130 and Issue 191.
  • Opted into Common Tests “Common Tests – Validate Localization” – fixes Issue 393.
  • Combined all NetworkingDsc.ResourceHelper module functions into NetworkingDsc.Common module – fixes Issue 394.
  • Renamed all localization strings so that they are detected by “Common Tests – Validate Localization”.
  • Fixed issues with mismatched localization strings.
  • Updated all common functions with the latest versions from DSCResource.Template.
  • Fixed an issue with the helper function Test-IsNanoServer that prevented it from working. Because the helper function is not currently used, the issue was not caught until unit tests were added.
  • Corrected style violations in NetworkingDsc.Common.
OfficeOnlineServerDsc 1.4.0.0
  • OfficeOnlineServerInstall
    • Updated resource to make sure the Windows Environment variables are loaded into the PowerShell session;
  • OfficeOnlineServerMachine
    • Updated resource to make sure the Windows Environment variables are loaded into the PowerShell session;
  • Created LICENSE file to match the Microsoft Open Source Team standard.
PSDscResources 2.11.0.0
  • Fix Custom DSC Resource Kit PSSA Rule Failures
SharePointDsc 3.4.0.0
  • SPDistributedCacheClientSettings
    • Added 15 new SharePoint 2016 parameters.
  • SPFarm
    • Implemented Null check in Get method to prevent errors
    • Add support to provision Central Administration on HTTPS
  • SPInfoPathFormsServiceConfig
    • Added the AllowEventPropagation parameter.
  • SPInstall
    • Improved logging output
    • Updated blocked setup file check to prevent errors when BinaryDir is a CD-ROM drive or mounted ISO
  • SPInstallLanguagePack
    • Improved logging output
    • Updated blocked setup file check to prevent errors when BinaryDir is a CD-ROM drive or mounted ISO
  • SPInstallPrereqs
    • Improved logging output
    • Updated the check to unblock the setup file if it is blocked because it comes from a network location. This prevents an endless wait.
    • Added the ability to install from a UNC path by adding the server to the IE Local Intranet Zone. This prevents an endless wait caused by a security warning.
    • Fixed an issue that caused the resource test to fail even when the prerequisites had been installed successfully on Windows Server 2019
  • SPManagedMetadataServiceApp
    • Fixed issue where Get-TargetResource method throws an error when the service app proxy does not exist and no proxy name is specified.
  • SPProductUpdate
    • Improved logging output
    • Updated blocked setup file check to prevent errors when SetupFile is a CD-ROM drive or mounted ISO
  • SPSearchContentSource
    • Removed check that prevents configuring an incremental schedule when using continuous crawl.
  • SPSitePropertyBag
    • Fixed issue where properties were set on the wrong level.
  • SPSubscriptionSettingsServiceApp
    • Fixed issue where the service app proxy isn’t created when it wasn’t created during initial deployment.
  • SPTrustedRootAuthority
    • Added possibility to get certificate from file.
SqlServerDsc 12.5.0.0
  • Changes to SqlServerSecureConnection
    • Updated README and added example for SqlServerSecureConnection, instructing users to use the “SYSTEM” service account instead of “LocalSystem”.
  • Changes to SqlScript
    • Correctly passes $VerbosePreference to the helper function Invoke-SqlScript so that PRINT statements are output correctly when verbose output is requested, e.g. Start-DscConfiguration -Verbose.
    • Added en-US localization (issue 624).
    • Added additional unit tests for code coverage.
  • Changes to SqlScriptQuery
    • Correctly passes $VerbosePreference to the helper function Invoke-SqlScript so that PRINT statements are output correctly when verbose output is requested, e.g. Start-DscConfiguration -Verbose.
    • Added en-US localization.
    • Added additional unit tests for code coverage.
  • Changes to SqlSetup
    • Concatenated Robocopy localization strings (issue 694).
    • Made the error message more descriptive when the Set-TargetResource function calls the Test-TargetResource function to verify the desired state.
  • Changes to SqlWaitForAG
  • Changes to SqlServerPermission
  • Changes to SqlServerMemory
    • Added en-US localization (issue 617).
    • No longer will the resource set the MinMemory value if it was provided in a configuration that also set the Ensure parameter to “Absent” (issue 1329).
    • Refactored unit tests to simplify them and add slightly more code coverage.
  • Changes to SqlServerMaxDop
  • Changes to SqlRS
    • Reporting Services are restarted after changing settings, unless $SuppressRestart parameter is set (issue 1331). $SuppressRestart will also prevent Reporting Services restart after initialization.
    • Fixed one of the error handling to use localization, and made the error message more descriptive when the Set-TargetResource function calls the Test-TargetResource function to verify the desired state. This was done prior to adding full en-US localization.
    • Fixed (issue 1258). When initializing Reporting Services, there is no need to execute InitializeReportServer CIM method, since executing SetDatabaseConnection CIM method initializes Reporting Services.
    • SqlRS can now initialize SSRS 2017 instances (issue 864).
  • Changes to SqlServerLogin
    • Added en-US localization (issue 615).
    • Added unit tests to improve code coverage.
  • Changes to SqlWindowsFirewall
  • Changes to SqlServerEndpoint
  • Changes to SqlServerEndpointPermission
  • Changes to SqlServerEndpointState
  • Changes to SqlDatabaseRole
  • Changes to SqlDatabaseRecoveryModel
  • Changes to SqlDatabasePermission
  • Changes to SqlDatabaseOwner
  • Changes to SqlDatabase
  • Changes to SqlAGListener
  • Changes to SqlAlwaysOnService
  • Changes to SqlAlias
    • Added en-US localization (issue 602).
    • Removed ShouldProcess for the code, since it has no purpose in a DSC resource (issue 242).
  • Changes to SqlServerReplication
    • Added en-US localization (issue 620).
    • Refactored Get-TargetResource slightly so it provides better verbose messages.
StorageDsc 4.7.0.0
  • DiskAccessPath:
    • Added a Get-Partition to properly handle setting the NoDefaultDriveLetter parameter – fixes Issue 198.
xActiveDirectory 2.26.0.0
  • Changes to xActiveDirectory
    • Added localization module DscResource.LocalizationHelper containing the helper functions Get-LocalizedData, New-InvalidArgumentException, New-InvalidOperationException, New-ObjectNotFoundException, and New-InvalidResultException (issue 257). For more information around these helper functions and localization in resources, see the Localization section in the Style Guideline.
    • Added common module DscResource.Common containing the helper function Test-DscParameterState. The goal is that all resource common functions are moved to this module (functions that are or can be used by more than one resource) (issue 257).
    • Added xADManagedServiceAccount resource to manage Managed Service Accounts (MSAs). Andrew Wickham (@awickham10) and @kungfu71186
    • Removing the Misc Folder, as it is no longer required.
    • Added xADKDSKey resource to create KDS Root Keys for gMSAs. @kungfu71186
    • Combined DscResource.LocalizationHelper and DscResource.Common Modules into xActiveDirectory.Common
  • Changes to xADReplicationSiteLink
    • Make use of the new localization helper functions.
  • Changes to xAdDomainController
    • Added new parameter to disable or enable the Global Catalog (GC) (issue 75). Eric Foskett @Merto410
    • Fixed a bug with the parameter InstallationMediaPath that it would not be added if it was specified in a configuration. Now the parameter InstallationMediaPath is correctly passed to Install-ADDSDomainController.
    • Refactored the resource with major code cleanup and localization.
    • Updated unit tests to latest unit test template and refactored the tests for the function “Set-TargetResource”.
    • Improved test code coverage.
  • Changes to xADComputer
    • Restoring a computer account from the recycle bin no longer fails if there is more than one object with the same name in the recycle bin. Now it uses the object that was changed last using the property whenChanged (issue 271).
  • Changes to xADGroup
    • Restoring a group from the recycle bin no longer fails if there is more than one object with the same name in the recycle bin. Now it uses the object that was changed last using the property whenChanged (issue 271).
  • Changes to xADOrganizationalUnit
    • Restoring an organizational unit from the recycle bin no longer fails if there is more than one object with the same name in the recycle bin. Now it uses the object that was changed last using the property whenChanged (issue 271).
  • Changes to xADUser
    • Restoring a user from the recycle bin no longer fails if there is more than one object with the same name in the recycle bin. Now it uses the object that was changed last using the property whenChanged (issue 271).
xDnsServer 1.12.0.0
  • Update appveyor.yml to use the default template.
  • Added default template files .codecov.yml, .gitattributes, and .gitignore, and .vscode folder.
  • Added UseRootHint property to xDnsServerForwarder resource.
xFirefox 1.3.0.0
  • Update appveyor.yml to use the default template.
  • Added default template files .codecov.yml, .gitattributes, and .gitignore, and the .vscode folder.
  • The module manifest now contains the correct PowerShell version.
  • Added xFirefoxPreference Resource to automate Firefox Preference Configuration
xPSDesiredStateConfiguration 8.7.0.0
  • MSFT_xWindowsProcess:
    • Fixes issue where a process will fail to be created if a $Path is passed that contains one or more spaces, and the resource is using $Credentials.
    • Fixes issue where a process will fail to be created if $Arguments are passed that contain one or more spaces (with or without credentials).
    • Fixes issue where Integration tests fail if empty Arguments are passed. issue 605
    • Heavily refactors MSFT_xWindowsProcess.Integration.Tests.ps1 and adds more Path and Arguments related test cases.
    • Removes reliance on test file WindowsProcessTestProcess.
  • Fixes test failures in xWindowsOptionalFeatureSet.Integration.Tests.ps1 due to accessing the windowsOptionalFeatureName variable before it is assigned. issue 612
  • MSFT_xDSCWebService
    • Fixes issue 536 and starts the deprecation process for configuring a windows firewall (exception) rule using xDSCWebService
    • Fixes issue 463 and fixes some bugs introduced with the new firewall rule handling
xWebAdministration 2.6.0.0
  • Changed order of classes in schema.mof files to work around issue 423
  • Fixed subject comparison with multiple entries in the helper function Find-Certificate, and fixed the test that could not find the test helper script Install-NewSelfSignedCertificateExScript.
  • Updated unit test for helper function Find-Certificate to check for multiple subject names in different orders.

How to Find Released DSC Resource Modules

To see a list of all released DSC Resource Kit modules, go to the PowerShell Gallery and display all modules tagged as DSCResourceKit. You can also enter a module’s name in the search box in the upper right corner of the PowerShell Gallery to find a specific module.

Of course, you can also always use PowerShellGet (available starting in WMF 5.0) to find modules with DSC Resources:

# To list all modules tagged as DSCResourceKit
Find-Module -Tag DSCResourceKit
# To list all DSC resources from all sources
Find-DscResource

Please note only those modules released by the PowerShell Team are currently considered part of the ‘DSC Resource Kit’ regardless of the presence of the ‘DSC Resource Kit’ tag in the PowerShell Gallery.

To find a specific module, go directly to its URL on the PowerShell Gallery:
http://www.powershellgallery.com/packages/< module name >
For example:
http://www.powershellgallery.com/packages/xWebAdministration

How to Install DSC Resource Modules From the PowerShell Gallery

We recommend that you use PowerShellGet to install DSC resource modules:

Install-Module -Name < module name >

For example:

Install-Module -Name xWebAdministration

To update all previously installed modules at once, open an elevated PowerShell prompt and use this command:

Update-Module

After installing modules, you can discover all DSC resources available to your local system with this command:

Get-DscResource

How to Find DSC Resource Modules on GitHub

All resource modules in the DSC Resource Kit are available open-source on GitHub.
You can see the most recent state of a resource module by visiting its GitHub page at:
https://github.com/PowerShell/< module name >
For example, for the CertificateDsc module, go to:
https://github.com/PowerShell/CertificateDsc.

All DSC modules are also listed as submodules of the DscResources repository in the DscResources folder and the xDscResources folder.

How to Contribute

You are more than welcome to contribute to the development of the DSC Resource Kit! There are several different ways you can help. You can create new DSC resources or modules, add test automation, improve documentation, fix existing issues, or open new ones.
See our contributing guide for more info on how to become a DSC Resource Kit contributor.

If you would like to help, please take a look at the list of open issues for the DscResources repository.
You can also check issues for specific resource modules by going to:
https://github.com/PowerShell/< module name >/issues
For example:
https://github.com/PowerShell/xPSDesiredStateConfiguration/issues

Your help in developing the DSC Resource Kit is invaluable to us!

Questions, comments?

If you’re looking into using PowerShell DSC, have questions or issues with a current resource, or would like a new resource, let us know in the comments below, on Twitter (@PowerShell_Team), or by creating an issue on GitHub.

Katie Kragenbrink
Software Engineer
PowerShell DSC Team
@katiedsc (Twitter)
@kwirkykat (GitHub)


PowerShell 7 Road Map


Last month we announced that PowerShell 7 will be the next release of PowerShell.

Here I will provide more details of areas we’ll be investing in for the PowerShell 7 release.

When will I get it?!

Today, we’re releasing our first preview of PowerShell 7. Keeping with our monthly cadence, expect new preview releases approximately every month.

This first preview contains some of the changes that didn’t make it in time for the 6.2 GA release, and marks our move to .NET Core 3.0. For more details on what’s new, check out our changelog on GitHub.

As mentioned in the PowerShell 7 announcement blog, we will be changing the support life-cycle to align with .NET Core. This means that we expect PowerShell 7 to be generally available (GA) about a month after .NET Core 3.0 GA.

.NET Core 3.0

The biggest immediate change is moving to .NET Core 3.0 (from .NET Core 2.1). Not only are there significant performance improvements, but many new APIs are available including WPF and WinForms (Windows only, though!).

This means that (eventually) we can bring back Out-GridView.

Windows Compatibility

A big focus of PowerShell 7 is making it a viable replacement for Windows PowerShell 5.1. This means it must have near parity with Windows PowerShell in terms of compatibility with modules that ship with Windows.

The PowerShell Team will be working with Windows teams to validate and update their modules to work with PowerShell 7. This also means that to use PowerShell 7 with the breadth of Windows PowerShell modules, you will need to be using the latest builds of Windows 10 (and equivalent Windows Server).

Feature Investigations

We are looking at investing in three specific feature areas. Expect RFCs (Request for Comments specifications) to be published on how we intend to implement these features and the scope of the problems we intend to solve. Feedback is greatly appreciated!

Simplify Secure Credentials Management

Whether you are using PowerShell to automate resources in the cloud, on-premises resources, or a hybrid of both, you will generally need different credentials to access different resources.

The best practice is to never put credentials within your script, so we intend to introduce a way to securely use credentials from a local or remote credential store.
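
Until that capability ships, a common interim pattern on Windows is to persist a credential with Export-Clixml, which protects the password with DPAPI for the current user and machine. A minimal sketch, with an illustrative file path and server name:

# One-time setup: capture and store a credential (the password is DPAPI-protected
# for this user on this machine, so the file is useless anywhere else)
Get-Credential | Export-Clixml -Path "$HOME\deploy-cred.xml"
# In a script: rehydrate the credential instead of hard-coding a password
$cred = Import-Clixml -Path "$HOME\deploy-cred.xml"
Invoke-Command -ComputerName server01 -Credential $cred -ScriptBlock {
    Get-Service -Name Spooler
}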

Logging Off the Box

Part of the security of PowerShell is that it can log everything. However, logging today is purely local to the machine. For each OS, there are ways to forward those events to a remote system, but that requires different configuration per OS.

We want to introduce a way to easily configure PowerShell through policy to automatically send the logs to a remote target regardless of the OS.

New Version Notification

Looking at our Power BI dashboard, we see a large number of instances running older versions (some of which are no longer supported). It is important to inform our customers when a newer version is available that may contain security fixes, so we need a way to politely notify the user that a newer version is available.

There is already an RFC published for this feature. Please take a look and provide us feedback!

GitHub Issues

We have a number of GitHub issues marked for consideration for PowerShell 7. Although we would like to fix everything, we have limited resources, so not every issue marked for consideration will be fixed.

Popular Requested Features

There are a number of requested features we’d like to address in PowerShell 7. Some of these may show up as experimental features so that we can get feedback before we lock in the design.

This list is larger than what we will be able to fix, but these are the ones we’d like to investigate:

Other Investments

My team will also be involved in some other related investments during the time frame of working on PowerShell 7.

PowerShell in Azure Functions to general availability

A few weeks ago, Joey Aiello announced the Public Preview of PowerShell in Azure Functions 2.0. As we get feedback, my team will continue to work with the Azure Functions team to address that feedback and eventually move from public preview to being generally available.

PSReadLine 2.0

Jason Shirk has done a great job with PSReadLine. As part of Windows PowerShell 5, we decided to make it the default interactive shell experience. Jason has made many improvements as a side project, but the project is bigger than one person. We’ve agreed to move the PSReadLine project to the PowerShell team, where we can dedicate some resources to getting the 2.0.0 release to GA.

PowerShell Editor Services / Visual Studio Code PowerShell extension

We will continue to make progress on PowerShell Editor Services 2.0 release improving reliability, performance, and PSReadLine integration.

PSScriptAnalyzer

As part of improving performance for PSEditorServices, we need to make PSScriptAnalyzer 2.0 host-able so that PS Editor Services can simply call an API rather than calling PSScriptAnalyzer using a PowerShell runspace.

Summary

As you can see, we have a ton of work ahead of us. Not everything will make it in the same time frame of PowerShell 7 and some of the popular requested features may have to wait for PowerShell 7.1, but the more feedback you give us, the more we can be sure we’re doing the right thing to help you succeed.

Thanks!

Steve Lee
Principal Software Engineering Manager
PowerShell Team
https://twitter.com/Steve_MSFT


Announcing General Availability of the Windows Compatibility Module 1.0.0


The Windows Compatibility module (WindowsCompatibility) is a PowerShell module that lets PowerShell Core 6 scripts access Windows PowerShell modules that are not yet natively available on PowerShell Core. (Note: the list of unavailable commands is getting smaller with each new release of PowerShell Core. This module is just for the things that aren’t natively supported yet.)

You can install the module from the PowerShell Gallery using the command

Install-Module WindowsCompatibility

and the source code is available on GitHub. (This is where you should open issues or make suggestions.)

Once you have WindowsCompatibility installed, you can start using it. The first thing you might want to run is Get-WinModule, which will show you the list of available modules. From that list, choose a module, say PKI, and load it. To do this, run the following command:

Import-WinModule PKI

and you’ll have the commands exported by the PKI module in your local session. You can run them just like any other command. For example:

New-SelfSignedCertificate -DnsName localhost

As always, you can see what commands a module exported by running:

Get-Command -module PKI

just like any other module.

These are the most important commands but the WindowsCompatibility module provides some others:

  • Invoke-WinCommand allows you to invoke a one-time command in the compatibility session.
  • Add-WinFunction allows you to define new functions that operate implicitly in the compatibility session.
  • Compare-WinModule lets you compare what you have against what’s available.
  • Copy-WinModule will let you copy Windows PowerShell modules that are known to work in PowerShell 6 to the PowerShell 6 command path.
  • Initialize-WinSession gives you more control over where and how the compatibility session is created. For example, it will allow you to place the compatibility session on another machine.

(See the module’s command help for more details and examples on how to use the WindowsCompatibility functions.)
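
As a rough illustration of how a couple of these fit together, the following sketch wraps a Windows PowerShell-only cmdlet in a local function and then runs a one-off command in the compatibility session. The parameter names are assumptions based on the module’s help, so verify them with Get-Help; Get-EventLog is used only because it is a conveniently Windows PowerShell-only cmdlet:

# Define a local function whose body runs implicitly in the compatibility session
Add-WinFunction -Name Get-SystemEventLog -ScriptBlock {
    param($Newest = 10)
    Get-EventLog -LogName System -Newest $Newest
}
Get-SystemEventLog -Newest 5
# Run a one-off command in the compatibility session without defining a function
Invoke-WinCommand -ScriptBlock { $PSVersionTable.PSVersion }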

How It Works

The WindowsCompatibility module takes advantage of the ‘Implicit Remoting‘ feature that has been available in PowerShell since version 2. Implicit remoting works by retrieving command metadata from a remote session and synthesizing proxy functions in the local session. When you call one of these proxy functions, it takes all of the parameters passed to it and forwards them to the real command in the “remote” session. Wait a minute, you may be thinking – what does remoting have to do with the WindowsCompatibility module? WindowsCompatibility automatically creates and manages a ‘local remote’ session, called the ‘compatibility session’, that runs Windows PowerShell on the local machine. It imports the specified module and then creates local proxy functions for all of the commands defined in that module.
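
For context, this is roughly what implicit remoting looks like when done by hand with the built-in cmdlets; WindowsCompatibility automates the equivalent against a Windows PowerShell endpoint on the local machine. The sketch assumes PowerShell remoting to localhost is enabled:

# Create a session and generate local proxy functions for one module's commands
$session = New-PSSession -ComputerName localhost
Import-PSSession -Session $session -Module Microsoft.PowerShell.Management -Prefix Remote -AllowClobber
# The generated proxy forwards its parameters to the real cmdlet running in $session
Get-RemoteChildItem -Path C:\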

OK – what about modules that exist in both Windows PowerShell and PowerShell Core? Yes – you can import them. After all, there are still a fair number of base cmdlets that aren’t available in PowerShell Core yet.

So how does this work? WindowsCompatibility is very careful not to overwrite native PowerShell Core commands. It only imports the ones that are available in Windows PowerShell but not in PowerShell Core. For example, the following will import the default PowerShell management module:

Import-WinModule Microsoft.PowerShell.Management

which contains, among others, the Get-EventLog cmdlet. None of the native PowerShell Core cmdlets get overwritten but now you have Get-EventLog available in your session.

At this point, if you call Get-Module, you will see something a bit strange:

Get-Module | ForEach-Object Name

results in output that looks like:

Microsoft.PowerShell.Management
Microsoft.PowerShell.Management.WinModule
Microsoft.PowerShell.Utility
NetTCPIP

Import-WinModule renames the compatibility module at load time to prevent collisions with identically named modules, so module-qualified commands will resolve against the current module. In fact, if you want to see which additional commands were imported, you can run:

Get-Command -Module  Microsoft.PowerShell.Management.WinModule

Limitations

Because WindowsCompatibility is based on implicit remoting, there are a number of significant limitations on the cmdlets imported by the module. First, because everything is done using the remoting protocol, the imported cmdlets will return deserialized objects that only contain properties. Much of the time, this won’t matter because the parameter binder binds by property name rather than by object type. As long as the required properties are present on the object, it doesn’t matter what type the object actually is. There are, however, cases where the cmdlet actually requires that the object be of a specific type or that it have methods. WindowsCompatibility won’t work for these cmdlets.
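
A quick way to see this limitation is to compare a live object with the same object after a remoting round trip. This sketch assumes PowerShell remoting to localhost is enabled:

# Live object: real type, methods available
$live = Get-Process -Id $PID
$live.GetType().FullName    # System.Diagnostics.Process
$live.Refresh()             # methods work on the live object
# After remoting: a property-only snapshot of the object
$remote = Invoke-Command -ComputerName localhost { Get-Process -Id $PID }
$remote.GetType().FullName  # Deserialized.System.Diagnostics.Process
$remote.Refresh()           # fails - the deserialized snapshot has no Refresh() method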

Windows Forms and other graphical tools

The remoting session is considered non-interactive, so graphical tools such as Notepad or WinForms scripts will either fail or, worse, hang.

Linux and Mac support

This module depends on WinRM and the client libraries on these platforms are known to be unstable and limited. So for this release, only PowerShell Core running on Windows is supported. (This may change in the future. But you’ll still need a Windows machine with Windows PowerShell to host the compatibility session.)

PowerShell 6.1 Dependency

WindowsCompatibility depends on a feature introduced in PowerShell Core 6.1 that keeps the current working directory in the local and compatibility sessions synchronized. Earlier versions of PowerShell will work with WindowsCompatibility but won’t have this directory synchronization. So if you’re running PowerShell Core 6.0 and you import a command that writes to files, run Set-Location to change to a new directory, and then use that command to write to a file with an unqualified path, it will use the path from when the module was imported rather than your session’s current working directory. On PowerShell Core 6.1 and later, it will correctly use the current working directory.

Summary

To sum it all up, the WindowsCompatibility module provides a set of commands that allow you to access Windows PowerShell modules from PowerShell Core 6. There are, however, some limitations that make it unsuitable for all scenarios. Over time, as more and more modules are ported to .NET Core/PowerShell 6 natively, there will be less need for this module.

Cheers!
Bruce Payette,
PowerShell Team.

PowerShell Constrained Language mode and the Dot-Source Operator


PowerShell works with application control systems, such as AppLocker and Windows Defender Application Control (WDAC), by automatically running in
ConstrainedLanguage mode. ConstrainedLanguage mode restricts some exploitable aspects of PowerShell while still giving you a rich shell to run commands and scripts in. This is different from the usual application whitelisting rules, where an application is either allowed to run or not.

But there are times when the full power of PowerShell is needed, so we allow script files to run in FullLanguage mode when they are trusted by the policy. Trust can be indicated through file signing or other policy mechanisms such as file hash. However, script typed into the interactive shell is always run constrained.

Since PowerShell can run script in both Full and Constrained language modes, we need to protect the boundary between them. We don’t want to leak variables or functions between sessions running in different language modes.

The PowerShell dot-source operator brings script files into the current session scope. It is a way to reuse script. All script functions and variables defined in the script file become part of the script it is dot sourced into. It is like copying and pasting text from the script file directly into your script.

# HelperFn1, HelperFn2 are defined in HelperFunctions.ps1
# Dot-source the file here to get access to them (no need to copy/paste)
. c:\Scripts\HelperFunctions.ps1
HelperFn1
HelperFn2

This presents a problem when language modes are in effect with system application control. If an untrusted script is dot-sourced into a script with full trust then it has access to all those functions that run in FullLanguage mode, which can result in application control bypass through arbitrary code execution or privilege escalation. Consequently, PowerShell prevents this by throwing an error when dot-sourcing is attempted across language modes.

Example 1:

System is in WDAC policy lock down. To start with, neither script is trusted and so both run in ConstrainedLanguage mode. But the HelperFn1 function uses method invocation which isn’t allowed in that mode.

PS> type c:\MyScript.ps1
Write-Output "Dot sourcing MyHelper.ps1 script file"
. c:\MyHelper.ps1
HelperFn1
PS> type c:\MyHelper.ps1
function HelperFn1
{
    "Language mode: $($ExecutionContext.SessionState.LanguageMode)"
    [System.Console]::WriteLine("This can only run in FullLanguage mode!")
}
PS> c:\MyScript.ps1
Dot sourcing MyHelper.ps1 script file
Language mode: ConstrainedLanguage
Cannot invoke method. Method invocation is supported only on core types in this language mode.
At C:\MyHelper.ps1:4 char:5
+     [System.Console]::WriteLine("This cannot run in ConstrainedLangua ...
+     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: (:) [], RuntimeException
    + FullyQualifiedErrorId : MethodInvocationNotSupportedInConstrainedLanguage

Both scripts are untrusted and run in ConstrainedLanguage mode, so dot-sourcing the MyHelper.ps1 file works. However, the HelperFn1 function performs method invocation that is not allowed in ConstrainedLanguage and fails when run. MyHelper.ps1 needs to be signed as trusted so it can run at FullLanguage.

Next we have mixed language modes. MyHelper.ps1 is signed and trusted, but MyScript.ps1 is not.

PS> c:\MyScript.ps1
Dot sourcing MyHelper.ps1 script file
C:\MyHelper.ps1 : Cannot dot-source this command because it was defined in a different language mode. To invoke this command without importing its contents, omit the '.' operator.
At C:\MyScript.ps1:2 char:1
+ . 'c:\MyHelper.ps1'
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: (:) [MyHelper.ps1], NotSupportedException
    + FullyQualifiedErrorId : DotSourceNotSupported,MyHelper.ps1
...

And we get a dot-source error because we are trying to dot-source script that has a different language mode than the session it is being dot-sourced into.

Finally, we sign both script files as trusted and everything works.

PS> c:\MyScript.ps1
Dot sourcing MyHelper.ps1 script file
Language mode: FullLanguage
This can only run in FullLanguage mode!

The lesson here is to ensure all script components run in the same language mode on policy locked down systems. If one component must run in FullLanguage mode, then all components should run in FullLanguage mode. This means validating that each component is safe to run in FullLanguage and indicating they are trusted to the application control policy.

So this solves all language mode problems, right? If FullLanguage is not needed, then just ensure all script components run untrusted, which is the default condition. If they require FullLanguage, then carefully validate all components and mark them as trusted. Unfortunately, there is one case where this best practice doesn’t work.

PowerShell Profile File

The PowerShell profile file (profile.ps1) is loaded and run at PowerShell start up. If that script requires FullLanguage mode on policy lock down systems, you just validate and sign the file as trusted, right?

Example 2:

PS> type c:\users\<user>\Documents\WindowsPowerShell\profile.ps1
Write-Output "Running Profile"
[System.Console]::WriteLine("This can only run in FullLanguage!")
# Sign file so it is trusted and will run in FullLanguage mode
PS> Set-AuthenticodeSignature -FilePath .\Profile.ps1 -Certificate $myPolicyCert
# Start a new PowerShell session and run the profile script
PS> powershell.exe
Windows PowerShell
Copyright (C) Microsoft Corporation. All rights reserved.
C:\Users\<user>\Documents\WindowsPowerShell\profile.ps1 : Cannot dot-source this command because it was defined in a different language mode. To invoke this command without importing its contents, omit the '.' operator.
At line:1 char:1
+ . 'C:\Users\<user>\Documents\WindowsPowerShell\profile.ps1'
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: (:) [profile.ps1], NotSupportedException
    + FullyQualifiedErrorId : DotSourceNotSupported,profile.ps1

What gives? The profile.ps1 file was signed and is policy trusted. Why the error?
Well, the issue is that PowerShell dot-sources the profile.ps1 file into the default PowerShell session, which must run in ConstrainedLanguage because of the policy. So we are attempting to dot-source a FullLanguage script into a ConstrainedLanguage session, and that is not allowed. This is a catch-22: if profile.ps1 is not signed, it may not run if it needs FullLanguage privileges (e.g., to invoke methods); but if you sign it, it still won’t run because of how it is dot-sourced into the current ConstrainedLanguage interactive session.

Unfortunately, the only solution is to keep the profile.ps1 file fairly simple so that it does not need FullLanguage, and refrain from making it trusted. Keep in mind that this is only an issue when running with application control policy. Otherwise, language modes do not come into play and PowerShell profile files run normally.

Paul Higinbotham
Senior Software Engineer
PowerShell Team

DSC Resource Kit Release November 2018


We just released the DSC Resource Kit!

This release includes updates to 9 DSC resource modules. In the past 6 weeks, 61 pull requests have been merged and 67 issues have been closed, all thanks to our amazing community!

The modules updated in this release are:

  • AuditPolicyDsc
  • DFSDsc
  • NetworkingDsc
  • SecurityPolicyDsc
  • SharePointDsc
  • StorageDsc
  • xBitlocker
  • xExchange
  • xHyper-V

For a detailed list of the resource modules and fixes in this release, see the Included in this Release section below.

Our latest community call for the DSC Resource Kit was supposed to be today, November 28, but the public link to the call expired, so the call was cancelled. I will update the link for next time. If there is interest in rescheduling this call, the new call time will be announced on Twitter (@katiedsc or @migreene). The call for the next release cycle is also being moved a week later than usual, to January 9 at 12PM (Pacific standard time). Join us to ask questions and give feedback about your experience with the DSC Resource Kit.

The next DSC Resource Kit release will be on Wednesday, January 9.

We strongly encourage you to update to the newest version of all modules using the PowerShell Gallery, and don’t forget to give us your feedback in the comments below, on GitHub, or on Twitter (@PowerShell_Team)!

Please see our documentation here for information on the support of these resource modules.

Included in this Release

You can see a detailed summary of all changes included in this release in the table below. For past release notes, go to the README.md or CHANGELOG.md file on the GitHub repository page for a specific module (see the How to Find DSC Resource Modules on GitHub section below for details on finding the GitHub page for a specific module).

Module Name Version Release Notes
AuditPolicyDsc 1.3.0.0
  • Update LICENSE file to match the Microsoft Open Source Team standard.
  • Added the AuditPolicyGuid resource.
DFSDsc 4.2.0.0
  • Add support for modifying staging quota size in MSFT_DFSReplicationGroupMembership – fixes Issue 77.
  • Refactored module folder structure to move resource to root folder of repository and remove test harness – fixes Issue 74.
  • Updated Examples to support deployment to PowerShell Gallery scripts.
  • Remove exclusion of all tags in appveyor.yml, so all common tests can be run if opted in.
  • Added .VSCode settings for applying DSC PSSA rules – fixes Issue 75.
  • Updated LICENSE file to match the Microsoft Open Source Team standard – fixes Issue 79
NetworkingDsc 6.2.0.0
  • Added .VSCode settings for applying DSC PSSA rules – fixes Issue 357.
  • Updated LICENSE file to match the Microsoft Open Source Team standard – fixes Issue 363
  • MSFT_NetIPInterface:
    • Added a new resource for configuring the IP interface settings for a network interface.
SecurityPolicyDsc 2.6.0.0
  • Added SecurityOption – Network_access_Restrict_clients_allowed_to_make_remote_calls_to_SAM
  • Bug fix – Issue 105 – Spelling error in SecurityOption “User_Account_Control_Behavior_of_the_elevation_prompt_for_standard_users”
  • Bug fix – Issue 90 – Corrected value for Microsoft_network_server_Server_SPN_target_name_validation_level policy
SharePointDsc 3.0.0.0
  • Changes to SharePointDsc
    • Added support for SharePoint 2019
    • Added CredSSP requirement to the Readme files
    • Added VSCode Support for running SharePoint 2019 unit tests
    • Removed the deprecated resources SPCreateFarm and SPJoinFarm (replaced in v2.0 by SPFarm)
  • SPBlobCacheSettings
    • Updated the Service Instance retrieval to be language independent
  • SPConfigWizard
    • Fixed check for Ensure=Absent in the Set method
  • SPInstallPrereqs
    • Added support for detecting updated installation of Microsoft Visual C++ 2015/2017 Redistributable (x64) for SharePoint 2016 and SharePoint 2019.
  • SPSearchContentSource
    • Added support for Business Content Source Type
  • SPSearchMetadataCategory
    • New resource added
  • SPSearchServiceApp
    • Updated resource to make sure the presence of the service app proxy is checked and created if it does not exist
  • SPSecurityTokenServiceConfig
    • The resource only tested for the Ensure parameter. Added more parameters
  • SPServiceAppSecurity
    • Added support for specifying array of access levels.
    • Changed implementation to use Grant-SPObjectSecurity with Replace switch instead of using a combination of Revoke-SPObjectSecurity and Grant-SPObjectSecurity
    • Added all supported access levels as available values.
    • Removed unknown access levels: Change Permissions, Write, and Read
  • SPUserProfileProperty
    • Removed obsolete parameters (MappingConnectionName, MappingPropertyName, MappingDirection) and introduced new parameter PropertyMappings
  • SPUserProfileServiceApp
    • Updated the check for successful creation of the service app to throw an error if this is not done correctly. The following changes will break v2.x and earlier configurations that use these resources:
  • Implemented IsSingleInstance parameter to force that the resource can only be used once in a configuration for the following resources:
    • SPAntivirusSettings
    • SPConfigWizard
    • SPDiagnosticLoggingSettings
    • SPFarm
    • SPFarmAdministrators
    • SPInfoPathFormsServiceConfig
    • SPInstall
    • SPInstallPrereqs
    • SPIrmSettings
    • SPMinRoleCompliance
    • SPPasswordChangeSettings
    • SPProjectServerLicense
    • SPSecurityTokenServiceConfig
    • SPShellAdmin
  • Standardized Url/WebApplication parameter to default WebAppUrl parameter for the following resources:
    • SPDesignerSettings
    • SPFarmSolution
    • SPSelfServiceSiteCreation
    • SPWebAppBlockedFileTypes
    • SPWebAppClientCallableSettings
    • SPWebAppGeneralSettings
    • SPWebApplication
    • SPWebApplicationAppDomain
    • SPWebAppSiteUseAndDeletion
    • SPWebAppThrottlingSettings
    • SPWebAppWorkflowSettings
  • Introduced new mandatory parameters
    • SPSearchResultSource: Added option to create Result Sources at different scopes.
    • SPServiceAppSecurity: Changed parameter AccessLevel to AccessLevels in MSFT_SPServiceAppSecurityEntry to support array of access levels.
    • SPUserProfileProperty: New parameter PropertyMappings
SharePointDsc 3.1.0.0
  • Changes to SharePointDsc
    • Updated LICENSE file to match the Microsoft Open Source Team standard.
  • ProjectServerConnector
    • Added a file hash validation check to prevent the ability to load custom code into the module.
  • SPFarm
    • Fixed localization issue where TypeName was in the local language.
  • SPInstallPrereqs
    • Updated links in the Readme.md file to docs.microsoft.com.
    • Fixed required prereqs for SharePoint 2019, added MSVCRT11.
  • SPManagedMetadataServiceApp
    • Fixed issue where Get-TargetResource method throws an error when the service app proxy does not exist.
  • SPSearchContentSource
    • Corrected issue where the New-SPEnterpriseSearchCrawlContentSource cmdlet was called twice.
  • SPSearchServiceApp
    • Fixed issue where Get-TargetResource method throws an error when the service application pool does not exist.
    • Implemented check to make sure cmdlets are only executed when it actually has something to update.
    • Deprecated WindowsServiceAccount parameter and moved functionality to new resource (SPSearchServiceSettings).
  • SPSearchServiceSettings
    • Added new resource to configure search service settings.
  • SPServiceAppSecurity
    • Fixed unavailable utility method (ExpandAccessLevel).
    • Updated the schema to no longer specify username as key for the sub class.
  • SPUserProfileServiceApp
    • Fixed issue where localized versions of Windows and SharePoint would throw an error.
  • SPUserProfileSyncConnection
    • Corrected implementation of Ensure parameter.
StorageDsc 4.3.0.0
  • WaitForDisk:
    • Added read-only property IsAvailable, which shows the current state of the disk as a boolean – fixes Issue 158.
xBitlocker 1.3.0.0
  • Update appveyor.yml to use the default template.
  • Added default template files .gitattributes, and .vscode settings.
  • Fixes most PSScriptAnalyzer issues.
  • Fix issue where AutoUnlock is not set if requested, if the disk was originally encrypted and AutoUnlock was not used.
  • Add remaining Unit Tests for xBitlockerCommon.
  • Add Unit tests for MSFT_xBLTpm
  • Add remaining Unit Tests for xBLAutoBitlocker
  • Add Unit tests for MSFT_xBLBitlocker
  • Moved change log to CHANGELOG.md file
  • Fixed Markdown validation warnings in README.md
  • Added .MetaTestOptIn.json file to root of module
  • Add Integration Tests for module resources
  • Rename functions with improper Verb-Noun constructs
  • Add comment based help to any functions without it
  • Update Schema.mof Description fields
  • Fixes issue where Switch parameters are passed to Enable-Bitlocker even if the corresponding DSC resource parameter was set to False (Issue 12)
xExchange 1.25.0.0
  • Opt-in for the common test flagged Script Analyzer rules (issue 234).
  • Opt-in for the common test testing for relative path length.
  • Removed the property PSDscAllowPlainTextPassword from all examples so the examples are secure by default. The property PSDscAllowPlainTextPassword was previously needed to (test) compile the examples in the CI pipeline, but now the CI pipeline is using a certificate to compile the examples.
  • Opt-in for the common test that validates the markdown links.
  • Fix typo of the word “Certificate” in several example files.
  • Add spaces between array members.
  • Add initial set of Unit Tests (mostly Get-TargetResource tests) for all remaining resource files.
  • Add WaitForComputerObject parameter to xExchWaitForDAG
  • Add spaces between comment hashtags and comments.
  • Add space between variable types and variables.
  • Fixes issue where xExchMailboxDatabase fails to test for a Journal Recipient because the module did not load the Get-Recipient cmdlet (335).
  • Fixes broken Integration tests in MSFT_xExchMaintenanceMode.Integration.Tests.ps1 (336).
  • Fix issue where Get-ReceiveConnector against an Absent connector causes an error to be logged in the MSExchange Management log.
  • Rename poorly named functions in xExchangeDiskPart.psm1 and MSFT_xExchAutoMountPoint.psm1, and add comment based help.
xHyper-V 3.14.0.0
  • MSFT_xVMHost:
    • Added support to Enable / Disable VM Live Migration. Fixes Issue 155.

How to Find Released DSC Resource Modules

To see a list of all released DSC Resource Kit modules, go to the PowerShell Gallery and display all modules tagged as DSCResourceKit. You can also enter a module’s name in the search box in the upper right corner of the PowerShell Gallery to find a specific module.

Of course, you can also always use PowerShellGet (available starting in WMF 5.0) to find modules with DSC Resources:

# To list all modules tagged as DSCResourceKit
Find-Module -Tag DSCResourceKit 
# To list all DSC resources from all sources 
Find-DscResource

Please note only those modules released by the PowerShell Team are currently considered part of the ‘DSC Resource Kit’ regardless of the presence of the ‘DSC Resource Kit’ tag in the PowerShell Gallery.

To find a specific module, go directly to its URL on the PowerShell Gallery:
http://www.powershellgallery.com/packages/< module name >
For example:
http://www.powershellgallery.com/packages/xWebAdministration

How to Install DSC Resource Modules From the PowerShell Gallery

We recommend that you use PowerShellGet to install DSC resource modules:

Install-Module -Name < module name >

For example:

Install-Module -Name xWebAdministration

To update all previously installed modules at once, open an elevated PowerShell prompt and use this command:

Update-Module

After installing modules, you can discover all DSC resources available to your local system with this command:

Get-DscResource

How to Find DSC Resource Modules on GitHub

All resource modules in the DSC Resource Kit are available open-source on GitHub.
You can see the most recent state of a resource module by visiting its GitHub page at:
https://github.com/PowerShell/< module name >
For example, for the CertificateDsc module, go to:
https://github.com/PowerShell/CertificateDsc.

All DSC modules are also listed as submodules of the DscResources repository in the DscResources folder and the xDscResources folder.

How to Contribute

You are more than welcome to contribute to the development of the DSC Resource Kit! There are several different ways you can help. You can create new DSC resources or modules, add test automation, improve documentation, fix existing issues, or open new ones.
See our contributing guide for more info on how to become a DSC Resource Kit contributor.

If you would like to help, please take a look at the list of open issues for the DscResources repository.
You can also check issues for specific resource modules by going to:
https://github.com/PowerShell/< module name >/issues
For example:
https://github.com/PowerShell/xPSDesiredStateConfiguration/issues

Your help in developing the DSC Resource Kit is invaluable to us!

Questions, comments?

If you’re looking into using PowerShell DSC, have questions or issues with a current resource, or would like a new resource, let us know in the comments below, on Twitter (@PowerShell_Team), or by creating an issue on GitHub.

Katie Kragenbrink
Software Engineer
PowerShell DSC Team
@katiedsc (Twitter)
@kwirkykat (GitHub)

DSC Resource Kit Release January 2019


We just released the DSC Resource Kit!

This release includes updates to 14 DSC resource modules. In the past 6 weeks, 41 pull requests have been merged and 54 issues have been closed, all thanks to our amazing community!

The modules updated in this release are:

  • ActiveDirectoryCSDsc
  • AuditPolicyDsc
  • CertificateDsc
  • ComputerManagementDsc
  • NetworkingDsc
  • SecurityPolicyDsc
  • SqlServerDsc
  • StorageDsc
  • xActiveDirectory
  • xBitlocker
  • xExchange
  • xFailOverCluster
  • xHyper-V
  • xWebAdministration

Several of these modules were released to remove the hidden files/folders from this issue. This issue should now be fixed for all modules except DFSDsc which is waiting for some fixes to its tests.

For a detailed list of the resource modules and fixes in this release, see the Included in this Release section below.

Our latest community call for the DSC Resource Kit was today, January 9. A recording is available on YouTube here. Join us for the next call at 12PM (Pacific time) on February 13 to ask questions and give feedback about your experience with the DSC Resource Kit.

The next DSC Resource Kit release will be on Wednesday, February 20.

We strongly encourage you to update to the newest version of all modules using the PowerShell Gallery, and don’t forget to give us your feedback in the comments below, on GitHub, or on Twitter (@PowerShell_Team)!

Please see our documentation here for information on the support of these resource modules.

Included in this Release

You can see a detailed summary of all changes included in this release in the table below. For past release notes, go to the README.md or CHANGELOG.md file on the GitHub repository page for a specific module (see the How to Find DSC Resource Modules on GitHub section below for details on finding the GitHub page for a specific module).

Module Name Version Release Notes
ActiveDirectoryCSDsc 3.1.0.0
  • Updated LICENSE file to match the Microsoft Open Source Team standard.
  • Added .VSCode settings for applying DSC PSSA rules – fixes Issue 60.
  • Added fix for two tier PKI deployment fails on initial deployment, not error – fixes Issue 57.
AuditPolicyDsc 1.4.0.0
  • Explicitly removed extra hidden files from release package
CertificateDsc 4.3.0.0
  • Updated certificate import to only use Import-CertificateEx – fixes Issue 161
  • Update LICENSE file to match the Microsoft Open Source Team standard -fixes Issue 164.
  • Opted into Common Tests – fixes Issue 168:
    • Required Script Analyzer Rules
    • Flagged Script Analyzer Rules
    • New Error-Level Script Analyzer Rules
    • Custom Script Analyzer Rules
    • Validate Example Files To Be Published
    • Validate Markdown Links
    • Relative Path Length
  • CertificateExport:
    • Fixed bug causing PFX export with matchsource enabled to fail – fixes Issue 117
ComputerManagementDsc 6.1.0.0
  • Updated LICENSE file to match the Microsoft Open Source Team standard. Fixes Issue 197.
  • Explicitly removed extra hidden files from release package
NetworkingDsc 6.3.0.0
  • MSFT_IPAddress:
    • Updated to allow retaining existing addresses in order to support cluster configurations as well
SecurityPolicyDsc 2.7.0.0
  • Bug fix – Issue 83 – Network_access_Remotely_accessible_registry_paths_and_subpaths correctly applies multiple paths
  • Update LICENSE file to match the Microsoft Open Source Team standard
SqlServerDsc 12.2.0.0
  • Changes to SqlServerDsc
    • During testing in AppVeyor the Build Worker is restarted in the install step to make sure there are no residual changes left from a previous SQL Server install on the Build Worker done by the AppVeyor Team (issue 1260).
    • Code cleanup: Change parameter names of Connect-SQL to align with resources.
    • Updated README.md in the Examples folder.
      • Added a link to the new xADObjectPermissionEntry examples in ActiveDirectory, fixed a broken link and a typo. Adam Rush (@adamrushuk)
    • Change to SqlServerLogin so it doesn't check properties for absent logins.
StorageDsc 4.4.0.0
  • Refactored module folder structure to move resource to root folder of repository and remove test harness – fixes Issue 169.
  • Updated Examples to support deployment to PowerShell Gallery scripts.
  • Removed limitation on using Pester 4.0.8 during AppVeyor CI.
  • Moved the Code of Conduct text out of the README.md and into a CODE_OF_CONDUCT.md file.
  • Explicitly removed extra hidden files from release package
xActiveDirectory 2.23.0.0
  • Explicitly removed extra hidden files from release package
xBitlocker 1.4.0.0
  • Change double quoted string literals to single quotes
  • Add spaces between array members
  • Add spaces between variable types and variable names
  • Add spaces between comment hashtag and comments
  • Explicitly removed extra hidden files from release package
xExchange 1.26.0.0
  • Add support for Exchange Server 2019
  • Added additional parameters to the MSFT_xExchUMService resource
  • Rename improperly named functions, and add comment based help in MSFT_xExchClientAccessServer, MSFT_xExchDatabaseAvailabilityGroupNetwork, MSFT_xExchEcpVirtualDirectory, MSFT_xExchExchangeCertificate, MSFT_xExchImapSettings.
  • Added additional parameters to the MSFT_xExchUMCallRouterSettings resource
  • Rename improper function names in MSFT_xExchDatabaseAvailabilityGroup, MSFT_xExchJetstress, MSFT_xExchJetstressCleanup, MSFT_xExchMailboxDatabase, MSFT_xExchMailboxDatabaseCopy, MSFT_xExchMailboxServer, MSFT_xExchMaintenanceMode, MSFT_xExchMapiVirtualDirectory, MSFT_xExchOabVirtualDirectory, MSFT_xExchOutlookAnywhere, MSFT_xExchOwaVirtualDirectory, MSFT_xExchPopSettings, MSFT_xExchPowershellVirtualDirectory, MSFT_xExchReceiveConnector, MSFT_xExchWaitForMailboxDatabase, and MSFT_xExchWebServicesVirtualDirectory.
  • Add remaining unit and integration tests for MSFT_xExchExchangeServer.
xFailOverCluster 1.12.0.0
  • Explicitly removed extra hidden files from release package
xHyper-V 3.15.0.0
  • Explicitly removed extra hidden files from release package
xWebAdministration 2.4.0.0
  • Explicitly removed extra hidden files from release package

How to Find Released DSC Resource Modules

To see a list of all released DSC Resource Kit modules, go to the PowerShell Gallery and display all modules tagged as DSCResourceKit. You can also enter a module’s name in the search box in the upper right corner of the PowerShell Gallery to find a specific module.

Of course, you can also always use PowerShellGet (available starting in WMF 5.0) to find modules with DSC Resources:

# To list all modules that tagged as DSCResourceKit
Find-Module -Tag DSCResourceKit 
# To list all DSC resources from all sources 
Find-DscResource

Please note only those modules released by the PowerShell Team are currently considered part of the ‘DSC Resource Kit’ regardless of the presence of the ‘DSC Resource Kit’ tag in the PowerShell Gallery.

To find a specific module, go directly to its URL on the PowerShell Gallery:
http://www.powershellgallery.com/packages/< module name >
For example:
http://www.powershellgallery.com/packages/xWebAdministration

How to Install DSC Resource Modules From the PowerShell Gallery

We recommend that you use PowerShellGet to install DSC resource modules:

Install-Module -Name < module name >

For example:

Install-Module -Name xWebAdministration

To update all previously installed modules at once, open an elevated PowerShell prompt and use this command:

Update-Module

After installing modules, you can discover all DSC resources available to your local system with this command:

Get-DscResource

How to Find DSC Resource Modules on GitHub

All resource modules in the DSC Resource Kit are available open-source on GitHub.
You can see the most recent state of a resource module by visiting its GitHub page at:
https://github.com/PowerShell/< module name >
For example, for the CertificateDsc module, go to:
https://github.com/PowerShell/CertificateDsc.

All DSC modules are also listed as submodules of the DscResources repository in the DscResources folder and the xDscResources folder.

How to Contribute

You are more than welcome to contribute to the development of the DSC Resource Kit! There are several different ways you can help. You can create new DSC resources or modules, add test automation, improve documentation, fix existing issues, or open new ones.
See our contributing guide for more info on how to become a DSC Resource Kit contributor.

If you would like to help, please take a look at the list of open issues for the DscResources repository.
You can also check issues for specific resource modules by going to:
https://github.com/PowerShell/< module name >/issues
For example:
https://github.com/PowerShell/xPSDesiredStateConfiguration/issues

Your help in developing the DSC Resource Kit is invaluable to us!

Questions, comments?

If you’re looking into using PowerShell DSC, have questions or issues with a current resource, or would like a new resource, let us know in the comments below, on Twitter (@PowerShell_Team), or by creating an issue on GitHub.

Katie Kragenbrink
Software Engineer
PowerShell DSC Team
@katiedsc (Twitter)
@kwirkykat (GitHub)

Windows Security change affecting PowerShell


Windows Security change affecting PowerShell

January 9, 2019

The recent (1/8/2019) Windows security patch CVE-2019-0543 has introduced a breaking change for a PowerShell remoting scenario. It is a narrowly scoped scenario that should have low impact for most users.

The breaking change only affects local loopback remoting, which is a PowerShell remote connection made back to the same machine, while using non-Administrator credentials.

PowerShell remoting endpoints do not allow access to non-Administrator accounts by default. However, it is possible to modify endpoint configurations, or create new custom endpoint configurations, that do allow non-Administrator account access. So you are not affected by this change unless you have explicitly set up loopback endpoints on your machine that allow non-Administrator account access.
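
To check whether any endpoints on your machine grant broader access, you can list the registered session configurations and their permissions. This is a minimal check run from an elevated prompt; it is not part of the original advisory:

# Run from an elevated PowerShell prompt
PS > Get-PSSessionConfiguration | Format-List Name, Permission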

Example of broken loopback scenario

# Create endpoint that allows Users group access
PS > Register-PSSessionConfiguration -Name MyNonAdmin -SecurityDescriptorSddl 'O:NSG:BAD:P(A;;GA;;;BA)(A;;GA;;;BU)S:P(AU;FA;GA;;;WD)(AU;SA;GXGW;;;WD)' -Force

# Create non-Admin credential
PS > $nonAdminCred = Get-Credential ~\NonAdminUser

# Create a loopback remote session to custom endpoint using non-Admin credential
PS > $session = New-PSSession -ComputerName localhost -ConfigurationName MyNonAdmin -Credential $nonAdminCred

New-PSSession : [localhost] Connecting to remote server localhost failed with the following error message : The WSMan
service could not launch a host process to process the given request.  Make sure the WSMan provider host server and
proxy are properly registered. For more information, see the about_Remote_Troubleshooting Help topic.
At line:1 char:1
+ New-PSSession -ComputerName localhost -ConfigurationName MyNonAdmin - ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : OpenError: (System.Manageme....RemoteRunspace:RemoteRunspace) [New-PSSession], PSRemotin
   gTransportException
    + FullyQualifiedErrorId : -2146959355,PSSessionOpenFailed

The above example fails only when using non-Administrator credentials, and the connection is made back to the same machine (localhost). Administrator credentials still work. And the above scenario will work when remoting off-box to another machine.

Example of working loopback scenario

# Create Admin credential
PS > $adminCred = Get-Credential ~\AdminUser

# Create a loopback remote session to custom endpoint using Admin credential
PS > $session = New-PSSession -ComputerName localhost -ConfigurationName MyNonAdmin -Credential $adminCred
PS > $session

 Id Name            ComputerName    ComputerType    State         ConfigurationName     Availability
 -- ----            ------------    ------------    -----         -----------------     ------------
  1 WinRM1          localhost       RemoteMachine   Opened        MyNonAdmin               Available

The above example uses Administrator credentials to the same MyNonAdmin custom endpoint, and the connection is made back to the same machine (localhost). The session is created successfully using Administrator credentials.

The breaking change is not in PowerShell but in a system security fix that restricts process creation between Windows sessions. This fix is preventing WinRM (which PowerShell uses as a remoting transport and host) from successfully creating the remote session host, for this particular scenario. There are no plans to update WinRM.

This affects Windows PowerShell and PowerShell Core 6 (PSCore6) WinRM based remoting.

This does not affect SSH remoting with PSCore6.

This does not affect JEA (Just Enough Administration) sessions.

A workaround for a loopback connection is to always use Administrator credentials.

Another option is to use PSCore6 with SSH remoting.
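
To illustrate the second option: with PowerShell Core 6 and an OpenSSH server configured on the machine, a loopback connection can be made over SSH instead of WinRM. This is a sketch; the user name below is only a placeholder:

# Requires PSCore6 and OpenSSH configured on the target machine
PS > $session = New-PSSession -HostName localhost -UserName nonAdminUser
PS > Invoke-Command -Session $session -ScriptBlock { $PSVersionTable.PSVersion }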

Paul Higinbotham
Senior Software Engineer
PowerShell Team


Parsing Text with PowerShell (1/3)


This is the first post in a three part series.

  • Part 1:
    • Useful methods on the String class
    • Introduction to Regular Expressions
    • The Select-String cmdlet
  • Part 2:
    • The -split operator
    • The -match operator
    • The switch statement
    • The Regex class
  • Part 3:
    • A real world, complete and slightly bigger, example of a switch-based parser

A task that appears regularly in my workflow is text parsing. It may be about getting a token from a single line of text or about turning the text output of native tools into structured objects so I can leverage the power of PowerShell.

I always strive to create structure as early as I can in the pipeline, so that later on I can reason about the content as properties on objects instead of as text at some offset in a string. This also helps with sorting, since the properties can have their correct type, so that numbers, dates etc. are sorted as such and not as text.

There are a number of options available to a PowerShell user, and I’m giving an overview here of the most common ones.

This is not a text about how to create a high performance parser for a language with a structured EBNF grammar. There are better tools available for that, for example ANTLR.

.Net methods on the string class

Any treatment of string parsing in PowerShell would be incomplete if it didn’t mention the methods on the string class.
There are a few methods that I’m using more often than others when parsing strings:

Name Description
Substring(int startIndex) Retrieves a substring from this instance. The substring starts at a specified character position and continues to the end of the string.
Substring(int startIndex, int length) Retrieves a substring from this instance. The substring starts at a specified character position and has a specified length.
IndexOf(string value) Reports the zero-based index of the first occurrence of the specified string in this instance.
IndexOf(string value, int startIndex) Reports the zero-based index of the first occurrence of the specified string in this instance. The search starts at a specified character position.
LastIndexOf(string value) Reports the zero-based index of the last occurrence of the specified string in this instance. Often used together with Substring.
LastIndexOf(string value, int startIndex) Reports the zero-based index position of the last occurrence of a specified string within this instance. The search starts at a specified character position and proceeds backward toward the beginning of the string.

This is a small subset of the available methods. It may be well worth your time to read up on the string class since it is so fundamental in PowerShell.
Docs are found here.

As an example, this can be useful when we have a very large amount of comma-separated input with 15 columns and we are only interested in the third column from the end. If we were to use the -split ',' operator, we would create 15 new strings and an array for each line. On the other hand, using LastIndexOf on the input string a few times and then SubString to get the value of interest is faster and results in just one new string.

function parseThirdFromEnd([string]$line){
    $i = $line.LastIndexOf(",")             # get the last separator
    $i = $line.LastIndexOf(",", $i - 1)     # get the second to last separator, also the end of the column we are interested in
    $j = $line.LastIndexOf(",", $i - 1)     # get the separator before the column we want
    $j++                                    # move forward past the separator
    $line.SubString($j,$i-$j)               # get the text of the column we are looking for
}

In this sample, I ignore that IndexOf and LastIndexOf return -1 if they cannot find the text to search for. From experience, I also know that it is easy to mess up the index arithmetic.
So while using these methods can improve performance, it is also more error prone and requires a lot more typing. I would only resort to this when I know the input data is very large and performance is an issue. So this is not a recommendation, or a starting point, but something to resort to.
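
For comparison, the same column can be extracted with the -split operator and negative indexing. This is a sketch of the simpler, but more allocation-heavy, alternative described above:

function parseThirdFromEndWithSplit([string]$line){
    # easier to read, but creates an array and one string per column for every line
    ($line -split ',')[-3]
}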

On rare occasions, I write the whole parser in C#. An example of this is in a module wrapping the Perforce version control system, where the command line tool can output python dictionaries. It is a binary format, and the use case was complicated enough that I was more comfortable with a compiler checked implementation language.

Regular Expressions

Almost all of the parsing options in PowerShell make use of regular expressions, so I will start with a short intro of some regular expression concepts that are used later in these posts.

Regular expressions are very useful to know when writing simple parsers since they allow us to express patterns of interest and to capture text that matches those patterns.

It is a very rich language, but you can get quite a long way by learning a few key parts. I’ve found regular-expressions.info to be a good online resource for more information. It is not written directly for the .net regex implementation, but most of the information is valid across the different implementations.

Regex Description
* Zero or more of the preceding character. a* matches the empty string, a, aa, etc, but not b.
+ One or more of the preceding character. a+ matches a, aa, etc, but not the empty string or b.
. Matches any character
[ax1] Any of a,x,1
a-d matches any of a,b,c,d
\w The \w meta character is used to find a word character. A word character is a character from a-z, A-Z, 0-9, including the _ (underscore) character. In .NET it also matches accented letters and other Unicode word characters.
\W The inversion of \w. Matches any non-word character
\s The \s meta character is used to find white space
\S The inversion of \s. Matches any non-whitespace character
\d Matches digits
\D The inversion of \d. Matches non-digits
\b Matches a word boundary, that is, the position between a word and a space.
\B The inversion of \b. er\B matches the er in verb but not the er in never.
^ The beginning of a line
$ The end of a line
(<expr>) Capture groups

Combining these, we can create a pattern like below to match a text like:

Text Pattern
" 42,Answer" ^\s+\d+,.+

The above pattern can be written like this using the x (ignore pattern whitespace) modifier.

Starting the regex with (?x) ignores whitespace in the pattern (it has to be specified explicitly, with \s) and also enables the comment character #.

(?x)  # this regex ignores whitespace in the pattern. Makes it possible to document a regex with comments.
^     # the start of the line
\s+   # one or more whitespace character
\d+   # one or more digits
,     # a comma
.+    # one or more characters of any kind

By using capture groups, we make it possible to refer back to specific parts of a matched expression.

Text Pattern
" 42,Answer" ^\s+(\d+),(.+)
(?x)  # this regex ignores whitespace in the pattern. Makes it possible to document a regex with comments.
^     # the start of the line
\s+   # one or more whitespace character
(\d+) # capture one or more digits in the first group (index 1)
,     # a comma
(.+)  # capture one or more characters of any kind in the second group (index 2)

Naming regular expression groups

There is a construct called named capturing groups, (?<group_name>pattern), that will create a capture group with a designated name.

The regex above can be rewritten like this, which allows us to refer to the capture groups by name instead of by index.

^\s+(?<num>\d+),(?<text>.+)

Different languages have implementation specific solutions to accessing the values of the captured groups. We will see later on in this series how it is done in PowerShell.

The Select-String cmdlet

The Select-String command is a work horse, and is very powerful when you understand the output it produces.
I use it mainly when searching for text in files, but occasionally also when looking for something in command output and similar.

The key to being efficient with Select-String is to know how to get to the matched patterns in the output. In its internals, it uses the same regex class as the -match and -split operators, but instead of populating a global variable with the resulting groups, as -match does, it writes an object to the pipeline, with a Matches property that contains the results of the match.

Set-Content twitterData.txt -value @"
Lee, Steve-@Steve_MSFT,2992
Lee Holmes-13000 @Lee_Holmes
Staffan Gustafsson-463 @StaffanGson
Tribbiani, Joey-@Matt_LeBlanc,463400
"@

# extracting captured groups
Get-ChildItem twitterData.txt |
    Select-String -Pattern "^(\w+) ([^-]+)-(\d+) (@\w+)" |
    Foreach-Object {
        $first, $last, $followers, $handle = $_.Matches[0].Groups[1..4].Value   # this is a common way of getting the groups of a call to select-string
        [PSCustomObject] @{
            FirstName = $first
            LastName = $last
            Handle = $handle
            TwitterFollowers = [int] $followers
        }
    }
FirstName LastName   Handle       TwitterFollowers
--------- --------   ------       ----------------
Lee       Holmes     @Lee_Holmes             13000
Staffan   Gustafsson @StaffanGson              463

Support for Multiple Patterns

As we can see above, only half of the data matched the pattern given to Select-String.

A technique that I find useful is to take advantage of the fact that Select-String supports the use of multiple patterns.

The lines of input data in twitterData.txt contain the same type of information, but they’re formatted in slightly different ways.
Using multiple patterns in combination with named capture groups makes it a breeze to extract the groups even when the positions of the groups differ.

$firstLastPattern = "^(?<first>\w+) (?<last>[^-]+)-(?<followers>\d+) (?<handle>@.+)"
$lastFirstPattern = "^(?<last>[^\s,]+),\s+(?<first>[^-]+)-(?<handle>@[^,]+),(?<followers>\d+)"
Get-ChildItem twitterData.txt |
     Select-String -Pattern $firstLastPattern, $lastFirstPattern |
    Foreach-Object {
        # here we access the groups by name instead of by index
        $first, $last, $followers, $handle = $_.Matches[0].Groups['first', 'last', 'followers', 'handle'].Value
        [PSCustomObject] @{
            FirstName = $first
            LastName = $last
            Handle = $handle
            TwitterFollowers = [int] $followers
        }
    }
FirstName LastName   Handle        TwitterFollowers
--------- --------   ------        ----------------
Steve     Lee        @Steve_MSFT               2992
Lee       Holmes     @Lee_Holmes              13000
Staffan   Gustafsson @StaffanGson               463
Joey      Tribbiani  @Matt_LeBlanc           463400

Breaking down the $firstLastPattern gives us

(?x)                # this regex ignores whitespace in the pattern. Makes it possible to document a regex with comments.
^                   # the start of the line
(?<first>\w+)       # capture one or more of any word characters into a group named 'first'
\s                  # a space
(?<last>[^-]+)      # capture one of more of any characters but `-` into a group named 'last'
-                   # a '-'
(?<followers>\d+)   # capture 1 or more digits into a group named 'followers'
\s                  # a space
(?<handle>@.+)      # capture a '@' followed by one or more characters into a group named 'handle'

The second regex is similar, but with the groups in a different order. Since we retrieve the groups by name, we don't have to care about the positions of the capture groups, and multiple assignment works fine.

Context around Matches

Select-String also has a Context parameter which accepts an array of one or two numbers specifying the number of lines before and after a match that should be captured. All text parsing techniques in this post can be used to parse information from the context lines.
The result object has a Context property, that returns an object with PreContext and PostContext properties, both of the type string[].

This can be used to get the second line before a match:

# using the context property
Get-ChildItem twitterData.txt |
    Select-String -Pattern "Staffan" -Context 2,1 |
    Foreach-Object { $_.Context.PreContext[1], $_.Context.PostContext[0] }
Lee Holmes-13000 @Lee_Holmes
Tribbiani, Joey-@Matt_LeBlanc,463400

To understand the indexing of the Pre- and PostContext arrays, consider the following:

Lee, Steve-@Steve_MSFT,2992                  <- PreContext[0]
Lee Holmes-13000 @Lee_Holmes                 <- PreContext[1]
Staffan Gustafsson-463 @StaffanGson          <- Pattern matched this line
Tribbiani, Joey-@Matt_LeBlanc,463400         <- PostContext[0]

The pipeline support of Select-String makes it different from the other parsing tools available in PowerShell, and makes it the undisputed king of one-liners.

I would like to stress how much more useful Select-String becomes once you understand how to get to the parts of the matches.

Summary

We have looked at useful methods of the string class, especially how to use Substring to get to text at a specific offset. We also looked at regular expressions, a language used to describe patterns in text, and at the Select-String cmdlet, which makes heavy use of regular expressions.

Next time, we will look at the operators -split and -match, the switch statement (which is surprisingly useful for text parsing), and the regex class.

Staffan Gustafsson, @StaffanGson, github

Thanks to Jason Shirk, Mathias Jessen and Steve Lee for reviews and feedback.

Announcing the PowerShell Preview Extension in VSCode


Preview builds of the PowerShell extension are now available in VSCode

We are excited to announce the PowerShell Preview extension in the VSCode marketplace!
The PowerShell Preview extension allows users on Windows PowerShell 5.1, PowerShell 6.0, and all newer versions to get and test the latest updates to the PowerShell extension, and it comes with some exciting features. The PowerShell Preview extension is a substitute for the PowerShell extension, so the two extensions should not be enabled at the same time.

Features of the PowerShell Preview extension

The PowerShell Preview extension is built on .NET Standard, which simplifies our code and dependency structure.

The PowerShell Preview extension also contains PSReadLine support in the integrated console for Windows, behind a feature flag. PSReadLine provides a consistent and rich interactive experience, including syntax coloring, multi-line editing, and history, in the PowerShell console, in Cloud Shell, and now in the VSCode integrated terminal.
For more information on the benefits of PSReadLine, check out its documentation.

To enable PSReadLine support in the Preview version on Windows, please add the following to your user settings:

"powershell.developer.featureFlags": [ "PSReadLine" ]

HUGE thanks to @SeeminglyScience for all his amazing work getting PSReadLine working in PowerShell Editor Services!

Why we created the PowerShell Preview extension

By having a preview channel, which supports Windows PowerShell 5.1 and PowerShell Core 6, in addition to our existing stable channel, we can get new features out faster. PSReadLine support for the VSCode integrated console is a great example of a feature that the preview build makes possible. Having a preview channel also allows us to get more feedback on new features and to iterate on changes before they arrive in our stable channel.

How to Get/Use the PowerShell Preview extension

If you don't already have VSCode, start here.

Once you have VSCode open, press Ctrl+Shift+X to open the extensions marketplace.
Next, type PowerShell Preview in the search bar.
Click Install on the PowerShell Preview page.
Finally, click Reload in order to refresh VSCode.
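
If you prefer the command line, the extension can also be installed with the code CLI. The extension identifier below is an assumption based on the marketplace listing, so verify it there:

code --install-extension ms-vscode.powershell-preview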

If you already have the PowerShell extension, please disable it to use the PowerShell Preview extension.
To disable the PowerShell extension, find it in the Extensions sidebar view under the list of enabled extensions, right-click it, and select Disable. Please note that it is important to have only one of the PowerShell extension and the PowerShell Preview extension enabled at any one time.

Breaking Changes

As stated above, this version of the PowerShell extension only works with Windows PowerShell 5.1 and PowerShell Core 6.

Reporting Feedback

An important benefit of a preview extension is getting feedback from users.
To report issues with the extension use our GitHub repo.
When reporting issues be sure to specify the version of the extension you are using.

Sydney Smith
Program Manager
PowerShell Team

The PowerShell-Docs repositories have been moved


The PowerShell-Docs repositories have been moved from the PowerShell organization to the MicrosoftDocs organization in GitHub.

The tools we use to build the documentation are designed to work in the MicrosoftDocs org. Moving the repository lets us build the foundation for future improvements in our documentation experience.

Impact of the move

During the move there may be some downtime. The affected repositories will be inaccessible during the move process, and the documentation processes will be paused. After the move, we need to test access permissions and automation scripts. Once these tasks are complete, access and operations will return to normal. GitHub automatically redirects requests for the old repo URLs to the new locations.
For more information about transferring repositories in GitHub, see About repository transfers.

If the transferred repository has any forks, then those forks will remain associated with the
repository after the transfer is complete.
  • All Git information about commits, including contributions, is preserved.
  • All of the issues and pull requests remain intact when transferring a repository.
  • All links to the previous repository location are automatically redirected to the new location.


When you use git clone, git fetch, or git push on a transferred repository, these commands will redirect to the new repository location or URL.

However, to avoid confusion, we strongly recommend updating any existing local clones to point to
the new repository URL.
For more information, see Changing a remote’s URL.

The following example shows how to change the “upstream” remote to point to the new location:

[Wed 06:08PM] [staging =]
PS C:\Git\PS-Docs\PowerShell-Docs> git remote -v
origin  https://github.com/sdwheeler/PowerShell-Docs.git (fetch)
origin  https://github.com/sdwheeler/PowerShell-Docs.git (push)
upstream        https://github.com/PowerShell/PowerShell-Docs.git (fetch)
upstream        https://github.com/PowerShell/PowerShell-Docs.git (push)

[Wed 06:09PM] [staging =]
PS C:\Git\PS-Docs\PowerShell-Docs> git remote set-url upstream https://github.com/MicrosoftDocs/PowerShell-Docs.git

[Wed 06:10PM] [staging =]
PS C:\Git\PS-Docs\PowerShell-Docs> git remote -v
origin  https://github.com/sdwheeler/PowerShell-Docs.git (fetch)
origin  https://github.com/sdwheeler/PowerShell-Docs.git (push)
upstream        https://github.com/MicrosoftDocs/PowerShell-Docs.git (fetch)
upstream        https://github.com/MicrosoftDocs/PowerShell-Docs.git (push)


Which repositories were moved?


The following repositories were transferred:

  • PowerShell/PowerShell-Docs
  • PowerShell/powerShell-Docs.cs-cz
  • PowerShell/powerShell-Docs.de-de
  • PowerShell/powerShell-Docs.es-es
  • PowerShell/powerShell-Docs.fr-fr
  • PowerShell/powerShell-Docs.hu-hu
  • PowerShell/powerShell-Docs.it-it
  • PowerShell/powerShell-Docs.ja-jp
  • PowerShell/powerShell-Docs.ko-kr
  • PowerShell/powerShell-Docs.nl-nl
  • PowerShell/powerShell-Docs.pl-pl
  • PowerShell/powerShell-Docs.pt-br
  • PowerShell/powerShell-Docs.pt-pt
  • PowerShell/powerShell-Docs.ru-ru
  • PowerShell/powerShell-Docs.sv-se
  • PowerShell/powerShell-Docs.tr-tr
  • PowerShell/powerShell-Docs.zh-cn
  • PowerShell/powerShell-Docs.zh-tw

Call to action

If you have a fork that you cloned, change your remote configuration to point to the new upstream URL.
Help us make the documentation better.
  • Submit issues when you find a problem in the docs.
  • Suggest fixes to documentation by submitting changes through the PR process.

Sean Wheeler
Senior Content Developer for PowerShell
https://github.com/sdwheeler

Parsing Text with PowerShell (2/3)


This is the second post in a three-part series.

  • Part 1:
    • Useful methods on the String class
    • Introduction to Regular Expressions
    • The Select-String cmdlet
  • Part 2:
    • the -split operator
    • the -match operator
    • the switch statement
    • the Regex class
  • Part 3:
    • a real world, complete and slightly bigger, example of a switch-based parser

The -split operator

The -split operator splits one or more strings into substrings.

The first example is a name-value pattern, which is a common parsing task. Note the usage of the Max-substrings parameter to the -split operator.
We want to ensure that it doesn’t matter if the value contains the character to split on.

$text = "Description=The '=' character is used for assigning values to a variable"
$name, $value = $text -split "=", 2

@"
Name  =  $name
Value =  $value
"@
Name  =  Description
Value =  The '=' character is used for assigning values to a variable

When the line to parse contains fields separated by a well-known separator that is never part of the field values, we can use the -split operator in combination with multiple assignment to get the fields into variables.

$name, $location, $occupation = "Spider Man,New York,Super Hero" -split ','

If only the location is of interest, the unwanted items can be assigned to $null.

$null, $location, $null = "Spider Man,New York,Super Hero" -split ','

$location
New York

If there are many fields, assigning to null doesn’t scale well. Indexing can be used instead, to get the fields of interest.

$inputText = "x,Staffan,x,x,x,x,x,x,x,x,x,x,Stockholm,x,x,x,x,x,x,x,x,11,x,x,x,x"
$name, $location, $age = ($inputText -split ',')[1,12,21]

$name
$location
$age
Staffan
Stockholm
11

It is almost always a good idea to create an object that gives context to the different parts.

$inputText = "x,Steve,x,x,x,x,x,x,x,x,x,x,Seattle,x,x,x,x,x,x,x,x,22,x,x,x,x"
$name, $location, $age = ($inputText -split ',')[1,12,21]
[PSCustomObject] @{
    Name = $name
    Location = $location
    Age = [int] $age
}
Name  Location Age
----  -------- ---
Steve Seattle   22

Instead of creating a PSCustomObject, we can create a class. It’s a bit more to type, but we can get more help from the engine, for example with tab completion.

The example below also shows a case of type conversion where the default string-to-number conversion doesn't work.
The Age field is handled by PowerShell's built-in type conversion. It is of type [int], and PowerShell handles the conversion from string to int,
but in some cases we need to help out a bit. The ShoeSize field is also an [int], but the data is hexadecimal,
and without the hex specifier ('0x'), this conversion fails for some values and provides incorrect results for the others.

class PowerSheller {
    [string] $Name
    [string] $Location
    [int] $Age
    [int] $ShoeSize
}

$inputText = "x,Staffan,x,x,x,x,x,x,x,x,x,x,Stockholm,x,x,x,x,x,x,x,x,33,x,11d,x,x"
$name, $location, $age, $shoeSize = ($inputText -split ',')[1,12,21,23]
[PowerSheller] @{
    Name = $name
    Location = $location
    Age = $age
    # ShoeSize is expressed in hex, with no '0x' because reasons :)
    # And yes, it's in millimeters.
    ShoeSize = [Convert]::ToInt32($shoeSize, 16)
}
Name    Location  Age ShoeSize
----    --------  --- --------
Staffan Stockholm  33      285

The split operator's first argument is actually a regex by default; this can be changed with options (see the sketch after the breakdown below).
I use this on long command lines in log files (like those given to compilers) where there can be hundreds of options specified. This makes it hard to see whether a certain option is specified or not, but when the options are split onto their own lines, it becomes trivial.
The pattern below uses a positive lookahead assertion.
It can be very useful to make patterns match only in a given context, like if they are, or are not, preceded or followed by another pattern.

$cmdline = "cl.exe /D Bar=1 /I SomePath /D Foo  /O2 /I SomeOtherPath /Debug a1.cpp a3.cpp a2.cpp"

$cmdline -split "\s+(?=[-/])"
cl.exe
/D Bar=1
/I SomePath
/D Foo
/O2
/I SomeOtherPath
/Debug a1.cpp a3.cpp a2.cpp

Breaking down the regex, by rewriting it with the x option:

(?x)      # ignore whitespace in the pattern, and enable comments after '#'
\s+       # one or more spaces
(?=[-/])  # only match the previous spaces if they are followed by any of '-' or '/'.
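
As mentioned at the start of this section, the first argument to -split is treated as a regular expression by default. When a literal split is wanted, the options argument can switch to simple matching. A minimal sketch with a made-up version string:

# Without SimpleMatch, '.' would be a regex that matches every character
'10.0.17763.1' -split '.', 0, 'SimpleMatch'
10
0
17763
1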

Splitting with a scriptblock

The -split operator also comes in another form, where you can pass it a scriptblock instead of a regular expression.
This allows for more complicated logic, that can be hard or impossible to express as a regular expression.

The scriptblock accepts two parameters, the text to split and the current index. $_ is bound to the character at the current index.

function SplitWhitespaceInMiddleOfText {
    param(
        [string]$Text,
        [int] $Index
    )
    if ($Index -lt 10 -or $Index -gt 40){
        return $false
    }
    $_ -match '\s'
}

$inputText = "Some text that only needs splitting in the middle of the text"
$inputText -split $function:SplitWhitespaceInMiddleOfText
Some text that
only
needs
splitting
in
the middle of the text

The $function:SplitWhitespaceInMiddleOfText syntax is a way to get the content (the scriptblock that implements it) of the function, just as $env:UserName gets the content of an item in the env: drive.
It provides a way to document and/or reuse the scriptblock.

The -match operator

The -match operator works in conjunction with the $matches automatic variable. Each time a -match or a -notmatch succeeds, the $matches variable is populated so that each capture group gets its own entry. If the capture group is named, the key will be the name of the group, otherwise it will be the index.

As an example:

if ('a b c' -match '(\w) (?<named>\w) (\w)'){
    $matches
}
Name                           Value
----                           -----
named                          b
2                              c
1                              a
0                              a b c

Notice that the numeric indices are only assigned to groups without names; that is, the indices of later unnamed groups shift when an earlier group is named.

Armed with the regex knowledge from the earlier post, we can write the following:

PS> "    10,Some text" -match '^\s+(\d+),(.+)'
True
PS> $matches
Name                           Value
----                           -----
2                              Some text
1                              10
0                              10,Some text

or with named groups

PS> "    10,Some text" -match '^\s+(?<num>\d+),(?<text>.+)'
True
PS> $matches
Name                           Value
----                           -----
num                            10
text                           Some text
0                              10,Some text

The important thing here is to put parentheses around the parts of the pattern that we want to extract. That is what creates the capture groups that allow us to reference those parts of the matching text, either by name or by index.

Combining this into a function makes it easy to use:

function ParseMyString($text){
    if ($text -match '^\s+(\d+),(.+)') {
        [PSCustomObject] @{
            Number = [int] $matches[1]
            Text    = $matches[2]
        }
    }
    else {
        Write-Warning "ParseMyString: Input `$text` doesn't match pattern"
    }
}

ParseMyString "    10,Some text"
Number  Text
------- ----
     10 Some text

Notice the type conversion when assigning the Number property. As long as the number is in range of an integer, this will always succeed, since we have made a successful match in the if statement above. ([long] or [bigint] could be used instead; in this case I provide the input, and I have promised myself to stick to a range that fits in a 32-bit integer.)
Now we will be able to sort or do numerical operations on the Number property, and it will behave like we want it to – as a number, not as a string.
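
As a quick illustration of that point, sorting the parsed objects on the Number property now orders them numerically, so 2 comes before 10 (a plain string sort would put "10" first):

'    10,ten', '    2,two' | ForEach-Object { ParseMyString $_ } | Sort-Object Number
Number Text
------ ----
     2 two
    10 ten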

The switch statement

Now we’re at the big guns 🙂

The switch statement in PowerShell has been given special functionality for parsing text.
It has two flags that are useful for parsing text and files with text in them. -regex and -file.

When specifying -regex, the match clauses that are strings are treated as regular expressions. The switch statement also sets the $matches automatic variable.

When specifying -file, PowerShell treats the input as the name of a file to read input from, rather than as a value expression.
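
As a small sketch of the -file flag, switch reads the file line by line and applies the same matching; the file name and pattern here are made up for illustration:

# Assume data.txt contains lines like "    10,Some Text"
switch -regex -file .\data.txt {
    '^\s+(?<num>\d+),(?<text>.+)' {
        [PSCustomObject] @{
            Number = [int] $matches.num
            Text   = $matches.text
        }
        continue
    }
    default { Write-Warning "Unrecognized line: '$_'" }
}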

Note the use of a ScriptBlock instead of a string as the match clause to determine if we should skip preamble lines.

class ParsedOutput {
    [int] $Number
    [string] $Text

    [string] ToString() { return "{0} ({1})" -f $this.Text, $this.Number }
}

$inputData =
    "Preamble line",
    "LastLineOfPreamble",
    "    10,Some Text",
    "    Some other text,20"

$inPreamble = $true
switch -regex ($inputData) {

    {$inPreamble -and $_ -eq 'LastLineOfPreamble'} { $inPreamble = $false; continue }

    "^\s+(?<num>\d+),(?<text>.+)" {  # this matches the first line of non-preamble input
        [ParsedOutput] @{
            Number = $matches.num
            Text = $matches.text
        }
        continue
    }

    "^\s+(?<text>[^,]+),(?<num>\d+)" { # this matches the second line of non-preamble input
        [ParsedOutput] @{
            Number = $matches.num
            Text = $matches.text
        }
        continue
    }
}
Number  Text
------ ----
    10 Some Text
    20 Some other text

The pattern [^,]+ in the text group in the code above is useful. It means match anything that is not a comma ,. We are using the any-of construct [], and within those brackets, ^ changes meaning from the beginning of the line to anything but.

That is useful when we are matching delimited fields. A requirement is that the delimiter cannot be part of the set of allowed field values.

The regex class

regex is a type accelerator for System.Text.RegularExpressions.Regex. It can be useful when porting code from C#, and sometimes when we want more control in situations where we have many matches of a capture group. It also allows us to pre-create the regular expressions, which can matter in performance-sensitive scenarios, and to specify a timeout.
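
For instance, a pattern can be created once, with options and a match timeout, and then reused. This is a sketch; the pattern and input are made-up illustrations:

# Create the pattern once and reuse it; give up if a single match takes more than one second
$numberPattern = [regex]::new('^\s*(?<num>\d+)', 'IgnoreCase', [timespan]::FromSeconds(1))
$numberPattern.Match('   42,Answer').Groups['num'].Value
42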

One instance where the regex class is needed is when you have multiple captures of a group.

Consider the following:

Text Pattern
a,b,c, (\w,)+

If the match operator is used, $matches will contain

Name                           Value
----                           -----
1                              c,
0                              a,b,c,

The pattern matched three times, for a,, b, and c,. However, only the last match is preserved in the $matches dictionary.
The following, on the other hand, allows us to get to all the captures of the group:

[regex]::match('a,b,c,', '(\w,)+').Groups[1].Captures
Index Length Value
----- ------ -----
    0      2 a,
    2      2 b,
    4      2 c,

Below is an example that uses the members of the Regex class to parse input data

class ParsedOutput {
    [int] $Number
    [string] $Text

    [string] ToString() { return "{0} ({1})" -f $this.Text, $this.Number }
}

$inputData =
    "    10,Some Text",
    "    Some other text,20"  # this text will not match

[regex] $pattern = "^\s+(\d+),(.+)"

foreach($d in $inputData){
    $match = $pattern.Match($d)
    if ($match.Success){
        $number, $text = $match.Groups[1,2].Value
        [ParsedOutput] @{
            Number = $number
            Text = $text
        }
    }
    else {
        Write-Warning "regex: '$d' did not match pattern '$pattern'"
    }
}
WARNING: regex: '    Some other text,20' did not match pattern '^\s+(\d+),(.+)'
Number Text
------ ----
    10 Some Text

It may surprise you that the warning appears before the output. PowerShell has a quite complex formatting system at the end of the pipeline, which treats pipeline output differently from other streams. Among other things, it buffers output at the beginning of a pipeline to calculate sensible column widths. This works well in practice, but sometimes gives strange reordering of output on different streams.

Summary

In this post we have looked at how the -split operator can be used to split a string in parts, how the -match operator can be used to extract different patterns from some text, and how the powerful switch statement can be used to match against multiple patterns.

We ended by looking at how the regex class, which in some cases provides a bit more control, but at the expense of ease of use. This concludes the second part of this series. Next time, we will look at a complete, real world, example of a switch-based parser.

Thanks to Jason Shirk, Mathias Jessen and Steve Lee for reviews and feedback.

Staffan Gustafsson, @StaffanGson, powercode@github

Staffan works at DICE in Stockholm, Sweden, as a Software Engineer and has been using PowerShell since the first public beta.
He was most seriously pleased when PowerShell was open sourced, and has since contributed bug fixes, new features and performance improvements.
Staffan is a speaker at PSConfEU and is always happy to talk PowerShell.

Parsing Text with PowerShell (3/3)


This is the third and final post in a three-part series.

  • Part 1:
    • Useful methods on the String class
    • Introduction to Regular Expressions
    • The Select-String cmdlet
  • Part 2:
    • the -split operator
    • the -match operator
    • the switch statement
    • the Regex class
  • Part 3:
    • a real world, complete and slightly bigger, example of a switch-based parser
      • General structure of a switch-based parser
      • The real world example

In the previous posts, we looked at the different operators that are available to us in PowerShell.

When analyzing crashes at DICE, I noticed that some of the C++ runtime binaries were missing debug symbols. They should be available for download from Microsoft's public symbol server, and most versions were there. However, due to some process errors at DevDiv, some builds were released publicly without available debug symbols.
In some cases, those missing symbols prevented us from debugging those crashes, and in all cases, they triggered my developer OCD.

So, to give actionable feedback to Microsoft, I scripted a debugger (cdb.exe in this case) to give a verbose list of the loaded modules, and parsed the output with PowerShell, which was also later used to group and filter the resulting data set. I sent this data to Microsoft, and 5 days later, the missing symbols were available for download. Mission accomplished!

This post will describe the parser I wrote for this task (it turned out that I had good use for it for other tasks later), and the general structure is applicable to most parsing tasks.

The example will show how a switch-based parser would look when the input data isn’t as tidy as it normally is in examples, but messy – as the real world data often is.

General Structure of a switch Based Parser

Depending on the structure of our input, the code must be organized in slightly different ways.

Input may have a record start that differs by indentation or some distinct token like

Foo                    <- Record start - No whitespace at the beginning of the line
    Prop1=Staffan      <- Properties for the record - starts with whitespace
    Prop3 =ValueN
Bar
    Prop1=Steve
    Prop2=ValueBar2

If the data to be parsed has an explicit start record, it is a bit easier than if it doesn’t have one.
We create a new data object when we get a record start, after writing any previously created object to the pipeline.
At the end, we need to check if we have parsed a record that hasn’t been written to the pipeline.

The general structure of a such a switch-based parser can be as follows:

$inputData = @"
Foo
    Prop1=Value1
    Prop3=Value3
Bar
    Prop1=ValueBar1
    Prop2=ValueBar2
"@ -split '\r?\n'   # This regex is useful to split at line endings, with or without carriage return

class SomeDataClass {
    $ID
    $Name
    $Property2
    $Property3
}

# map to project input property names to the properties on our data class
$propertyNameMap = @{
    Prop1 = "Name"
    Prop2 = "Property2"
    Prop3 = "Property3"
}

$currentObject = $null
switch -regex ($inputData) {

    '^(\S.*)' {
        # record start pattern, in this case line that doesn't start with a whitespace.
        if ($null -ne $currentObject) {
            $currentObject                   # output to pipeline if we have a previous data object
        }
        $currentObject = [SomeDataClass] @{  # create new object for this record
            Id = $matches.1                  # with Id like Foo or Bar
        }
        continue
    }

    # set the properties on the data object
    '^\s+([^=]+)=(.*)' {
        $name, $value = $matches[1, 2]
        # project property names
        $propName = $propertyNameMap[$name]
        if ($null -eq $propName) {
            $propName = $name
        }
        # assign the parsed value to the projected property name
        $currentObject.$propName = $value
        continue
    }
}

if ($currentObject) {
    # Handle the last object if any
    $currentObject # output to pipeline
}
ID  Name      Property2 Property3
--  ----      --------- ---------
Foo Value1              Value3
Bar ValueBar1 ValueBar2

Alternatively, we may have input where the records are separated by a blank line, but without any obvious record start.

commitId=1234                         <- In this case, a commitId is first in a record
description=Update readme.md
                                      <- the blank line separates records
user=Staffan                          <- For this record, a user property comes first
commitId=1235
description=Fix bug.md

In this case the structure of the code looks a bit different. We create an object at the beginning, but keep track of whether it is dirty or not.
If we get to the end with a dirty object, we must output it.

$inputData = @"

commit=1234
desc=Update readme.md

user=Staffan
commit=1235
desc=Bug fix

"@ -split "\r?\n"

class SomeDataClass {
    [int] $CommitId
    [string] $Description
    [string] $User
}

# map to project input property names to the properties on our data class
# we only need to provide the ones that are different. 'User' works fine as it is.
$propertyNameMap = @{
    commit = "CommitId"
    desc   = "Description"
}

$currentObject = [SomeDataClass]::new()
$objectDirty = $false
switch -regex ($inputData) {
    # set the properties on the data object
    '^([^=]+)=(.*)$' {
        # parse a name/value
        $name, $value = $matches[1, 2]
        # project property names
        $propName = $propertyNameMap[$name]
        if ($null -eq $propName) {
            $propName = $name
        }
        # assign the projected property
        $currentObject.$propName = $value
        $objectDirty = $true
        continue
    }

    '^\s*$' {
        # separator pattern, in this case any blank line
        if ($objectDirty) {
            $currentObject                           # output to pipeline
            $currentObject = [SomeDataClass]::new()  # create new object
            $objectDirty = $false                    # and mark it as not dirty
        }
    }
    default {
        Write-Warning "Unexpected input: '$_'"
    }
}

if ($objectDirty) {
    # Handle the last object if any
    $currentObject # output to pipeline
}
CommitId Description      User
-------- -----------      ----
    1234 Update readme.md
    1235 Bug fix          Staffan

The Real World Example

I have adapted this sample slightly so that I get the loaded modules from a running process instead of from my crash dumps. The format of the output from the debugger is the same.
The following command launches a command line debugger on notepad, with a script that gives a verbose listing of the loaded modules, and quits:

# we need to muck around with the console output encoding to handle the trademark chars
# imagine no encodings
# it's easy if you try
# no code pages below us
# above us only sky
[Console]::OutputEncoding = [System.Text.Encoding]::GetEncoding("iso-8859-1")

$proc = Start-Process notepad -passthru
Start-Sleep -seconds 1
$cdbOutput = cdb -y 'srv*c:\symbols*http://msdl.microsoft.com/download/symbols' -c ".reload -f;lmv;q" -p $proc.Id

The output of the command above is here for those who want to follow along but who aren’t running windows or don’t have cdb.exe installed.

The (abbreviated) output looks like this:

Microsoft (R) Windows Debugger Version 10.0.16299.15 AMD64
Copyright (c) Microsoft Corporation. All rights reserved.

*** wait with pending attach

************* Path validation summary **************
Response                         Time (ms)     Location
Deferred                                       srv*c:\symbols*http://msdl.microsoft.com/download/symbols
Symbol search path is: srv*c:\symbols*http://msdl.microsoft.com/download/symbols
Executable search path is:
ModLoad: 00007ff6`e9da0000 00007ff6`e9de3000   C:\Windows\system32\notepad.exe
...
ModLoad: 00007ffe`97d80000 00007ffe`97db1000   C:\WINDOWS\SYSTEM32\ntmarta.dll
(98bc.40a0): Break instruction exception - code 80000003 (first chance)
ntdll!DbgBreakPoint:
00007ffe`9cd53050 cc              int     3
0:007> cdb: Reading initial command '.reload -f;lmv;q'
Reloading current modules
.....................................................
start             end                 module name
00007ff6`e9da0000 00007ff6`e9de3000   notepad    (pdb symbols)          c:\symbols\notepad.pdb\2352C62CDF448257FDBDDA4081A8F9081\notepad.pdb
    Loaded symbol image file: C:\Windows\system32\notepad.exe
    Image path: C:\Windows\system32\notepad.exe
    Image name: notepad.exe
    Image was built with /Brepro flag.
    Timestamp:        329A7791 (This is a reproducible build file hash, not a timestamp)
    CheckSum:         0004D15F
    ImageSize:        00043000
    File version:     10.0.17763.1
    Product version:  10.0.17763.1
    File flags:       0 (Mask 3F)
    File OS:          40004 NT Win32
    File type:        1.0 App
    File date:        00000000.00000000
    Translations:     0409.04b0
    CompanyName:      Microsoft Corporation
    ProductName:      Microsoft® Windows® Operating System
    InternalName:     Notepad
    OriginalFilename: NOTEPAD.EXE
    ProductVersion:   10.0.17763.1
    FileVersion:      10.0.17763.1 (WinBuild.160101.0800)
    FileDescription:  Notepad
    LegalCopyright:   © Microsoft Corporation. All rights reserved.
...
00007ffe`9ccb0000 00007ffe`9ce9d000   ntdll      (pdb symbols)          c:\symbols\ntdll.pdb\B8AD79538F2730FD9BACE36C9F9316A01\ntdll.pdb
    Loaded symbol image file: C:\WINDOWS\SYSTEM32\ntdll.dll
    Image path: C:\WINDOWS\SYSTEM32\ntdll.dll
    Image name: ntdll.dll
    Image was built with /Brepro flag.
    Timestamp:        E8B54827 (This is a reproducible build file hash, not a timestamp)
    CheckSum:         001F20D1
    ImageSize:        001ED000
    File version:     10.0.17763.194
    Product version:  10.0.17763.194
    File flags:       0 (Mask 3F)
    File OS:          40004 NT Win32
    File type:        2.0 Dll
    File date:        00000000.00000000
    Translations:     0409.04b0
    CompanyName:      Microsoft Corporation
    ProductName:      Microsoft® Windows® Operating System
    InternalName:     ntdll.dll
    OriginalFilename: ntdll.dll
    ProductVersion:   10.0.17763.194
    FileVersion:      10.0.17763.194 (WinBuild.160101.0800)
    FileDescription:  NT Layer DLL
    LegalCopyright:   © Microsoft Corporation. All rights reserved.
quit:

The output starts with info that I’m not interested in here. I only want to get the detailed information about the loaded modules. It is not until the line

start             end                 module name

that I care about the output.

Also, at the end there is a line that we need to be aware of:

quit:

that is not part of the module output.

To skip the parts of the debugger output that we don’t care about, we have a boolean flag initially set to true.
If that flag is set, we check if the current line, $_, is the module header in which case we flip the flag.

    $inPreamble = $true
    switch -regex ($cdbOutput) {

        { $inPreamble -and $_ -eq "start             end                 module name" } { $inPreamble = $false; continue }

I have made the parser a separate function that reads its input from the pipeline. This way, I can use the same function to parse module data, regardless of how I got the module data. Maybe it was saved on a file. Or came from a dump, or a live process. It doesn’t matter, since the parser is decoupled from the data retrieval.

After the sample, there is a breakdown of the more complicated regular expressions used, so don’t despair if you don’t understand them at first.
Regular Expressions are notoriously hard to read, so much so that they make Perl look readable in comparison.

# define a class to store the data
class ExecutableModule {
    [string]   $Name
    [string]   $Start
    [string]   $End
    [string]   $SymbolStatus
    [string]   $PdbPath
    [bool]     $Reproducible
    [string]   $ImagePath
    [string]   $ImageName
    [DateTime] $TimeStamp
    [uint32]   $FileHash
    [uint32]   $CheckSum
    [uint32]   $ImageSize
    [version]  $FileVersion
    [version]  $ProductVersion
    [string]   $FileFlags
    [string]   $FileOS
    [string]   $FileType
    [string]   $FileDate
    [string[]] $Translations
    [string]   $CompanyName
    [string]   $ProductName
    [string]   $InternalName
    [string]   $OriginalFilename
    [string]   $ProductVersionStr
    [string]   $FileVersionStr
    [string]   $FileDescription
    [string]   $LegalCopyright
    [string]   $LegalTrademarks
    [string]   $LoadedImageFile
    [string]   $PrivateBuild
    [string]   $Comments
}

<#
.SYNOPSIS Runs a debugger on a program to dump its loaded modules
#>
function Get-ExecutableModuleRawData {
    param ([string] $Program)
    $consoleEncoding = [Console]::OutputEncoding
    [Console]::OutputEncoding = [System.Text.Encoding]::GetEncoding("iso-8859-1")
    try {
        $proc = Start-Process $program -PassThru
        Start-Sleep -Seconds 1  # sleep for a while so modules are loaded
        cdb -y srv*c:\symbols*http://msdl.microsoft.com/download/symbols -c ".reload -f;lmv;q" -p $proc.Id
        $proc.Close()
    }
    finally {
        [Console]::OutputEncoding = $consoleEncoding
    }
}

<#
.SYNOPSIS Converts verbose module data from windows debuggers into ExecutableModule objects.
#>
function ConvertTo-ExecutableModule {
    [OutputType([ExecutableModule])]
    param (
        [Parameter(ValueFromPipeline)]
        [string[]] $ModuleRawData
    )
    begin {
        $currentObject = $null
        $preamble = $true
        $propertyNameMap = @{
            'File flags'      = 'FileFlags'
            'File OS'         = 'FileOS'
            'File type'       = 'FileType'
            'File date'       = 'FileDate'
            'File version'    = 'FileVersion'
            'Product version' = 'ProductVersion'
            'Image path'      = 'ImagePath'
            'Image name'      = 'ImageName'
            'FileVersion'     = 'FileVersionStr'
            'ProductVersion'  = 'ProductVersionStr'
        }
    }
    process {
        switch -regex ($ModuleRawData) {

            # skip lines until we get to our sentinel line
            { $preamble -and $_ -eq "start             end                 module name" } { $preamble = $false; continue }

            #00007ff6`e9da0000 00007ff6`e9de3000   notepad    (deferred)
            #00007ffe`9ccb0000 00007ffe`9ce9d000   ntdll      (pdb symbols)          c:\symbols\ntdll.pdb\B8AD79538F2730FD9BACE36C9F9316A01\ntdll.pdb
            '^([0-9a-f`]{17})\s([0-9a-f`]{17})\s+(\S+)\s+\(([^\)]+)\)\s*(.+)?' {
                # see breakdown of the expression later in the post
                # on record start, output the currentObject, if any is set
                if ($null -ne $currentObject) {
                    $currentObject
                }
                $start, $end, $module, $pdbKind, $pdbPath = $matches[1..5]
                # create an instance of the object that we are adding info from the current record into.
                $currentObject = [ExecutableModule] @{
                    Start        = $start
                    End          = $end
                    Name         = $module
                    SymbolStatus = $pdbKind
                    PdbPath      = $pdbPath
                }
                continue
            }
            '^\s+Image was built with /Brepro flag.' {
                $currentObject.Reproducible = $true
                continue
            }
            '^\s+Timestamp:\s+[^\(]+\((?<timestamp>.{8})\)' {
                # see breakdown of the regular  expression later in the post
                # Timestamp:        Mon Jan  7 23:42:30 2019 (5C33D5D6)
                $intValue = [Convert]::ToInt32($matches.timestamp, 16)
                $currentObject.TimeStamp = [DateTime]::new(1970, 01, 01, 0, 0, 0, [DateTimeKind]::Utc).AddSeconds($intValue)
                continue
            }
            '^\s+TimeStamp:\s+(?<value>.{8}) \(This' {
                # Timestamp:        E78937AC (This is a reproducible build file hash, not a timestamp)
                $currentObject.FileHash = [Convert]::ToUInt32($matches.value, 16)
                continue
            }
            '^\s+Loaded symbol image file: (?<imageFile>[^\)]+)' {
                $currentObject.LoadedImageFile = $matches.imageFile
                continue
            }
            '^\s+Checksum:\s+(?<checksum>\S+)' {
                $currentObject.Checksum = [Convert]::ToUInt32($matches.checksum, 16)
                continue
            }
            '^\s+Translations:\s+(?<value>\S+)' {
                $currentObject.Translations = $matches.value.Split(".")
                continue
            }
            '^\s+ImageSize:\s+(?<imageSize>.{8})' {
                $currentObject.ImageSize = [Convert]::ToUInt32($matches.imageSize, 16)
                continue
            }
            '^\s{4}(?<name>[^:]+):\s+(?<value>.+)' {
                # see breakdown of the regular expression later in the post
                # This part is any 'name: value' pattern
                $name, $value = $matches['name', 'value']

                # project the property name
                $propName = $propertyNameMap[$name]
                $propName = if ($null -eq $propName) { $name } else { $propName }

                # note the dynamic property name in the assignment
                # this will fail if the property doesn't have a member with the specified name
                $currentObject.$propName = $value
                continue
            }
            'quit:' {
                # ignore and exit
                break
            }
            default {
                # When writing the parser, it can be useful to include a line like the one below to see the cases that are not handled by the parser
                # Write-Warning "missing case for '$_'. Unexpected output format from cdb.exe"

                continue # skip lines that don't match the patterns we are interested in
            }
        }
    }
    end {
        # this is needed to output the last object
        if ($null -ne $currentObject) {
            $currentObject
        }
    }
}


Get-ExecutableModuleRawData Notepad |
    ConvertTo-ExecutableModule |
    Sort-Object ProductVersion, Name |
    Format-Table -Property Name, FileVersionStr, ProductVersion, FileDescription
Name               FileVersionStr                             ProductVersion FileDescription
----               --------------                             -------------- ---------------
PROPSYS            7.0.17763.1 (WinBuild.160101.0800)         7.0.17763.1    Microsoft Property System
ADVAPI32           10.0.17763.1 (WinBuild.160101.0800)        10.0.17763.1   Advanced Windows 32 Base API
bcrypt             10.0.17763.1 (WinBuild.160101.0800)        10.0.17763.1   Windows Cryptographic Primitives Library
...
uxtheme            10.0.17763.1 (WinBuild.160101.0800)        10.0.17763.1   Microsoft UxTheme Library
win32u             10.0.17763.1 (WinBuild.160101.0800)        10.0.17763.1   Win32u
WINSPOOL           10.0.17763.1 (WinBuild.160101.0800)        10.0.17763.1   Windows Spooler Driver
KERNELBASE         10.0.17763.134 (WinBuild.160101.0800)      10.0.17763.134 Windows NT BASE API Client DLL
wintypes           10.0.17763.134 (WinBuild.160101.0800)      10.0.17763.134 Windows Base Types DLL
SHELL32            10.0.17763.168 (WinBuild.160101.0800)      10.0.17763.168 Windows Shell Common Dll
...
windows_storage    10.0.17763.168 (WinBuild.160101.0800)      10.0.17763.168 Microsoft WinRT Storage API
CoreMessaging      10.0.17763.194                             10.0.17763.194 Microsoft CoreMessaging Dll
gdi32full          10.0.17763.194 (WinBuild.160101.0800)      10.0.17763.194 GDI Client DLL
ntdll              10.0.17763.194 (WinBuild.160101.0800)      10.0.17763.194 NT Layer DLL
RMCLIENT           10.0.17763.194 (WinBuild.160101.0800)      10.0.17763.194 Resource Manager Client
RPCRT4             10.0.17763.194 (WinBuild.160101.0800)      10.0.17763.194 Remote Procedure Call Runtime
combase            10.0.17763.253 (WinBuild.160101.0800)      10.0.17763.253 Microsoft COM for Windows
COMCTL32           6.10 (WinBuild.160101.0800)                10.0.17763.253 User Experience Controls Library
urlmon             11.00.17763.168 (WinBuild.160101.0800)     11.0.17763.168 OLE32 Extensions for Win32
iertutil           11.00.17763.253 (WinBuild.160101.0800)     11.0.17763.253 Run time utility for Internet Explorer

Regex pattern breakdown

Here is a breakdown of the more complicated patterns, using the ignore pattern whitespace modifier x:

^([0-9a-f`]{17})\s([0-9a-f`]{17})\s+(\S+)\s+\(([^\)]+)\)\s*(.+)?

# example input: 00007ffe`9ccb0000 00007ffe`9ce9d000   ntdll      (pdb symbols)          c:\symbols\ntdll.pdb\B8AD79538F2730FD9BACE36C9F9316A01\ntdll.pdb

(?x)                # ignore pattern whitespace
^                   # the beginning of the line
([0-9a-f`]{17})     # capture expression like 00007ff6`e9da0000 - any hex number or backtick, and exactly 17 of them
\s                  # a space
([0-9a-f`]{17})     # capture expression like 00007ff6`e9da0000 - any hex number or backtick, and exactly 17 of them
\s+                 # skip any number of spaces
(\S+)               # capture until we get a space - this would match the 'ntdll' part
\s+                 # skip one or more spaces
\(                  # start parenthesis
    ([^\)]+)        # capture one or more of anything but an end parenthesis
\)                  # end parenthesis
\s*                 # skip zero or more spaces
(.+)?               # optionally capture any symbol file path

Breakdown of the name-value pattern:

^\s+(?<name>[^:]+):\s+(?<value>.+)

# example input:  File flags:       0 (Mask 3F)

(?x)                # ignore pattern whitespace
^                   # the beginning of the line
\s+                 # require one or more spaces
(?<name>[^:]+)      # capture anything that is not a `:` into the named group "name"
:                   # require a colon
\s+                 # require one or more spaces
(?<value>.+)        # capture everything until the end into the name group "value"

Breakdown of the timestamp pattern:

^\s+Timestamp:\s+[^\(]+\((?<timestamp>.{8})\)

#example input:     Timestamp:        Mon Jan  7 23:42:30 2019 (5C33D5D6)

(?x)                # ignore pattern whitespace
^                   # the beginning of the line
\s+                 # require one or more spaces
Timestamp:          # The literal text 'Timestamp:'
\s+                 # require one or more spaces
[^\(]+              # one or more of anything but a open parenthesis
\(                  # a literal '('
(?<timestamp>.{8})  # 8 characters of anything, captured into the group 'timestamp'
\)                  # a literal ')'

Gotchas – the Regex Cache

Something that can happen if you are writing a more complicated parser is the following:
The parser works well. You have 15 regular expressions in your switch statement and then you get some input you haven’t seen before, so you add a 16th regex.
All of a sudden, the performance of your parser tanks. WTF?

The .NET regex implementation has a cache of recently used regexes. You can check the size of it like this:

PS> [regex]::CacheSize
15

# bump it
[regex]::CacheSize = 20

And now your parser is fast(er) again.

Bonus tip

I frequently use PowerShell to write (generate) my code:

Get-ExecutableModuleRawData pwsh |
    Select-String '^\s+([^:]+):' |       # this pattern matches the module detail fields
    Foreach-Object {$_.matches.groups[1].value} |
    Select-Object -Unique |
    Foreach-Object -Begin   { "class ExecutableModuleData {" }`
                   -Process { "    [string] $" + ($_ -replace "\s.", {[char]::ToUpperInvariant($_.Groups[0].Value[1])}) }`
                   -End     { "}" }

The output is

class ExecutableModuleData {
    [string] $LoadedSymbolImageFile
    [string] $ImagePath
    [string] $ImageName
    [string] $Timestamp
    [string] $CheckSum
    [string] $ImageSize
    [string] $FileVersion
    [string] $ProductVersion
    [string] $FileFlags
    [string] $FileOS
    [string] $FileType
    [string] $FileDate
    [string] $Translations
    [string] $CompanyName
    [string] $ProductName
    [string] $InternalName
    [string] $OriginalFilename
    [string] $ProductVersion
    [string] $FileVersion
    [string] $FileDescription
    [string] $LegalCopyright
    [string] $Comments
    [string] $LegalTrademarks
    [string] $PrivateBuild
}

It is not complete – I don’t have the fields from the record start, some types are incorrect and when run against some other executables a few other fields may appear.
But it is a very good starting point. And way more fun than typing it 🙂

Note that this example is using a new feature of the -replace operator – to use a ScriptBlock to determine what to replace with – that was added in PowerShell Core 6.1.

Bonus tip #2

A regular expression construct that I often find useful is non-greedy matching.
The example below shows the effect of the ? modifier, which can be used after * (zero or more) and + (one or more).

# greedy matching - match to the last occurrence of the following character (>)
if("<Tag>Text</Tag>" -match '<(.+)>') { $matches }
Name                           Value
----                           -----
1                              Tag>Text</Tag
0                              <Tag>Text</Tag>
# non-greedy matching - match to the first occurrence of the following character (>)
if("<Tag>Text</Tag>" -match '<(.+?)>') { $matches }
Name                           Value
----                           -----
1                              Tag
0                              <Tag>

See Regex Repeat for more info on how to control pattern repetition.

Summary

In this post, we have looked at how the structure of a switch-based parser could look, and how it can be written so that it works as a part of the pipeline.
We have also looked at a few slightly more complicated regular expressions in some detail.

As we have seen, PowerShell has a plethora of options for parsing text, and most of them revolve around regular expressions.
My personal experience has been that the time I’ve invested in understanding the regex language was well invested.

Hopefully, this gives you a good start with the parsing tasks you have at hand.

Thanks to Jason Shirk, Mathias Jessen and Steve Lee for reviews and feedback.

Staffan Gustafsson, @StaffanGson, github

Staffan works at DICE in Stockholm, Sweden, as a Software Engineer and has been using PowerShell since the first public beta.
He was most seriously pleased when PowerShell was open sourced, and has since contributed bug fixes, new features and performance improvements.
Staffan is a speaker at PSConfEU and is always happy to talk PowerShell.

General Availability of PowerShell Core 6.2

We’re proud to announce that the latest version of PowerShell has been released!

This is the third minor supported release of PowerShell Core, the open-source edition of PowerShell that works on Linux, macOS, and Windows!

Thanks to everyone that made this release possible, including our contributors, users, and anyone who filed issues and submitted feedback.

So How Do I Install It?

For info on installing PowerShell Core 6.2, check our installation docs.

A reminder that PowerShell Core works side-by-side with Windows PowerShell, so you can use both independently of each other.
This means that you can continue to use Windows PowerShell for existing scripts while simultaneously using PowerShell Core for new automation or to explore its new capabilities.
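
For example, each edition reports itself through $PSVersionTable (a quick sketch; the executable names assume default installations on Windows):

PS> powershell.exe -NoProfile -Command '$PSVersionTable.PSEdition'
Desktop
PS> pwsh.exe -NoProfile -Command '$PSVersionTable.PSEdition'
Core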

What’s New?

The PowerShell Core 6.2 release is focused primarily on performance improvements, bug fixes, and smaller cmdlet/language enhancements that improve the quality of life for users.
To see a full list of improvements, check out our detailed changelogs on GitHub.

Since the 6.1.0 release (September 2018), we’ve taken over 560 changes for the 6.2 release! That’s almost 4 changes a day (excluding weekends)! Of course, we have to thank our community for providing a significant portion of these improvements. Per our public PowerBI dashboard,
the community is still contributing just over half of all incoming pull requests!

Throughout the development of 6.2, the PowerShell Core team has also been focused on supporting PowerShell Core 6 in Azure Functions (more on this soon!), automating our release process (blog coming!), the v1.18.0 release of PSScriptAnalyzer, the 2.0.0-Preview release of the PowerShell Visual Studio Code extension, and, of course, the PowerShell Core 6.2 release.

Experimental Features

In the 6.1 release, we enabled support for Experimental Features, which allow contributors and PowerShell Team members to deliver new features and get feedback before we consider the design complete, and to avoid making breaking changes as the design evolves. It’s often easier to provide feedback by experimenting with working code than by reading a specification that describes the user experience.

In the 6.2 release, we have a number of Experimental Features you can try out. We’d love it if you could provide us with feedback on them so we can make improvements, decide whether they’re worth keeping, or promote them out of the experimental state.

At any time you can use Get-ExperimentalFeature to get a list of available experimental features that can be enabled or disabled with Enable/Disable-ExperimentalFeature.
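
For example (the exact list of features and their states will vary by version and platform):

PS> Get-ExperimentalFeature | Select-Object -Property Name, Enabled
PS> Enable-ExperimentalFeature -Name PSTempDrive    # takes effect after restarting pwsh
PS> Disable-ExperimentalFeature -Name PSTempDrive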

Command Not Found Suggestions

Enable-ExperimentalFeature -Name PSCommandNotFoundSuggestion

This feature will use fuzzy matching to find suggestions of commands or cmdlets you may have meant to type if you made a typo.

PS> Get-Commnd
Get-Commnd : The term 'Get-Commnd' is not recognized as the name of a cmdlet, function, script file, or operable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
At line:1 char:1
+ Get-Commnd
+ ~~~~~~~~~~
+ CategoryInfo          : ObjectNotFound: (Get-Commnd:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException


Suggestion [4,General]: The most similar commands are: Get-Command, Get-Content, Get-Job, Get-Module, Get-Event, Get-Host, Get-Member, Get-Item, Set-Content.

In this example, I mistyped Get-Command and it fuzzy matched to a number of suggestions from most likely to least likely.

Implicit Remoting Batching

Enable-ExperimentalFeature -Name PSImplicitRemotingBatching

When using implicit remoting in a pipeline, PowerShell treats each command in the pipeline independently. This results in objects being serialized and de-serialized between the client
and target system repeatedly over the execution of the pipeline.

With this change, PowerShell analyzes the pipeline and determines whether each command is safe to run remotely or exists on the target system; if so, it executes the entire pipeline remotely and only serializes and de-serializes the results back to the client.

This can result in significant performance gains! A real-world test of Get-Process | Sort-Object over localhost shows a decrease from 10-15 seconds to 20-30 milliseconds, a speed increase of 300-750x. This should be even faster going over a real network connection, and only requires you to update your client (no changes to the server side are necessary).
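
As a rough sketch of the kind of implicit remoting pipeline this applies to (the computer name and command choice here are only illustrative), the whole pipeline below can be sent to the remote side in one go when the feature is enabled:

$session = New-PSSession -ComputerName Server01
Import-PSSession -Session $session -CommandName Get-Process -Prefix Remote
Get-RemoteProcess | Sort-Object -Property Id | Select-Object -First 5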

Temp Drive

Enable-ExperimentalFeature -Name PSTempDrive

If you’re using PowerShell Core on different operating systems, you’ll discover that the environment variable for finding the temporary directory is different on Windows, macOS, and Linux! With this feature, you will get a PSDrive called Temp: that is automatically mapped to the temporary folder on whichever operating system you are using.

PS> "Hello World!" > Temp:/hello.txt
PS> Get-Content Temp:/hello.txt
Hello World!

Be aware that native file commands (like ls on Linux) are not aware of PSDrives and won’t see this Temp: drive.
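
You can check where the drive points on your current system (a quick check; the root will differ per operating system):

PS> Get-PSDrive -Name Temp | Select-Object -Property Name, Root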

Abbreviation Expansion

Enable-ExperimentalFeature -Name PSUseAbbreviationExpansion

PowerShell cmdlets are expected to have descriptive nouns. This can result in long names that take time to type and make it easier to make typing mistakes. This feature allows you to just type the uppercase characters of the cmdlet name and use tab-completion to find a match.

PS> i-arsavsf

If you hit tab, and have the Azure PowerShell Az module installed, it will autocomplete to:

PS> Import-AzRecoveryServicesAsrVaultSettingsFile

Note that this feature is intended to be used interactively so the abbreviated forms of cmdlets won’t work in scripts. This is not intended to be a replacement for aliases.

How can I provide feedback?

As always, you should file issues on GitHub to let us know about any features you’d like added or bugs that you encounter. Additionally, you can join us for the PowerShell Community Call
on the 3rd Thursday of every month.

Being an Open Source project, we value all types of contributions including code, tests, documentation, issues, and discussion.

We have an amazing active community and this release would not have been possible without you!

The Future

We are still working out our plans for the next release. Stay tuned for our roadmap to be published on this blog!

On behalf of the PowerShell Team,

Steve Lee
Principal Software Engineering Manager
PowerShell Team
https://twitter.com/Steve_MSFT

The post General Availability of PowerShell Core 6.2 appeared first on PowerShell.

LiveFyre commenting will no longer be available on the PowerShell Gallery

Commenting on the PowerShell Gallery is provided by LiveFyre, a third-party comment system. LiveFyre is no longer supported by Adobe, and therefore we are unable to service issues as they arise. We have received reports of authentication failing for Twitter and Microsoft AAD, and unfortunately we are unable to bring back those services. As we cannot predict when more issues will occur, and we cannot fix issues as they arise, we must deprecate the use of LiveFyre on the PowerShell Gallery. As of May 1st, 2019, LiveFyre commenting will no longer be available on the PowerShell Gallery. Unfortunately, we are unable to migrate comments off of LiveFyre, so comment history will be lost.

How will package consumers be able to get support?

The other existing channels for getting support and contacting package owners will still be available on the Gallery. The left pane of the package page is the best place to get support. If you are looking to contact the package owner, select “Contact Owners” on the package page. If you are looking to contact Gallery support use the “Report” button. If the package owner has provided a link to their project site in their module manifest a link to their site is also available in the left pane and can be a good avenue for support. For more information on getting package support please see our documentation.

Questions

We appreciate your understanding as we undergo this transition.
Please direct any questions to sysmith@microsoft.com.

The post LiveFyre commenting will no longer be available on the PowerShell Gallery appeared first on PowerShell.


The PowerShell Gallery is now more Accessible

Over the past few months, the team has been working hard to make the PowerShell Gallery as accessible as possible. This blog details why it matters and what work has been done.

Why making the PowerShell Gallery more accessible was a priority

Accessible products change lives and allow everyone to be included in our product. Accessibility is also a major component of striving toward Microsoft’s mission to “Empower every person and every organization on the planet to achieve more.” Improvements in accessibility mean improvements in usability, which makes the experience better for everyone. In doing accessibility testing for the Gallery, for example, we found that it was confusing for users to distinguish between “deleting” and “unlisting” packages. Clearly naming this action in the UI makes the process of unlisting a package clearer for all package owners.

The steps taken to make the PowerShell Gallery more accessible

The first part of the process focused on bug generation and resolution. We used scanning technology to ensure that the Gallery alerts and helper texts were configured properly and were compatible with screen reading technology. We use Keros scanning, Microsoft’s premier accessibility tool, to identify accessibility issues, and worked to triage and fix the detected issues.

For the second part of the process, we undertook a scenario-focused accessibility study. For the study, blind or visually impaired IT professionals went through core scenarios for using the Gallery. These scenarios included: finding packages, publishing packages, managing packages, and getting support. The majority of the scenarios focused on searching for packages, as we believe this is the primary way customers interact with the Gallery. After the study concluded, we reviewed the results and watched recordings of the participants navigating through our scenarios. This process allowed us to focus on improving our lowest performing scenarios by addressing specific usability improvements. After making these improvements, we underwent a review by accessibility experts to ensure high usability and accessibility.

Usability Improvements

  • Screen Reader Compatibility: Screen reader technologies make consuming web content accessible, so we undertook a thorough review and improvement effort to ensure that the Gallery was providing accurate, consistent, and helpful information to screen readers. Some examples of areas we improved:
    • Accurate Headers
    • Clearly labeled tables
    • Helpful tool tips
    • Labeled graph node points
  • Improved ARIA Tags: Accessible Rich Internet Applications (ARIA) is a specification that makes web content more accessible by passing helpful information to assistive technologies such as screen readers. We undertook a thorough review and enhancement of our ARIA tags to make sure they were as helpful as possible. One improvement we made, for example, was an ARIA description explaining how to use tags in the Gallery search bar.
  • Renamed UI elements to be more descriptive: Through our review we noticed we were generating some confusion by labeling the unlist button as “delete” and we worked to fix these types of issues.
  • Filters: We added filters for the operating system to make it easier to find compatible packages.
  • Results description: we made searching for packages more straightforward by displaying the total number of results and pages.
  • Page Scrolling: we made searching for packages easier by adding multi-page scrolling.

Reporting Issues

Our goal is to make the Gallery completely user friendly. If you encounter any issues in the PowerShell Gallery that make it less accessible/usable we would love to hear about it on our GitHub page. Please file an issue letting us know what we can do to make the Gallery even more accessible.

The post The PowerShell Gallery is now more Accessible appeared first on PowerShell.

DSC Resource Kit Release April 2019

We just released the DSC Resource Kit!

This release includes updates to 13 DSC resource modules. In the past 6 weeks, 87 pull requests have been merged and 90 issues have been closed, all thanks to our amazing community!

The modules updated in this release are:

  • CertificateDsc
  • ComputerManagementDsc
  • NetworkingDsc
  • OfficeOnlineServerDsc
  • SecurityPolicyDsc
  • SharePointDsc
  • SqlServerDsc
  • StorageDsc
  • xActiveDirectory
  • xPSDesiredStateConfiguration
  • xSMBShare
  • xWindowsUpdate
  • xWinEventLog

xWebAdministration is also in the pipeline for release as soon as it passes all tests.

For a detailed list of the resource modules and fixes in this release, see the Included in this Release section below.

Our latest community call for the DSC Resource Kit was last Wednesday, March 27. A recording of the call will be posted on the PowerShell YouTube channel soon. You can join us for the next call at 12PM (Pacific time) on May 8 to ask questions and give feedback about your experience with the DSC Resource Kit.

The next DSC Resource Kit release will be on Wednesday, May 15.

We strongly encourage you to update to the newest version of all modules using the PowerShell Gallery, and don’t forget to give us your feedback in the comments below, on GitHub, or on Twitter (@PowerShell_Team)!

Please see our documentation here for information on the support of these resource modules.

Included in this Release

You can see a detailed summary of all changes included in this release in the table below. For past release notes, go to the README.md or CHANGELOG.md file on the GitHub repository page for a specific module (see the How to Find DSC Resource Modules on GitHub section below for details on finding the GitHub page for a specific module).

Module Name Version Release Notes
CertificateDsc 4.5.0.0
  • Fix example publish to PowerShell Gallery by adding gallery_api environment variable to AppVeyor.yml – fixes Issue 187.
  • CertificateDsc.Common.psm1
    • Exclude assemblies that set DefinedTypes to null instead of an empty array to prevent failures on GetTypes(). This issue occurred with the Microsoft.WindowsAzure.Storage.dll assembly.
ComputerManagementDsc 6.3.0.0
  • Correct PSSA custom rule violations – fixes Issue 209.
  • Correct long example filenames for PowerShellExecutionPolicy examples.
  • Opted into Common Tests “Required Script Analyzer Rules”, “Flagged Script Analyzer Rules”, “New Error-Level Script Analyzer Rules” “Custom Script Analyzer Rules” and “Relative Path Length” – fixes Issue 152.
  • PowerPlan:
    • Added support to specify the desired power plan either as name or guid. Fixes Issue 59
    • Changed the resource so it uses Windows APIs instead of WMI/CIM (Workaround for Server 2012R2 Core, Nano Server, Server 2019 and Windows 10). Fixes Issue 155 and Issue 65
NetworkingDsc 7.1.0.0
  • New Resource: NetAdapterState to enable or disable a network adapter – fixes Issue 365
  • Fix example publish to PowerShell Gallery by adding gallery_api environment variable to AppVeyor.yml – fixes Issue 385.
  • MSFT_Proxy:
    • Fixed ProxyServer, ProxyServerExceptions and AutoConfigURL parameters so that they correctly support strings longer than 255 characters – fixes Issue 378.
OfficeOnlineServerDsc 1.3.0.0
  • Changes to OfficeOnlineServerDsc
    • Added pull request template and issue templates.
  • OfficeOnlineServerInstall
    • Added check to test if the setup file is blocked or not;
    • Added ability to install from a UNC path, by adding server to IE Local Intranet Zone. This will prevent an endless wait caused by security warning;
  • OfficeOnlineServerInstallLanguagePack
    • Added check to test if the setup file is blocked or not;
    • Added ability to install from a UNC path, by adding server to IE Local Intranet Zone. This will prevent an endless wait caused by security warning
SecurityPolicyDsc 2.8.0.0
  • Bug fix – Issue 71 – Issue Added Validation Attributes to AccountPolicy & SecurityOption
  • Bug fix – Network_security_Restrict_NTLM security option names now maps to correct keys. This fix could impact your systems.
  • Updated LICENSE file to match the Microsoft Open Source Team standard. Fixes Issue 108
  • Refactored the SID translation process to not throw a terminating error when called from Test-TargetResource
  • Updated verbose message during the SID translation process to identify the policy where an orphaned SID exists
  • Added the EType “FUTURE” to the security option “Network_security_Configure_encryption_types_allowed_for_Kerberos”
  • Documentation update to include all valid settings for security options and account policies
SharePointDsc 3.3.0.0
  • SharePointDsc generic
    • Implemented workaround for PSSA v1.18 issue. No further impact for the rest of the resources
    • Fixed issue where the PowerShell session was never removed and led to a memory leak
    • Added readme.md file to Examples folder, which directs users to the Wiki on Github
  • SPAppManagementServiceApp
    • Added ability to create Service App Proxy if this is not present
  • SPConfigWizard
    • Improved logging
  • SPFarm
    • Corrected issue where the resource would try to join a farm, even when the farm was not yet created
    • Fixed issue where an error was thrown when no DeveloperDashboard parameter was specified
  • SPInstall
    • Added check to unblock setup file if it is blocked because it is coming from a network location. This to prevent endless wait
    • Added ability to install from a UNC path, by adding server to IE Local Intranet Zone. This will prevent an endless wait caused by security warning
  • SPInstallLanguagePack
    • Added check to unblock setup file if it is blocked because it is coming from a network location. This to prevent endless wait
    • Corrected issue with Norwegian language pack not being correctly detected
    • Added ability to install from a UNC path, by adding server to IE Local Intranet Zone. This will prevent an endless wait caused by security warning
  • SPProductUpdate
    • Added ability to install from a UNC path, by adding server to IE Local Intranet Zone. This will prevent an endless wait caused by security warning
    • Major refactor of this resource to remove the dependency on the existence of the farm. This allows the installation of product updates before farm creation.
  • SPSearchContentSource
    • Corrected typo that prevented a correct check for ContinuousCrawl
  • SPSearchServiceApp
    • Added possibility to manage AlertsEnabled setting
  • SPSelfServiceSiteCreation
    • Added new SharePoint 2019 properties
  • SPSitePropertyBag
    • Added new resource
  • SPWebAppThrottlingSettings
    • Fixed issue with ChangeLogRetentionDays not being applied
SqlServerDsc 12.4.0.0
  • Changes to SqlServerDsc
    • Added new resources.
      • SqlRSSetup
    • Added helper module DscResource.Common from the repository DscResource.Template.
      • Moved all helper functions from SqlServerDscHelper.psm1 to DscResource.Common.
      • Renamed Test-SqlDscParameterState to Test-DscParameterState.
      • New-TerminatingError error text for a missing localized message now matches the output even if the “missing localized message” localized message is also missing.
    • Added helper module DscResource.LocalizationHelper from the repository DscResource.Template, this replaces the helper module CommonResourceHelper.psm1.
    • Cleaned up unit tests, mostly around loading cmdlet stubs and loading classes stubs, but also some tests that were using some odd variants.
    • Fix all integration tests according to issue PowerShell/DscResource.Template14.
  • Changes to SqlServerMemory
    • Updated Cim Class to Win32_ComputerSystem (instead of Win32_PhysicalMemory) because the correct memory size was not being detected correctly on Azure VMs (issue 914).
  • Changes to SqlSetup
    • Split integration tests into two jobs, one for running integration tests for SQL Server 2016 and another for running integration test for SQL Server 2017 (issue 858).
    • Localized messages for Master Data Services no longer start and end with single quote.
    • When installing features a verbose message is written if a feature is found to already be installed. It no longer quietly removes the feature from the /FEATURES argument.
    • Cleaned up a bit in the tests, removed excessive piping.
    • Fixed minor typo in examples.
    • A new optional parameter FeatureFlag parameter was added to control breaking changes. Functionality added under a feature flag can be toggled on or off, and could be changed later to be the default. This way we can also make more of the new functionalities the default in the same breaking change release (issue 1105).
    • Added a new way of detecting if the shared feature CONN (Client Tools Connectivity, and SQL Client Connectivity SDK), BC (Client Tools Backwards Compatibility), and SDK (Client Tools SDK) is installed or not. The new functionality is used when the parameter FeatureFlag is set to "DetectionSharedFeatures" (issue 1105).
    • Added a new helper function Get-InstalledSharedFeatures to move out some of the code from the Get-TargetResource to make unit testing easier and faster.
    • Changed the logic of “Build the argument string to be passed to setup” to not quote the value if root directory is specified (issue 1254).
    • Moved some resource specific helper functions to the new helper module DscResource.Common so they can be shared with the new resource SqlRSSetup.
    • Improved verbose messages in Test-TargetResource function to more clearly tell if features are already installed or not.
    • Refactored unit tests for the functions Test-TargetResource and Set-TargetResource to improve testing speed.
    • Modified the Test-TargetResource and Set-TargetResource to not be case-sensitive when comparing feature names. This was handled correctly in real-world scenarios, but failed when running the unit tests (and testing casing).
  • Changes to SqlAGDatabase
    • Fix MatchDatabaseOwner to check for CONTROL SERVER, IMPERSONATE LOGIN, or CONTROL LOGIN permission in addition to IMPERSONATE ANY LOGIN.
    • Update and fix MatchDatabaseOwner help text.
  • Changes to SqlAG
    • Updated documentation on the behaviour of defaults as they only apply when creating a group.
  • Changes to SqlAGReplica
    • AvailabilityMode, BackupPriority, and FailoverMode defaults only apply when creating a replica not when making changes to an existing replica. Explicit parameters will still change existing replicas (issue 1244).
    • ReadOnlyRoutingList now gets updated without throwing an error on the first run (issue 518).
    • Test-Resource fixed to report whether ReadOnlyRoutingList desired state has been reached correctly (issue 1305).
  • Changes to SqlDatabaseDefaultLocation
    • No longer does the Test-TargetResource fail on the second test run when the backup file path was changed, and the path was ending with a backslash (issue 1307).
StorageDsc 4.6.0.0
  • Fix example publish to PowerShell Gallery by adding gallery_api environment variable to AppVeyor.yml – fixes Issue 202.
  • Added “DscResourcesToExport” to manifest to improve information in PowerShell Gallery and removed wildcards from “FunctionsToExport”, “CmdletsToExport”, “VariablesToExport” and “AliasesToExport” – fixes Issue 192.
  • Clean up module manifest to correct Author and Company – fixes Issue 191.
  • Correct unit tests for DiskAccessPath to test exact number of mocks called – fixes Issue 199.
  • Disk:
    • Added a minimum time to wait of 3s after New-Partition using a while loop. The problem occurs when the partition is created and Format-Volume is attempted before the volume has completed. There appears to be no property to determine if the partition is sufficiently ready to format, and it will often format as a raw volume when the error occurs – fixes Issue 85.
xActiveDirectory 2.25.0.0
  • Added xADReplicationSiteLink
    • New resource added to facilitate replication between AD sites
  • Updated xADObjectPermissionEntry to use AD: which is more generic when using Get-Acl and Set-Acl than using Microsoft.ActiveDirectory.Management\ActiveDirectory:://RootDSE/
  • Changes to xADComputer
    • Minor clean up of unit tests.
  • Changes to xADUser
    • Added TrustedForDelegation parameter to xADUser to support enabling/disabling Kerberos delegation
    • Minor clean up of unit tests.
  • Added Ensure Read property to xADDomainController to fix Get-TargetResource return bug (issue 155).
    • Updated readme and add release notes
  • Updated xADGroup to support group membership from multiple domains (issue 152). Robert Biddle (@robbiddle) and Jan-Hendrik Peters (@nyanhp)
xPSDesiredStateConfiguration 8.6.0.0
  • Fixes style inconsistencies in PublishModulesAndMofsToPullServer.psm1. issue 530
  • Suppresses forced Verbose output in MSFT_xArchive.EndToEnd.Tests.ps1, MSFT_xDSCWebService.Integration.tests.ps1, MSFT_xPackageResource.Integration.Tests.ps1, MSFT_xRemoteFile.Tests.ps1, MSFT_xUserResource.Integration.Tests.ps1, MSFT_xWindowsProcess.Integration.Tests.ps1, and xFileUpload.Integration.Tests.ps1. issue 514
  • Fixes issue in xGroupResource Integration tests where the tests would fail if the System.DirectoryServices.AccountManagement namespace was not loaded.
  • Tests\Integration\MSFT_xDSCWebService.Integration.tests.ps1:
    • Fixes issue where tests fail if a self signed certificate for DSC does not already exist. issue 581
  • Fixes all instances of the following PSScriptAnalyzer issues:
    • PSUseOutputTypeCorrectly
    • PSAvoidUsingConvertToSecureStringWithPlainText
    • PSPossibleIncorrectComparisonWithNull
    • PSAvoidDefaultValueForMandatoryParameter
    • PSAvoidUsingInvokeExpression
    • PSUseDeclaredVarsMoreThanAssignments
    • PSAvoidGlobalVars
  • xPackage and xMsiPackage
    • Add an ability to ignore a pending reboot if requested by package installation.
  • xRemoteFile
    • Updated MatchSource description in README.md. issue 409
    • Improved layout of MOF file to move description left.
    • Added function help for all functions.
    • Moved New-InvalidDataException to CommonResourceHelper.psm1. issue 544
  • Added full stops to the end of all functions help in CommonResourceHelper.psm1.
  • Added unit tests for New-InvalidArgumentException, New-InvalidDataException and New-InvalidOperationException CommonResourceHelper.psm1 functions.
  • Changes to MSFT_xDSCWebService
    • Fixed issue 528 : Unable to disable selfsigned certificates using AcceptSelfSignedCertificates=$false
    • Fixed issue 460 : Redeploy DSC Pull Server fails with error
  • Opt-in to the following Meta tests:
    • Common Tests – Custom Script Analyzer Rules
    • Common Tests – Flagged Script Analyzer Rules
    • Common Tests – New Error-Level Script Analyzer Rules
    • Common Tests – Relative Path Length
    • Common Tests – Required Script Analyzer Rules
    • Common Tests – Validate Markdown Links
  • Add .markdownlint.json file using settings from here as a starting point.
  • Changes to Tests\Unit\MSFT_xMsiPackage.Tests.ps1
    • Fixes issue where tests fail if executed from a drive other than C:. issue 573
  • Changes to Tests\Integration\xWindowsOptionalFeatureSet.Integration.Tests.ps1
    • Fixes issue where tests fail if a Windows Optional Feature that is expected to be disabled has a feature state of “DisabledWithPayloadRemoved”. issue 586
  • Changes to Tests\Unit\MSFT_xPackageResource.Tests.ps1
    • Fixes issue where tests fail if run from a folder that contains spaces. issue 580
  • Changes to test helper Enter-DscResourceTestEnvironment so that it only updates DSCResource.Tests when it is longer than 60 minutes since it was last pulled. This is to improve performance of test execution and reduce the likelihood of connectivity issues caused by inability to pull DSCResource.Tests. issue 505
  • Updated CommonTestHelper.psm1 to resolve style guideline violations.
  • Adds helper functions for use when creating test administrator user accounts, and updates the following tests to use credentials created with these functions:
    • MSFT_xScriptResource.Integration.Tests.ps1
    • MSFT_xServiceResource.Integration.Tests.ps1
    • MSFT_xWindowsProcess.Integration.Tests.ps1
    • xServiceSet.Integration.Tests.ps1
  • Fixes the following issues:
xSMBShare 2.2.0.0
  • Improved Code logic & cosmetic changes
  • Update appveyor.yml to use the default template.
  • Added default template files .codecov.yml, .gitattributes, and .gitignore, and .vscode folder.
  • Changes to xSmbShare
xWindowsUpdate 2.8.0.0
  • xWindowsUpdateAgent: Fixed verbose statement returning incorrect variable
  • Tests no longer fail on Assert-VerifiableMocks, these are now renamed to Assert-VerifiableMock (breaking change in Pester v4).
  • README.md has been updated with correct description of the resources (issue 58).
  • Updated appveyor.yml to use the correct parameters to call the test framework.
  • Update appveyor.yml to use the default template.
  • Added default template files .gitattributes, and .gitignore, and .vscode folder.
xWinEventLog 1.3.0.0
  • THIS MODULE HAS BEEN DEPRECATED. It will no longer be released. Please use the “WinEventLog” resource in ComputerManagementDsc instead.
  • Update appveyor.yml to use the default template.
  • Added default template files .codecov.yml, .gitattributes, and .gitignore, and .vscode folder.

How to Find Released DSC Resource Modules

To see a list of all released DSC Resource Kit modules, go to the PowerShell Gallery and display all modules tagged as DSCResourceKit. You can also enter a module’s name in the search box in the upper right corner of the PowerShell Gallery to find a specific module.

Of course, you can also always use PowerShellGet (available starting in WMF 5.0) to find modules with DSC Resources:

#To list all modules that tagged as DSCResourceKit
Find-Module -Tag DSCResourceKit 
#To list all DSC resources from all sources
Find-DscResource

Please note only those modules released by the PowerShell Team are currently considered part of the ‘DSC Resource Kit’ regardless of the presence of the ‘DSC Resource Kit’ tag in the PowerShell Gallery.

To find a specific module, go directly to its URL on the PowerShell Gallery:
http://www.powershellgallery.com/packages/< module name >
For example:
http://www.powershellgallery.com/packages/xWebAdministration

How to Install DSC Resource Modules From the PowerShell Gallery

We recommend that you use PowerShellGet to install DSC resource modules:

Install-Module -Name < module name >

For example:

Install-Module -Name xWebAdministration

To update all previously installed modules at once, open an elevated PowerShell prompt and use this command:

Update-Module

After installing modules, you can discover all DSC resources available to your local system with this command:

Get-DscResource

How to Find DSC Resource Modules on GitHub

All resource modules in the DSC Resource Kit are available open-source on GitHub.
You can see the most recent state of a resource module by visiting its GitHub page at:
https://github.com/PowerShell/< module name >
For example, for the CertificateDsc module, go to:
https://github.com/PowerShell/CertificateDsc.

All DSC modules are also listed as submodules of the DscResources repository in the DscResources folder and the xDscResources folder.

How to Contribute

You are more than welcome to contribute to the development of the DSC Resource Kit! There are several different ways you can help. You can create new DSC resources or modules, add test automation, improve documentation, fix existing issues, or open new ones.
See our contributing guide for more info on how to become a DSC Resource Kit contributor.

If you would like to help, please take a look at the list of open issues for the DscResources repository.
You can also check issues for specific resource modules by going to:
https://github.com/PowerShell/< module name >/issues
For example:
https://github.com/PowerShell/xPSDesiredStateConfiguration/issues

Your help in developing the DSC Resource Kit is invaluable to us!

Questions, comments?

If you’re looking into using PowerShell DSC, have questions or issues with a current resource, or would like a new resource, let us know in the comments below, on Twitter (@PowerShell_Team), or by creating an issue on GitHub.

Katie Kragenbrink
Software Engineer
PowerShell DSC Team
@katiedsc (Twitter)
@kwirkykat (GitHub)

The post DSC Resource Kit Release April 2019 appeared first on PowerShell.

The Next Release of PowerShell – PowerShell 7

Recently, the PowerShell Team shipped the Generally Available (GA) release of PowerShell Core 6.2. Since that release, we’ve already begun work on the next iteration!

We’re calling the next release PowerShell 7, the reasons for which will be explained in this blog post.

Why 7 and not 6.3?

PowerShell Core usage has grown significantly in the last two years. In particular, the bulk of our growth has come from Linux usage, an encouraging statistic given our investment in making PowerShell viable cross-platform.  This chart represents the number of times pwsh.exe (or just pwsh on Linux/macOS) was started (unless telemetry was disabled).

[Chart: number of pwsh startups over time]

However, we also can clearly see that our Windows usage has not been growing as significantly, surprising given that PowerShell was popularized on the Windows platform. We believe that this could be occurring because existing Windows PowerShell users have existing automation that is incompatible with PowerShell Core because of unsupported modules, assemblies, and APIs. These folks are unable to take advantage of PowerShell Core’s new features, increased performance, and bug fixes. To address this, we are renewing our efforts towards a full replacement of Windows PowerShell 5.1 with our next release.

This means that Windows PowerShell and PowerShell Core users will be able to use the same version of PowerShell to automate across Windows, Linux, and macOS, and that on Windows, PowerShell 7 users will have a very high level of compatibility with the Windows PowerShell modules they rely on today.

We’re also going to take the opportunity to simplify our references to PowerShell in documentation and product pages, dropping the “Core” in “PowerShell 7”. The PSEdition will still reflect Core, but this will only be a technical distinction in APIs and documentation where appropriate.

Note that the major version does not imply that we will be making significant breaking changes. While we took the opportunity to make some breaking changes in 6.0, many of those were compromises to ensure our compatibility on non-Windows platforms. Prior to that, Windows PowerShell historically updated its major version based on new versions of Windows rather than Semantic Versioning.

.NET Core 3.0

PowerShell Core 6.1 brought compatibility with many built-in Windows PowerShell modules, and our estimation is that PowerShell 7 can attain compatibility with 90+% of the inbox Windows PowerShell modules by leveraging changes in .NET Core 3.0 that bring back many APIs required by modules built on .NET Framework so that they work with the .NET Core runtime. For example, we expect Out-GridView to come back (for Windows only, though)!

A significant effort for PowerShell 7 is porting the PowerShell Core 6 code base to .NET Core 3.0 and also working with Windows partner teams to validate their modules against PowerShell 7.

Support Lifecycle Changes

Currently, PowerShell Core is under the Microsoft Modern Lifecycle Policy. This means that PowerShell Core 6 is fix-forward: we produce servicing releases for security fixes and critical bug fixes,
and you must install the latest stable version within 6 months of a new minor version release.

In PowerShell 7, we will align more closely with the .NET Core support lifecycle, enabling PowerShell 7 to have both LTS (Long Term Servicing) and non-LTS releases.

We will still have monthly Preview releases to get feedback early.

When do I get PowerShell 7?

The first Preview release of PowerShell 7 will likely be in May. Be aware, however, that this depends on completing integration and validation of PowerShell with .NET Core 3.0.

Since PowerShell 7 is aligned with the .NET Core timeline, we expect the generally available (GA) release to be some time after the GA of .NET Core 3.0.

What about shipping in Windows?

We are planning on eventually shipping PowerShell 7 in Windows as a side-by-side feature with Windows PowerShell 5.1, but we still need to work out some of the details on how you will manage this inbox version of PowerShell 7.

And since the .NET Core timeline doesn’t align with the Windows timeline, we can’t say right now when it will show up in a future version of Windows 10 or Windows Server.

What other features will be in PowerShell 7?

We haven’t closed on our feature planning yet, but expect another blog post relatively soon with a roadmap of our current feature level plans for PowerShell 7.

Steve Lee
https://twitter.com/Steve_MSFT
Principal Engineering Manager
PowerShell Team

The post The Next Release of PowerShell – PowerShell 7 appeared first on PowerShell.

PowerShell Core Release Improvements

Overview

For PowerShell Core, we basically had to build a new engineering system to build and release it. How we build it has evolved over time as we learn and as other teams have implemented features that make some tasks easier. We are finally at a state where we believe we can engineer a system that builds PowerShell Core for release with as little human interaction as possible.

Current state

Before the changes described here, we had one build per platform. After the binaries were built, they had to be tested and then packaged into the various packages for release. This was all done in a private Azure DevOps Pipelines instance. In this state, a release took a good deal of people’s time: it would take 3-4 people about a week to release PowerShell Core, and during this time the percentage of time people were focused on the release probably averaged 50%.

Goals

  1. Remain compliant with Microsoft and external standards we are required to follow.
  2. Automate as much of the build, test, and release process as possible.
    • This should significantly reduce the amount of human toil needed in each release.
  3. Hopefully, provide some tools or practices others can follow.

What we have done so far

  1. We ported our CI tests to Azure DevOps Pipelines.
    • We have used this in a release, and we see that this allowed us to run at least those tests in our private Azure DevOps Pipelines instance.
    • This saves us 2-4 man hours per release and a day or more of calendar time if all goes well.
  2. We have moved our release build definitions to YAML.
    • We have used this in a release and we see that this allows us to treat the release build as code and iterate more quickly.
    • This saves us 1-2 man hours per release, when we have done everything correctly.
  3. I have begun to merge the different platform builds into one combined build.
    • We have not yet used this in a release but we believe this should allow us to have a single button that gets us ready to test.
    • This has not been in use long enough to determine how much time it will save.
  4. We have begun to automate our release testing. Our release testing is very similar to our CI testing, just across more distributions and versions of Windows. We plan to be able to run this through Azure DevOps Pipelines as well.
    • This has not been in use long enough to determine how much time it will save.
  5. We have automated generating the draft of the change log and categorizing the entries based on labels the maintainers apply to the PRs. After generation, the maintainers still need to review the change descriptions to make sure they make sense in the change log.
    • This saves us 2-4 man hours per release.

Summary of improvements

After all these changes, we can now release with 2-3 people in 2 to 3 days, with an average of 25% time focusing on the release.

Details of the combined build

Azure DevOps Pipelines allows us to define a complex build pipeline. The build will be complex, but features like templates in Azure DevOps make it possible to break it into manageable pieces.

Although this design does not technically reduce the number of parts, one significant thing it does for us is put all of our artifacts in one place. Having the artifacts in one place reduces the input to the steps in the rest of the build, such as test and release.

I’m not going to discuss it much, but in order to coordinate this work we are keeping a diagram of the build. I’ll include it here. If you want me to post another blog on the details, please leave a comment.

[Diagram of the combined build pipeline]

What is left to do

  1. We still have to add the other various NuGet package build steps to the coordinated build.
  2. We need to automate running the functional (CI) tests across a representative sample of supported platforms.
  3. It would be nice if we could enforce in GitHub the process that helps us automate the change log generation.
  4. We need to automate the release process including:
    • Automating package testing, for example for the MSI, Zip, Deb, RPM, and Snap packages (a rough smoke-test sketch follows this list).
    • Automating the actual release to GitHub, mcr.microsoft.com, packages.microsoft.com and the Snap store.
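
To give an idea of what automated package testing could look like, here is a minimal, hypothetical smoke test for the Zip package. The package path, extracted layout, and expected version are assumptions, not our actual release validation.

# Hypothetical Zip package smoke test: extract the package and check the reported version.
$packagePath     = 'C:\release\PowerShell-6.2.0-win-x64.zip'   # assumed package location
$expectedVersion = '6.2.0'                                      # assumed version under test
$extractPath     = Join-Path ([System.IO.Path]::GetTempPath()) 'pwsh-smoke-test'

Expand-Archive -LiteralPath $packagePath -DestinationPath $extractPath -Force

# Assumed layout: pwsh.exe sits at the root of the extracted package.
$pwsh = Join-Path $extractPath 'pwsh.exe'
$actualVersion = & $pwsh -NoProfile -Command '$PSVersionTable.PSVersion.ToString()'

if ($actualVersion -ne $expectedVersion)
{
    throw "Expected PowerShell version '$expectedVersion' but the package reported '$actualVersion'."
}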

Travis Plunk
Senior Software Engineer
PowerShell Team

Using PSScriptAnalyzer to check PowerShell version compatibility

PSScriptAnalyzer version 1.18 was released recently, and ships with powerful new rules that can check PowerShell scripts for incompatibilities with other PowerShell versions and environments.

In this blog post, the first in a series, we’ll see how to use these new rules to check a script for problems running on PowerShell 3, 5.1 and 6.

Wait, what’s PSScriptAnalyzer?

PSScriptAnalyzer is a module providing static analysis (or linting) and some dynamic analysis (based on the state of your environment) for PowerShell. It’s able to find problems and fix bad habits in PowerShell scripts as you create them, similar to the way the C# compiler will give you warnings and find errors in C# code before it’s executed.

If you use the VSCode PowerShell extension, you might have seen the “green squigglies” and problem reports that PSScriptAnalyzer generates for scripts you author:

Image of PSScriptAnalyzer linting in VSCode with green squigglies

You can install PSScriptAnalyzer to use on your own scripts with:

Install-Module PSScriptAnalyzer -Scope CurrentUser

PSScriptAnalyzer works by running a series of rules on your scripts, each of which independently assesses some issue. For example, AvoidUsingCmdletAliases checks that aliases aren’t used in scripts, and MisleadingBackticks checks that backticks at the ends of lines aren’t followed by whitespace.
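
For a quick way to see a rule fire, you can analyze a snippet in-line with the -ScriptDefinition parameter; the one-liner below is only an illustration.

# 'gci' and '?' are aliases (for Get-ChildItem and Where-Object), so AvoidUsingCmdletAliases warns on both.
Invoke-ScriptAnalyzer -ScriptDefinition 'gci . | ? Length -gt 1kb'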

For more information, see the PSScriptAnalyzer deep dive blog series.

Introducing the compatibility check rules

The new compatibility checking functionality is provided by three new rules:

  • PSUseCompatibleSyntax, which checks whether a syntax used in a script will work in other PowerShell versions.
  • PSUseCompatibleCommands, which checks whether commands used in a script are available in other PowerShell environments.
  • PSUseCompatibleTypes, which checks whether .NET types and static methods/properties are available in other PowerShell environments.

The syntax check rule simply requires a list of PowerShell versions you want to target, and will tell you if a syntax used in your script won’t work in any of those versions.

The command and type checking rules are more sophisticated and rely on profiles (catalogs of commands and types available) from commonly used PowerShell platforms. They require configuration to use these profiles, which we’ll go over below.

For this post, we’ll look at configuring and using PSUseCompatibleSyntax and PSUseCompatibleCommands to check that a script works with different versions of PowerShell. We’ll look at PSUseCompatibleTypes in a later post, although it’s configured very similarly to PSUseCompatibleCommands.

Working example: a small PowerShell script

Imagine we have a small (and contrived) archival script saved to .\archiveScript.ps1:

# Import helper module to get folders to archive
Import-Module -FullyQualifiedName @{ ModuleName = 'ArchiveHelper'; ModuleVersion = '1.1' }

$paths = Get-FoldersToArchive -RootPath 'C:\Documents\DocumentsToArchive\'
$archiveBasePath = '\\ArchiveServer\DocumentArchive\'

# Dictionary to collect hashes
$hashes = [System.Collections.Generic.Dictionary[string, string]]::new()
foreach ($path in $paths)
{
    # Get the hash of the file and turn it into a base64 string
    $hash = (Get-FileHash -LiteralPath $path).Hash

    # Add this file to the hash catalog
    $hashes[$hash] = $path

    # Now give the archive a unique name and zip it up
    $name = Split-Path -LeafBase $path
    Compress-Archive -LiteralPath $path -DestinationPath (Join-Path $archiveBasePath "$name-$hash.zip")
}

# Add the hash catalog to the archive directory
ConvertTo-Json $hashes | Out-File -LiteralPath (Join-Path $archiveBasePath "catalog.json") -NoNewline

This script was written in PowerShell 6.2, and we’ve tested that it works there. But we also want to run it on other machines, some of which run PowerShell 5.1 and some of which run PowerShell 3.0.

Ideally we will test it on those other platforms, but it would be nice if we could try to iron out as many bugs as possible ahead of time.

Checking syntax with PSUseCompatibleSyntax

The first and easiest rule to apply is PSUseCompatibleSyntax. We’re going to create some settings for PSScriptAnalyzer to enable the rule, and then run analysis on our script to get back any diagnostics about compatibility.

Running PSScriptAnalyzer is straightforward. It comes as a PowerShell module, so once it’s installed on your module path you just invoke it on your file with Invoke-ScriptAnalyzer, like this:

Invoke-ScriptAnalyzer -Path '.\archiveScript.ps1'

A very simple invocation like this one will run PSScriptAnalyzer using its default rules and configurations on the script you point it to.

However, because they require more targeted configuration, the compatibility rules are not enabled by default. Instead, we need to supply some settings to run the syntax check rule. In particular, PSUseCompatibleSyntax requires a list of the PowerShell versions you are targeting with your script.

$settings = @{
    Rules = @{
        PSUseCompatibleSyntax = @{
            # This turns the rule on (setting it to false will turn it off)
            Enable = $true

            # List the targeted versions of PowerShell here
            TargetVersions = @(
                '3.0',
                '5.1',
                '6.2'
            )
        }
    }
}

Invoke-ScriptAnalyzer -Path .\archiveScript.ps1 -Settings $settings

Running this will present us with the following output:

RuleName                            Severity     ScriptName Line  Message
--------                            --------     ---------- ----  -------
PSUseCompatibleSyntax               Warning      archiveScr 8     The constructor syntax
                                                 ipt.ps1          '[System.Collections.Generic.Dictionary[string,
                                                                  string]]::new()' is not available by default in
                                                                  PowerShell versions 3,4

This is telling us that the [dictionary[string, string]]::new() syntax we used won’t work in PowerShell 3. Better than that, in this case the rule has actually proposed a fix:

$diagnostics = Invoke-ScriptAnalyzer -Path .\archiveScript.ps1 -Settings $settings
$diagnostics[0].SuggestedCorrections

File              : C:\Users\roholt\Documents\Dev\sandbox\VersionedScript\archiveScript.ps1
Description       : Use the 'New-Object @($arg1, $arg2, ...)' syntax instead for compatibility with PowerShell versions 3,4
StartLineNumber   : 8
StartColumnNumber : 11
EndLineNumber     : 8
EndColumnNumber   : 73
Text              : New-Object 'System.Collections.Generic.Dictionary[string,string]'
Lines             : {New-Object 'System.Collections.Generic.Dictionary[string,string]'}
Start             : Microsoft.Windows.PowerShell.ScriptAnalyzer.Position
End               : Microsoft.Windows.PowerShell.ScriptAnalyzer.Position

The suggested correction is to use New-Object instead. The way this is suggested might seem slightly unhelpful here with all the position information, but we’ll see later why this is useful.
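
If you would rather apply available corrections from the command line than act on the position data yourself, Invoke-ScriptAnalyzer also has a -Fix switch that rewrites the file in place for rules that supply corrections. Treat this as a sketch and keep a copy of the original file if you try it.

# Applies suggested corrections (such as the New-Object rewrite above) directly to the script file.
Invoke-ScriptAnalyzer -Path .\archiveScript.ps1 -Settings $settings -Fix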

This dictionary example is a bit artificial of course (since a hashtable would come more naturally), but having a spanner thrown into the works in PowerShell 3 or 4 because of a ::new() is not uncommon. The PSUseCompatibleSyntax rule will also warn you about classes, workflows and using statements depending on the versions of PowerShell you’re authoring for.
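
As a small illustration of that wider coverage, a script containing a class definition like the hypothetical one below would be flagged whenever 3.0 or 4.0 appears in TargetVersions, because classes only arrived in PowerShell 5.0.

# With '3.0' in TargetVersions, PSUseCompatibleSyntax warns that class
# definitions are only available in PowerShell 5.0 and above.
class ArchiveEntry
{
    [string] $Path
    [string] $Hash
}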

We’re not going to make the suggested change just yet, since there’s more to show you first.

Checking command usage with PSUseCompatibleCommands

We now want to check the commands. Because command compatibility is a bit more complicated than syntax (since the availability of commands depends on more than what version of PowerShell is being run), we have to target profiles instead.

Profiles are catalogs of information taken from stock machines running common PowerShell environments. The ones shipped in PSScriptAnalyzer can’t always match your working environment perfectly, but they come pretty close (there’s also a way to generate your own profile, detailed in a later blog post). In our case, we’re trying to target PowerShell 3.0, PowerShell 5.1 and PowerShell 6.2 on Windows. We have the first two profiles, but in the last case we’ll need to target 6.1 instead. These targets are very close, so warnings will still be pertinent to using PowerShell 6.2. Later when a 6.2 profile is made available, we’ll be able to switch over to that.

We need to look under the PSUseCompatibleCommands documentation for a list of profiles available by default. For our desired targets we pick:

  • PowerShell 6.1 on Windows Server 2019 (win-8_x64_10.0.17763.0_6.1.3_x64_4.0.30319.42000_core)
  • PowerShell 5.1 on Windows Server 2019 (win-8_x64_10.0.17763.0_5.1.17763.316_x64_4.0.30319.42000_framework)
  • PowerShell 3.0 on Windows Server 2012 (win-8_x64_6.2.9200.0_3.0_x64_4.0.30319.42000_framework)

The long names on the right are canonical profile identifiers, which we use in the settings:

$settings = @{
    Rules = @{
        PSUseCompatibleCommands = @{
            # Turns the rule on
            Enable = $true

            # Lists the PowerShell platforms we want to check compatibility with
            TargetProfiles = @(
                'win-8_x64_10.0.17763.0_6.1.3_x64_4.0.30319.42000_core',
                'win-8_x64_10.0.17763.0_5.1.17763.316_x64_4.0.30319.42000_framework',
                'win-8_x64_6.2.9200.0_3.0_x64_4.0.30319.42000_framework'
            )
        }
    }
}

Invoke-ScriptAnalyzer -Path ./archiveScript.ps1 -Settings $settings

There might be a delay the first time you execute this because the rules have to load the catalogs into a cache. Each catalog of a PowerShell platform contains details of all the modules and .NET assemblies available to PowerShell on that platform, which can be as many as 1700 commands with 15,000 parameters and 100 assemblies with 10,000 types. But once it’s loaded, further compatibility analysis will be fast. We get output like this:

RuleName                            Severity     ScriptName Line  Message
--------                            --------     ---------- ----  -------
PSUseCompatibleCommands             Warning      archiveScr 2     The parameter 'FullyQualifiedName' is not available for
                                                 ipt.ps1          command 'Import-Module' by default in PowerShell version
                                                                  '3.0' on platform 'Microsoft Windows Server 2012
                                                                  Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 12    The command 'Get-FileHash' is not available by default in
                                                 ipt.ps1          PowerShell version '3.0' on platform 'Microsoft Windows
                                                                  Server 2012 Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 18    The parameter 'LeafBase' is not available for command
                                                 ipt.ps1          'Split-Path' by default in PowerShell version
                                                                  '5.1.17763.316' on platform 'Microsoft Windows Server
                                                                  2019 Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 18    The parameter 'LeafBase' is not available for command
                                                 ipt.ps1          'Split-Path' by default in PowerShell version '3.0' on
                                                                  platform 'Microsoft Windows Server 2012 Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 19    The command 'Compress-Archive' is not available by
                                                 ipt.ps1          default in PowerShell version '3.0' on platform
                                                                  'Microsoft Windows Server 2012 Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 23    The parameter 'NoNewline' is not available for command
                                                 ipt.ps1          'Out-File' by default in PowerShell version '3.0' on
                                                                  platform 'Microsoft Windows Server 2012 Datacenter'

This is telling us that:

  • Import-Module doesn’t support -FullyQualifiedName in PowerShell 3.0;
  • Get-FileHash doesn’t exist in PowerShell 3.0;
  • Split-Path doesn’t have -LeafBase in PowerShell 5.1 or PowerShell 3.0;
  • Compress-Archive isn’t available in PowerShell 3.0, and;
  • Out-File doesn’t support -NoNewline in PowerShell 3.0

One thing you’ll notice is that the Get-FoldersToArchive function is not being warned about. This is because the compatibility rules are designed to ignore user-provided commands; a command will only be marked as incompatible if it’s present in some profile and not in one of your targets.

Again, we can change the script to fix these warnings, but before we do, I want to show you how to make this a more continuous experience; as you change your script, you want to know if the changes you make break compatibility, and that’s easy to do with the steps below.

Using a settings file for repeated invocation

The first thing we want is to make the PSScriptAnalyzer invocation more automated and reproducible. A nice step toward this is taking the settings hashtable we made and turning it into a declarative data file, separating out the “what” from the “how”.

PSScriptAnalyzer will accept a path to a PSD1 in the -Settings parameter, so all we need to do is turn our hashtable into a PSD1 file, which we’ll make ./PSScriptAnalyzerSettings.psd1. Notice we can merge the settings for both PSUseCompatibleSyntax and PSUseCompatibleCommands:

# PSScriptAnalyzerSettings.psd1
# Settings for PSScriptAnalyzer invocation.
@{
    Rules = @{
        PSUseCompatibleCommands = @{
            # Turns the rule on
            Enable = $true

            # Lists the PowerShell platforms we want to check compatibility with
            TargetProfiles = @(
                'win-8_x64_10.0.17763.0_6.1.3_x64_4.0.30319.42000_core',
                'win-8_x64_10.0.17763.0_5.1.17763.316_x64_4.0.30319.42000_framework',
                'win-8_x64_6.2.9200.0_3.0_x64_4.0.30319.42000_framework'
            )
        }
        PSUseCompatibleSyntax = @{
            # This turns the rule on (setting it to false will turn it off)
            Enable = $true

            # Simply list the targeted versions of PowerShell here
            TargetVersions = @(
                '3.0',
                '5.1',
                '6.2'
            )
        }
    }
}

Now we can run the PSScriptAnalyzer again on the script using the settings file:

Invoke-ScriptAnalyzer -Path ./archiveScript.ps1 -Settings ./PSScriptAnalyzerSettings.psd1

This gives the output:

RuleName                            Severity     ScriptName Line  Message
--------                            --------     ---------- ----  -------
PSUseCompatibleCommands             Warning      archiveScr 1     The parameter 'FullyQualifiedName' is not available for
                                                 ipt.ps1          command 'Import-Module' by default in PowerShell version
                                                                  '3.0' on platform 'Microsoft Windows Server 2012
                                                                  Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 9     The command 'Get-FileHash' is not available by default in
                                                 ipt.ps1          PowerShell version '3.0' on platform 'Microsoft Windows
                                                                  Server 2012 Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 12    The parameter 'LeafBase' is not available for command
                                                 ipt.ps1          'Split-Path' by default in PowerShell version '3.0' on
                                                                  platform 'Microsoft Windows Server 2012 Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 12    The parameter 'LeafBase' is not available for command
                                                 ipt.ps1          'Split-Path' by default in PowerShell version
                                                                  '5.1.17763.316' on platform 'Microsoft Windows Server
                                                                  2019 Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 13    The command 'Compress-Archive' is not available by
                                                 ipt.ps1          default in PowerShell version '3.0' on platform
                                                                  'Microsoft Windows Server 2012 Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 16    The parameter 'NoNewline' is not available for command
                                                 ipt.ps1          'Out-File' by default in PowerShell version '3.0' on
                                                                  platform 'Microsoft Windows Server 2012 Datacenter'
PSUseCompatibleSyntax               Warning      archiveScr 6     The constructor syntax
                                                 ipt.ps1          '[System.Collections.Generic.Dictionary[string,
                                                                  string]]::new()' is not available by default in
                                                                  PowerShell versions 3,4

Now we don’t depend on any variables any more, and have a separate specification of the analysis we want. Using this, you could, for example, add the check to a continuous integration environment to make sure that changes to your scripts don’t break compatibility, as sketched below.
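
A minimal sketch of such a CI gate might look like the following; the script path and the decision to fail on any diagnostic are assumptions rather than part of the original workflow.

# Fail the build if any compatibility diagnostics are reported.
$diagnostics = Invoke-ScriptAnalyzer -Path ./archiveScript.ps1 -Settings ./PSScriptAnalyzerSettings.psd1
if ($diagnostics)
{
    $diagnostics | Format-Table -AutoSize | Out-String | Write-Host
    throw "Found $($diagnostics.Count) compatibility issue(s) in archiveScript.ps1."
}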

But what we really want is to know that PowerShell scripts stay compatible as you edit them. That’s what the settings file is building to, and also where it’s easiest to make the changes you need to make your script compatible. For that, we want to integrate with the VSCode PowerShell extension.

Integrating with VSCode for on-the-fly compatibility checking

As explained at the start of this post, the VSCode PowerShell extension has built-in support for PSScriptAnalyzer. In fact, as of version 1.12.0, the PowerShell extension ships with PSScriptAnalyzer 1.18, meaning you don’t need to do anything other than create a settings file to do compatibility analysis.

We already have our settings file ready to go from the last step, so all we have to do is point the PowerShell extension to the file in the VSCode settings.

You can open the settings with Ctrl+, (use Cmd instead of Ctrl on macOS). In the Settings view, we want PowerShell > Script Analysis: Settings Path. In the settings.json view this is "powershell.scriptAnalysis.settingsPath". Entering a relative path here will find a settings file in our workspace, so we just put ./PSScriptAnalyzerSettings.psd1:

VSCode settings GUI with PSScriptAnalyzer settings path configured to "./PSScriptAnalyzerSettings.psd1"

In the settings.json view this will look like:

"powershell.scriptAnalysis.settingsPath": "./PSScriptAnalyzerSettings.psd1"

Now, opening the script in VSCode we see “green squigglies” for compatibility warnings:

VSCode window containing script, with green squigglies underneath incompatible code

In the problems pane, you’ll get a full description of all the incompatibilities:

VSCode problems pane, listing and describing identified compatibility issues

Let’s fix the syntax problem first. If you remember, PSScriptAnalyzer supplies a suggested correction to this problem. VSCode integrates with PSScriptAnalyzer’s suggested corrections and can apply them if you click on the lightbulb or with Ctrl+Space when the region is under the cursor:

VSCode suggesting New-Object instead of ::new() syntax

Applying this change, the script is now:

Import-Module -FullyQualifiedName @{ ModuleName = 'ArchiveHelper'; ModuleVersion = '1.1' }

$paths = Get-FoldersToArchive -RootPath 'C:\Documents\DocumentsToArchive\'
$archivePath = '\\ArchiveServer\DocumentArchive\'

$hashes = New-Object 'System.Collections.Generic.Dictionary[string,string]'
foreach ($path in $paths)
{
    $hash = (Get-FileHash -LiteralPath $path).Hash
    $hashes[$hash] = $path
    $name = Split-Path -LeafBase $path
    Compress-Archive -LiteralPath $path -DestinationPath (Join-Path $archivePath "$name-$hash.zip")
}

ConvertTo-Json $hashes | Out-File -LiteralPath (Join-Path $archivePath "catalog.json") -NoNewline

The other incompatibilities don’t have corrections; for now PSUseCompatibleCommands knows what commands are available on each platform, but not what to substitute with when a command isn’t available. So we just need to apply some PowerShell knowledge:

  • Instead of Import-Module -FullyQualifiedName @{...} we use Import-Module -Name ... -Version ...;
  • Instead of Get-FileHash, we’re going to need to use .NET directly and write a function;
  • Instead of Split-Path -LeafBase, we can use [System.IO.Path]::GetFileNameWithoutExtension();
  • Instead of Compress-Archive we’ll need to use more .NET methods in a function, and;
  • Instead of Out-File -NoNewline we can use New-Item -Value.

We end up with something like this (the specific implementation is unimportant, but we have something that will work in all versions):

Import-Module -Name ArchiveHelper -Version '1.1'

function CompatibleGetFileHash
{
    param(
        [string]
        $LiteralPath
    )

    try
    {
        $hashAlg = [System.Security.Cryptography.SHA256]::Create()
        $file = [System.IO.File]::Open($LiteralPath, 'Open', 'Read')
        $file.Position = 0
        $hashBytes = $hashAlg.ComputeHash($file)
        return [System.BitConverter]::ToString($hashBytes).Replace('-', '')
    }
    finally
    {
        $file.Dispose()
        $hashAlg.Dispose()
    }
}

function CompatibleCompressArchive
{
    param(
        [string]
        $LiteralPath,

        [string]
        $DestinationPath
    )

    if ($PSVersionTable.PSVersion.Major -le 3)
    {
        # PSUseCompatibleTypes identifies that [System.IO.Compression.ZipFile]
        # isn't available by default in PowerShell 3 and we have to do this.
        # We'll cover that rule in the next blog post.
        Add-Type -AssemblyName System.IO.Compression.FileSystem -ErrorAction Ignore
    }

    [System.IO.Compression.ZipFile]::CreateFromDirectory(
        $LiteralPath,
        $DestinationPath,
        'Optimal',
        <# includeBaseDirectory #> $true)
}

$paths = Get-FoldersToArchive -RootPath 'C:\Documents\DocumentsToArchive\'
$archivePath = '\\ArchiveServer\DocumentArchive\'

$hashes = New-Object 'System.Collections.Generic.Dictionary[string,string]'
foreach ($path in $paths)
{
    $hash = CompatibleGetFileHash -LiteralPath $path
    $hashes[$hash] = $path
    $name = [System.IO.Path]::GetFileNameWithoutExtension($path)
    CompatibleCompressArchive -LiteralPath $path -DestinationPath (Join-Path $archivePath "$name-$hash.zip")
}

$jsonStr = ConvertTo-Json $hashes
New-Item -Path (Join-Path $archivePath "catalog.json") -Value $jsonStr

You should notice that as you type, VSCode displays new analysis of what you’re writing and the green squigglies drop away. When we’re done we get a clean bill of health for script compatibility:

VSCode window with script and problems pane, with no green squigglies and no problems

This means you’ll now be able to use this script across all the PowerShell versions you need to target. Better still, you now have a configuration in your workspace, so as you write more scripts they are continually checked for compatibility. And if your compatibility targets change, all you need to do is update your configuration file in one place to point to your desired targets, at which point you’ll get analysis for your updated target platforms.

Summary

Hopefully in this blog post you got some idea of the new compatibility rules that come with PSScriptAnalyzer 1.18.

We’ve covered how to set up and use the syntax compatibility checking rule, PSUseCompatibleSyntax, and the command checking rule, PSUseCompatibleCommands, both using a hashtable configuration and a settings PSD1 file.

We’ve also looked at using the compatibility rules with the PowerShell extension for VSCode, where they come by default from version 1.12.0.

If you’ve got the latest release of the PowerShell extension for VSCode (1.12.1), you’ll be able to set your configuration file and instantly get compatibility checking.

In the next blog post, we’ll look at how these rules and PSUseCompatibleTypes (which checks whether .NET types and static methods are available on target platforms) can be used to help you write scripts that work cross-platform across Windows and Linux, using both Windows PowerShell and PowerShell Core.


Rob Holt

Software Engineer

PowerShell Team
