VMware Cloud Community
mikestevenson
Contributor

using PowerCLI with concurrent sessions

I'm currently working on a project using PowerCLI heavily to provision and modify VMs.  If only one thing is happening at a time, then everything works perfectly.  We're running into trouble when we have multiple sessions, though.  Errors get thrown saying that we have modified the global:DefaultVIServer and global:DefaultVIServers system variables.  One or both scripts will error out after this.  We're working around it for the time being by doing some rudimentary file-based locking (i.e. create a lockfile when you start, delete it when you're done), but this is ugly and kludgey.  I'd really like to create a unique session on each connection with one of the SDKs and somehow pass that into my PowerCLI script.  Is this the best way of approaching it?  Is there any way of ensuring concurrency safety using strictly PowerCLI?
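For reference, the locking workaround looks roughly like this (a rough sketch; the path and polling interval are placeholders):

$lockFile = 'C:\Temp\powercli.lock'
# Spin until no other run holds the lock. This is racy: two processes can
# both pass the Test-Path check before either one creates the file, which
# is part of why this approach is so fragile.
while (Test-Path $lockFile) { Start-Sleep -Seconds 5 }
New-Item -Path $lockFile -ItemType File | Out-Null
try {
    # ... the actual PowerCLI provisioning work goes here ...
}
finally {
    # Always release the lock, even if the work above throws.
    Remove-Item -Path $lockFile
}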

7 Replies
LucD
Leadership

You could have a look at the Start-Job and related cmdlets from PowerShell v2.

There are some points to watch out for, though.

The PowerCLI – Using PowerShell Jobs post describes these in detail.
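A minimal sketch of the idea, assuming your provisioning logic fits in a script block (all names below are placeholders):

# Each background job runs in its own PowerShell process, so each one
# gets its own global:DefaultVIServer / global:DefaultVIServers state.
$job = Start-Job -ScriptBlock {
    param($vcenter, $serverName, $template)
    Add-PSSnapin VMware.VimAutomation.Core
    Connect-VIServer -Server $vcenter
    New-VM -Name $serverName -Template $template
    Disconnect-VIServer -Server $vcenter -Confirm:$false
} -ArgumentList $vcenter, $serverName, $template

# Block until the job finishes and collect its output.
Wait-Job $job | Receive-Job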


Blog: lucd.info  Twitter: @LucD22  Co-author PowerCLI Reference

mikestevenson
Contributor

Thanks for the reply, LucD.  I'm not sure if that will work for me or not.  If I understand the job functionality correctly, it allows you to have multiple jobs running from a single script.  What I have here is the same script being executed in multiple processes.  For instance, I have a script that goes something like this:

Add-PSSnapin VMware.VimAutomation.Core

Connect-VIServer $vcenter

New-VM -Name $serverName -Template $template -RunAsync:$true

Disconnect-VIServer -Server $vcenter

It's a little more complex than that, of course, and start to finish it takes about two minutes to complete.  We're doing this using a separate automation system, so it fires off this script on demand in separate processes.  So if Alice submits a request, and 10 seconds later Bob submits a request, then the PowerCLI calls in one or both of these requests will fail, throwing the global:DefaultVIServer and global:DefaultVIServers system variable errors I mentioned before.

LucD
Leadership

How do you start such a new process?

Do you start it under the same PowerShell session?

Or do you start a new PS engine for each process, i.e. "powershell.exe ....your-script.ps1"?

I think the 2nd method, albeit somewhat slower, would work.


mikestevenson
Contributor

We're implementing this with HP Operations Orchestration; I'm not sure if you are familiar with the product.  It can run PowerShell scripts supplied by the workflow developer.  I admit that I'm not sure how this is implemented behind the scenes.  I watched the process list as the script ran, and I didn't see a new powershell.exe process get spawned (or any other process, for that matter).  I think we could change the way scripts get executed so that a separate PowerShell process is created on each run, but that would be less than ideal from a maintenance standpoint.

LucD
Leadership

Agreed, you should keep the script that is fired by Operations Orchestration as short as possible.

In that script you could start a new engine yourself by running "powershell.exe -noninteractive ... myscript.ps1". Perhaps that could work.

Have a look at PowerShell.exe Console Help to see the parameters you can use with powershell.exe.
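A minimal sketch of that wrapper idea (the script path is a placeholder):

# Spawn a fresh PowerShell engine for the real work, so each request
# gets its own global:DefaultVIServer state, and wait for it to finish.
Start-Process -FilePath 'powershell.exe' -Wait -ArgumentList '-NonInteractive -File C:\Scripts\Provision-VM.ps1'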


mikestevenson
Contributor

I've done some experimentation over the last couple of days, and I think I have a solution.  I made a series of PowerCLI calls (just getting VM info and the like) and ran two concurrent sessions.  As soon as the second one logs in, the first one is no longer able to successfully run anything.  I then tried calling powershell.exe as a separate process, executing the same external script.  In that case, each session was able to execute all of its commands without any errors at all.  I think we'll be able to do things this way, though there are a few details still to be fleshed out.

Keeping the script on the filesystem would prevent us from using Operations Orchestration's (OO) version control system, and would also require us to pass in as input parameters any OO variables we want to use (these get resolved at runtime using our current method, so that's not an issue right now).  If we keep the script as an OO variable and pass it in on our command line like powershell -command "& {${script}}" (${script} is the syntax used for variable names in OO), then we could avoid those limitations.  However, it appears that the code block inside the {} can only be one line, with a ; between commands, which makes the code difficult for people to read.  Both have their drawbacks, but I think the latter is probably preferable.

Is there any way of passing in a multi-line code block from the command line, or can commands only be separated by semicolons?
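For reference, the one-line form I described ends up looking something like this (the server name is a placeholder and the commands are trimmed down for illustration):

powershell -command "& { Add-PSSnapin VMware.VimAutomation.Core; Connect-VIServer vc01; Get-VM }"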

LucD
Leadership

If you scroll down, on the page I linked in my previous reply, to the -Command parameter, you'll see that you can also pass a code block, but with some limitations! The text states "You can specify a script block only when running PowerShell.exe in Windows PowerShell".

But that seems to be exactly what you are doing, starting a new PowerShell engine from within a running PowerShell session.
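In other words, something like this should work when launched from within an existing PowerShell session (the server name is a placeholder, and the block can span multiple lines):

powershell.exe -NonInteractive -Command {
    Add-PSSnapin VMware.VimAutomation.Core
    Connect-VIServer vcenter01
    Get-VM | Select-Object Name, PowerState
    Disconnect-VIServer -Confirm:$false
}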

