VMware Cloud Community
srpursch
Contributor

Background PowerCLI jobs hang in Server 2012 when output goes to console.

Hello all,

     A little background before I get to my problem.  Our team has 16 VCs in different locations.  I've had a PowerCLI script for a few years that collects VM information from each of them for our support teams.  It includes basic VM configuration information as well as VMDK locations, etc.  The report would connect to each VC in turn and collect the info, and it was starting to take the better part of a day.  I stumbled upon background jobs: by starting a job for each VC, I brought the runtime down from 7+ hours to about 1.5-2 hours.  This was working fine for some time.

Now for the problem...  I recently started looking at moving our reports from our 2008 R2 server to a 2012 server.  This is where I ran into issues.  The jobs would run and collect the info, but then hang and never complete.  After several days of testing, I found that if I output anything to the console after connecting to a VC and running another PowerCLI command (e.g. Get-VM), the job would hang.  I can run PowerCLI commands all day in background jobs as long as I write the output to a file and not the console.  If I move the same script back to our 2008 R2 server, it works great there.  I've tried PowerCLI 5.0/5.1/5.5 with PowerShell 3.0 (included in Server 2012) as well as PowerShell 4.0, all with the same results.  I've also tried this with just one background job running the commands against a single VC, and it still hangs.

Has anyone else come across this, and is there any way to tell whether it's a PowerCLI problem or a PowerShell one?

I've cut the script down to the bare minimum just for this testing and the jobs still hang.  Here's a sample that hangs for me:

$vclist = @("VC1","VC2")

$scriptblock = {
    Connect-VIServer -Credential $cred -Server $args
    Get-VM | Select-Object -First 50
    Disconnect-VIServer
}

$joblist = @()
foreach ($vc in $vclist) {
    $joblist += Start-Job -ScriptBlock $scriptblock -ArgumentList $vc -Name $vc
    Start-Sleep 5
}

This assumes the PowerCLI module is loaded at the start of each PowerShell session and that the $cred variable is already set and available in the scriptblock, which it is in my case.  After these lines, I would generally use Receive-Job to get the results of Get-VM.
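For reference, a minimal sketch of how the collection step might look afterwards (the property list and output path are illustrative, not from my actual report):

```powershell
# Wait for all jobs started above, then gather their output per VC.
# $joblist is the array of jobs built in the loop above.
Wait-Job -Job $joblist | Out-Null

$results = foreach ($job in $joblist) {
    # Tag each VM record with the VC (the job name) it came from
    Receive-Job -Job $job |
        Select-Object @{N='VC';E={$job.Name}}, Name, PowerState, NumCpu, MemoryMB
}

Remove-Job -Job $joblist
$results | Export-Csv -NoTypeInformation "d:\test\vm-report.csv"
```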

While watching Task Manager, I can see the job start and powershell.exe load.  Then I can see the CPU/memory for the job jump while the PowerCLI module loads, connects to the VC, and runs Get-VM.  As soon as Get-VM runs, the CPU usage for the background job drops to 0% and never moves again.

I have since switched to exporting a CSV from each VC, then picking the CSVs up and combining them into one report, so I have a workaround.  But since it works on 2008 R2, I'd like to see the same result on Server 2012 as well.  :)
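In case it helps anyone, the combine step is just importing each per-VC CSV and re-exporting; roughly this (assuming each job wrote d:\test\&lt;VCname&gt;.csv as in the scriptblock further down):

```powershell
# Merge the per-VC CSV files written by each job into one report.
$vclist = @("VC1","VC2")

$all = foreach ($vc in $vclist) {
    Import-Csv "d:\test\$vc.csv" |
        Select-Object @{N='VC';E={$vc}}, *   # prepend the source VC as a column
}

$all | Export-Csv -NoTypeInformation "d:\test\vm-report.csv"
```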

Thanks in advance for any help you can give on this issue,

Sam

8 Replies
LucD
Leadership

Perhaps I'm missing something, but where does the $cred variable get populated?

When I leave out the Credential parameter (and rely on SSO), and run your script on a W2K12, it finishes without a problem.


Blog: lucd.info  Twitter: @LucD22  Co-author PowerCLI Reference

srpursch
Contributor

Thanks for the reply LucD.

Part of my profile has a bunch of functions to make using PowerCLI for the non-scripting members of my team easier.  It allows for local, encrypted storage of a user's credentials among other things.  So $cred is populated automatically when the connect function is used.  I tried modifying the script I posted and eliminated the profile.  Here is what I now have:

$scriptblock = {
    Add-PSSnapin VMware.VimAutomation.Core
    Connect-VIServer -Server $args
    Get-VM | Select-Object -First 50 | Sort-Object Name
    Disconnect-VIServer
}

$joblist=@()
foreach ($vc in $vclist){
        $joblist+=Start-Job -ScriptBlock $scriptblock -ArgumentList $vc -Name $vc
        sleep 5
}

Once again, the jobs process for a while, then drop to 0%, never to return.

It works fine if I change the scriptblock to write to a file instead (using $args[0] for the VC name, since $vc isn't visible inside the job):

$scriptblock = {
    Add-PSSnapin VMware.VimAutomation.Core
    Connect-VIServer -Server $args
    Get-VM | Select-Object -First 50 | Sort-Object Name | Export-Csv -NoTypeInformation "d:\test\$($args[0]).csv"
    Disconnect-VIServer
}


Thanks,

Sam

LucD
Leadership

Are you running this against a big environment?

In other words, are we talking about a few VMs, or thousands of VMs returned by Get-VM?

You might be exhausting the memory per shell. Check the default with

dir wsman:\localhost\Shell\MaxMemoryPerShellMB

You can increase the default setting (1 GB) by doing

Set-Item wsman:\localhost\Shell\MaxMemoryPerShellMB 2048

You'll get a warning about the endpoints. Check the setting with

dir wsman:\localhost\Plugin\microsoft.powershell\Quotas

You can increase that one as well

Set-Item WSMan:\localhost\Plugin\Microsoft.PowerShell\Quotas\MaxMemoryPerShellMB 2048

And then restart the WinRM service with

Restart-Service winrm

It's a wild shot, but does that make a difference ?



roxton_
Contributor

Hi!

I'm having a similar problem with PowerCLI 5.5.

I've tried to execute my scripts on both Windows Server 2008 R2 and 2012.

My scripts are in the attachments to this message.

A little bit of details:

The scripts are used to deploy multiple machines at a time from their templates.

All the information about the machines is stored in environment.xml, and the user has to edit it first so the scripts can get all the necessary information from it.

Execution starts from vCenterAutodeployment.ps1, which gets the info from environment.xml and creates a job for every single machine deployment (you can use Run.bat or Run.ps1 to start).

vCenterFunctions.ps1 contains the logic for a single machine deployment.

Functions.ps1 also contains some logic for after-deployment procedures.

Logging.ps1 creates and fills log files.

PROBLEM:

Everything works great until it's time to execute a cmdlet whose execution time is at least 5 seconds (e.g. New-VM or Stop-VM; start the execution and watch the logs).  If I don't use jobs, these cmdlets work great.

Could someone help me with that?

roxton_
Contributor

Solution found: Start-Job -RunAs32 -PSVersion 2.0

Do not forget to set the execution policy for the 32-bit PowerShell/PowerCLI.
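Put together, a sketch of what that looks like ($vc and the scriptblock body are placeholders; -RunAs32 and -PSVersion are the actual Start-Job parameters):

```powershell
# One-time setup, from an elevated 32-bit PowerShell (SysWOW64) prompt:
#   Set-ExecutionPolicy RemoteSigned
# so the 32-bit host is allowed to run scripts at all.

$job = Start-Job -RunAs32 -PSVersion 2.0 -Name $vc -ArgumentList $vc -ScriptBlock {
    Add-PSSnapin VMware.VimAutomation.Core
    Connect-VIServer -Server $args[0]
    Get-VM | Select-Object -First 50
    Disconnect-VIServer -Confirm:$false   # skip the confirmation prompt inside a job
}
```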

LucD
Leadership

That is indeed a solution, but don't forget that you will also be limited to the PowerShell v2 cmdlets in the script you launch this way.



srpursch
Contributor

Roxton, thanks for the solution.  I've been beating my head against this for some time but hadn't had time to sit down and figure it out.  I'm testing the -PSVersion switch on my scripts now.

LucD, if limiting the command set to v2.0 works, does that indicate an issue introduced after v2.0?  Is that something we can take back to Microsoft or VMware to fix?

LucD
Leadership

I'm not sure where exactly the problem is located, but it has been reported on multiple occasions.

Did you check all the prerequisites in the PowerCLI 5.5 Release Notes, especially the .NET versions?

As you can see in those Release Notes, PowerShell v3 and Windows Server 2012 are supported, so you could open a support call with VMware.


