Exception when deploying appliance OVF

I am trying to deploy an OVF from vRO 7.3.  I followed the example at

but things blow up in the readAll() call. If I point it at a small file it works fine, but if I point it at the OVA, it fails. The code I am using is:


var ovfManager = vCenter.ovfManager;
System.debug("OVF id is " +;

var ovfparms = new VcOvfParseDescriptorParams();
ovfparms.locale = "";

System.debug("Opening file " + filePath);
var file = new FileReader(filePath);;
System.debug("File open.  Reading data");

var ovfDescriptor = file.readAll();
System.debug("Data read.  composing ovfD");

var ovfD = String(ovfDescriptor);
ovfD = ovfD.trim();

var ovfInfo = ovfManager.parseDescriptor(ovfD, ovfparms);


It never gets to the line after the readAll() call. If I use that small file I mentioned, it does reach the "Data read" line but, of course, blows up later in the OVF parsing (it's not a real OVA file).

The exception I am getting looks strange as well.


[2018-07-17 11:16:45.483] [D] OVF id is OvfManager

[2018-07-17 11:16:45.485] [D] Opening file /storage/spool/vRAOVA/VMware-vR-Appliance-

[2018-07-17 11:16:45.486] [D] File open.  Reading data

[2018-07-17 11:16:47.715] [E] Workflow execution stack:


', state: 'failed', business state: 'null', exception: 'null'

*** End of execution stack.


Virtually no real content to it.

Any insight would be appreciated.


Carl L.

2 Replies

An OVA is just a tar file with the OVF descriptor and other data inside. Extract the OVA into its constituent files and point your code at the OVF instead.
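To illustrate the point above, here is a minimal sketch in plain Node.js (not vRO scripting, which has no tar support) of how an OVA's tar structure can be walked to locate the .ovf entry. The entry names and helper functions are hypothetical, and the archive is built in memory purely for the demo:

```javascript
// An OVA is a POSIX tar archive; the OVF descriptor is normally the
// first entry, named something like "appliance.ovf".

// Build a minimal 512-byte tar header so the example is self-contained.
function makeTarHeader(name, size) {
  const block = Buffer.alloc(512);
  block.write(name, 0, 100, "ascii");                // entry name field
  block.write(size.toString(8).padStart(11, "0") + "\0", 124, 12, "ascii"); // size, octal
  return block;
}

// Walk the tar headers and return the name of the first *.ovf entry.
function findOvfEntry(buf) {
  let offset = 0;
  while (offset + 512 <= buf.length) {
    const name = buf.toString("ascii", offset, offset + 100).replace(/\0.*$/, "");
    if (name === "") break;                          // zero block ends the archive
    const size = parseInt(buf.toString("ascii", offset + 124, offset + 136), 8);
    if (name.toLowerCase().endsWith(".ovf")) return name;
    // Skip header plus content, rounded up to the next 512-byte block.
    offset += 512 + Math.ceil(size / 512) * 512;
  }
  return null;
}

const archive = Buffer.concat([
  makeTarHeader("appliance.ovf", 0),
  makeTarHeader("disk1.vmdk", 0),
  Buffer.alloc(1024),                                // end-of-archive marker
]);
console.log(findOvfEntry(archive));                  // → appliance.ovf
```

In practice you would extract the OVA with a tar tool on the vRO appliance and pass the extracted .ovf file's contents to parseDescriptor().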

VMware Employee

OVA files are binary files. FileReader and the related vRO scripting classes are meant to be used with text files. It doesn't make sense to read a binary file as an array of text strings with readAll(); depending on the file content you will get unpredictable results. In general, vRO is not well suited to dealing with binary content.

Another problem is that OVA files are big, usually hundreds of megabytes or more, so reading such a big file in one call can fail because there is not enough free heap memory.
