I can't see any reason at all to go out of your way to use SCCM with App Volumes. Just another layer of complexity that isn't needed.
I guess it depends on what you are trying to achieve.
If you have physical desktops and virtual desktops and only want to package applications once, I could see a good reason to go for SCCM when installing applications.
The only thing is that you need to make sure the SCCM agent doesn't "bloat" your AppStack with unnecessary information. I would check whether any files from the agent are included in the AppStack and, if so, make sure to filter them out.
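As a sketch of what that filtering might look like: App Volumes reads exclusions from the snapvol.cfg file in the volume template, so you could exclude the ConfigMgr client's files, registry keys and process from capture. The exact paths below are assumptions based on the default ConfigMgr client install locations and should be verified against your own environment:

```
# snapvol.cfg excerpt - hypothetical exclusions, verify paths for your SCCM client
exclude_path=\Windows\CCM
exclude_path=\Windows\ccmcache
exclude_path=\Windows\ccmsetup
exclude_registry=\REGISTRY\MACHINE\SOFTWARE\Microsoft\CCM
exclude_process_name=CcmExec.exe
```

Excluding the agent's process name as well as its paths helps keep writes made by the SCCM client itself out of the AppStack during provisioning.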
On the other hand, SCCM typically distributes applications as MSI packages. You could just as well take that MSI and install it manually, exactly as SCCM would. You'd still get the benefit of only "packaging" once, but without the downside of agent bloat in the AppStack and the overhead of managing it there.
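For example, a silent install of the same MSI during AppStack provisioning might look like the sketch below. The package path, log path and options are placeholders; the msiexec switches are the standard Windows Installer ones:

```powershell
# Hypothetical example: install the MSI SCCM would deploy, silently,
# while the provisioning machine is in capture mode.
Start-Process msiexec.exe -Wait -ArgumentList @(
    '/i', 'C:\Packages\ExampleApp.msi',  # placeholder path to the extracted MSI
    '/qn',                               # quiet install, no UI
    '/norestart',                        # suppress reboot during capture
    '/l*v', 'C:\Logs\ExampleApp.log'     # verbose log for troubleshooting
)
```

Running it with -Wait mirrors what the SCCM client would do, but keeps the ConfigMgr agent itself out of the captured volume.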
One of the main reasons for using SCCM is to ensure consistency of the build. If there are 100 steps involved in making a "good" image, then the vast majority of these can be mandated and controlled by SCCM, thus ensuring that each step is done consistently, accurately and in perfect order. Doing that many steps manually each time almost guarantees something will get missed or forgotten.
Another good reason is that steps ordered in SCCM are vastly easier to diagnose, interpret, adjust and review than scripts. This is a generalised statement, and many of the people in these forums would argue with that reasoning. But I'm talking from a business ownership and complexity point of view, not from your average VMware engineer's point of view. You may be awesome at scripts, but the guy replacing you two years from now might not be. Using scripts carries its own set of risks that many businesses are not comfortable accepting.
I totally understand the use case, we are looking at doing the same thing but we don't use SCCM for deploying applications, we use a tool called Scense.
I would say go for it. Create one AppStack and install the application(s) using SCCM. After creating it, I'd suggest attaching it to a machine without an App Volumes agent (attach as non-persistent; once you seal the AppStack it is read-only) to check what's in there. I think this will work. As far as I can see there is no apparent reason for it not to.
Whilst an interesting point of view, I'm not sure you have answered my actual questions, which, in the context of someone such as yourself who appears to be using SCCM with App Volumes to build master images for a virtual desktop environment, were:
I'd like to know if many people are using System Center Configuration Manager to build their master images and AppStacks? If you are, have you found it works well, and does it introduce any of its own problems?
In response to your other points:
- One of the main reasons for using SCCM is to ensure consistency of the build.
A script would be used for the exact same purpose. Both a scripted build and an SCCM build would deliver similar consistency, with a script inherently unable to do anything other than execute the exact same way every time it is run.
Also, I would assume most people using SCCM to build master images in a VDI environment are doing so in an environment originally set up for physical machines. This would likely result in the re-use of deployment task sequences originally meant for multiple physical devices from many different vendors, now being retrofitted to create a single virtual machine master image. That would require the reasonably complex task of removing all of the unnecessary drivers, vendor software and unnecessary deployment tasks, then regression testing to ensure nothing has been broken as a result.
I cannot see the advantage of creating and maintaining a new SCCM Deployment Task Sequence such as this simply to build a single virtual machine. Unlike deploying to thousands of different physical devices from many different vendors we are simply working with a single VM with a known, static (virtual) hardware configuration. A simple script that only completes the required build tasks to create a Virtual Desktop Master image sounds much simpler, quicker and far less complex. One would hope the number of 'steps' taken by a build script for a single machine would be quite limited and that the vast majority of customisations and settings are not being deployed at image build but rather are being managed centrally via Group Policy and UEM.
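As a rough sketch of the kind of simple build script being described, something like the following could cover a known, static VM configuration; every path, package name and step here is a hypothetical placeholder, not a prescribed build order:

```powershell
# Hypothetical master-image build script - all paths and names are placeholders.
$ErrorActionPreference = 'Stop'   # fail fast so a missed step is never silently skipped
$log = 'C:\Build\build.log'

function Invoke-BuildStep($Name, [scriptblock]$Action) {
    "$(Get-Date -Format s)  $Name" | Tee-Object -FilePath $log -Append
    & $Action   # each step runs in the same order on every build
}

Invoke-BuildStep 'Install App Volumes agent' {
    Start-Process msiexec.exe -Wait -ArgumentList '/i C:\Build\AVAgent.msi /qn /norestart'
}
Invoke-BuildStep 'Apply OS optimisations' { & 'C:\Build\Optimise-Image.ps1' }
Invoke-BuildStep 'Install Windows updates' { & 'C:\Build\Install-Updates.ps1' }
```

Because the script halts on the first error and logs each step, it gives the same "consistent, ordered steps" property attributed to an SCCM task sequence, without the agent or server infrastructure.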
- If there are 100 steps involved in making a "good" image, then the vast majority of these can be mandated and controlled by SCCM, thus ensuring that each step is done consistently, accurately and in perfect order.
Again, a script would also achieve the same consistency without the need for an agent, supporting server infrastructure or the additional licensing costs.
- Doing that many steps manually each time almost guarantees something will get missed or forgotten.
Manual builds were not part of my original statement and are not relevant to this discussion.
- Another good reason is that steps ordered in SCCM are vastly easier to diagnose, interpret, adjust and review than scripts.
This is a generalised statement, many of the people in these forums would argue with that reasoning.
A script can be reviewed, interpreted and adjusted extremely quickly. I'd suggest making a change to an SCCM deployment task sequence in an enterprise environment would be a much more difficult process and likely involve multiple parties. A well written and commented script is very easy to review and interpret.
- But I'm talking from a business ownership and complexity point of view, not from your average VMware engineers point of view.
'Business ownership' of infrastructure platforms like App Volumes, SCCM, the Windows image and even scripts sits with the IT team within any business/enterprise. Upper-level management of any business, even an IT business, should not be interested in trivial items such as how an AppStack is sequenced or whether IT has chosen to use SCCM, App Volumes or a script to facilitate internal IT processes.
In any case, a script, be it PowerShell or something else, is inherently less complex and expensive to maintain than an SCCM infrastructure and all of its associated components.
- You may be awesome at scripts, but the guy replacing you two years from now might not be.
If that is the case then he should not have been hired. Any new employee should be capable of meeting the skillset of the person they are replacing. If not, then they are not actually replacing the role that has been vacated; they are in fact filling a more junior role, and accommodation should be made for additional external support while this new employee is trained. If it is the quality of the script or documentation that is at issue, then the error lies in the hiring of the original employee, not in the chosen technology.
The potential poor future hiring practices and decisions of HR should not be relevant when IT chooses a technology. The risk that the business may make a poor hiring decision in the future should not be considered if the technology is the right choice to efficiently and cost-effectively deliver on the business requirements, AND sufficient skills exist in the marketplace to support it into the future. In the long term it will be easier to replace or train an employee with inadequate skills than it will be to recover from a poorly chosen solution. The costs of a cumbersome or inefficient process quickly add up, and even talented employees won't be able to compensate for a poor past choice of technology.
- Using scripts carries its own set of risks that many businesses are not comfortable accepting.
The use of scripts is not something 'the business' should be considering acceptance of. It is the role of the business (I assume by business you mean the CEO/Board/Senior Management Team) to determine their business needs/requirements and then choose the right person/company/product to deliver them for the budget they have set. That person/company/product should be free to utilise any technology that best suits the situation as long as the requirements are met, the budget not exceeded and capacity exists to support it into the future. If the business is being asked to consider whether to 'accept the risk of using scripts' then they have chosen the wrong person/company/product to deliver their solution.
Scripting, and in the case of the solution I work with, PowerShell scripting, is hardly a risky technology choice, nor one for which skilled resources aren't readily available.
We are exploring this process. Did this method of creating AppStacks from SCCM deployments work well for you?