As Microsoft Azure continues to evolve and is adopted by more and more customers, so has the requirement grown to ensure that more users have access to the services on offer. In a world of ‘least privileged access’, Azure PIM plays an important role in ensuring that access is only delegated as and when it is needed, as opposed to remaining in place all of the time.
Below we will walk through a basic setup of how we can make PIM work for us!
Wait….. first, let’s check licensing!
PIM is a great service but it’s not native to an Azure tenant, so you will need one of the following licences in place (or at least a trial to start with!):
Azure AD Premium P2
Enterprise Mobility + Security (EMS) E5
Microsoft 365 E5
PIM Walkthrough……
1. Start by searching for the PIM service in Microsoft Azure
2. Select Azure AD roles
3. Now let’s assign eligibility to a role
4. Click Add Assignment
5. Select an Azure AD Role and then assign this to a User or Group for access. Click ‘Next’
6. On the ‘Assignments’ page, configure accordingly. In this example, I am granting 14 days of access to elevate to ‘User Administrator’ for the user Peter Martin. Click ‘Assign’
7. We can also amend the role settings for access. In this example we will edit the elevation time from 8 hours down to 2 hours
Now let’s see what it’s like for the user signing in
8. Sign in as the user, navigate to the PIM service and select ‘My Roles’
9. As we can see, there are two roles we can potentially activate. For the purpose of this article, we will select ‘User Administrator’. Select ‘Activate’
10. As we can see, there is scope to request 2.0 hrs and we are required to input a reason for auditing purposes. Click ‘Activate’ once ready
11. The request will be submitted and upon successful approval, you are required to sign out and back in to Microsoft Azure
12. Signed back in, under my assigned roles we can see the newly assigned role
And that’s it! A nice basic example, but one that you can take and build on to plan and design a PIM strategy that works best for you and your organisation.
I had a scenario on a client site where I needed to ensure the ‘Disable BitLocker’ action did not run for virtual machines. I did not have the MDT Toolkit running ‘In OS’, so I could not pull in the ‘IsVM’ variable; instead I wanted to exclude based on attributes retrievable from WMI.
Instead of the normal ‘like’ statements I used the queries below, which you could adapt:
SELECT * FROM Win32_ComputerSystem WHERE NOT Model LIKE "%VMware Virtual Platform%"
SELECT * FROM Win32_ComputerSystem WHERE NOT Model LIKE "%Virtual Machine%"
For me this covered both VMware and Hyper-V virtual machines.
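If you want to double-check the exact Model string a device reports before relying on these conditions, a quick query from a command prompt using the built-in WMIC tool will show it:

wmic computersystem get manufacturer,model

Hyper-V guests report a Model of ‘Virtual Machine’ and VMware guests ‘VMware Virtual Platform’, which is what the two NOT LIKE clauses above key on.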
Creating custom UDI pages for ConfigMgr 2012 R2 Task Sequences is something I do far too infrequently to remember all the details.
One item that seems less well documented is adding pictures, so I thought I would add a quick blog.
In the UDI Wizard Designer, on your ‘Build Your Own Page’ add a Bitmap control and set its Source property to images/filename.bmp
Then copy your selected .bmp files to mdt_toolkit_package\Tools\x64\Images and mdt_toolkit_package\Tools\x86\Images.
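As a sketch, assuming a toolkit package source of \\server\source$\MDT2013_Toolkit and a file named CustomLogo.bmp (both of which you would swap for your own), the copy could be scripted as:

xcopy /y "C:\Branding\CustomLogo.bmp" "\\server\source$\MDT2013_Toolkit\Tools\x64\Images\"
xcopy /y "C:\Branding\CustomLogo.bmp" "\\server\source$\MDT2013_Toolkit\Tools\x86\Images\"

The Source property in the wizard would then be images/CustomLogo.bmp.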
Save your changes in the UDI wizard designer and update the MDT tools package in SCCM Console 🙂
A common troubleshooting step to replicate how ConfigMgr operates, prior to testing out deployments, is to determine how an application reacts when running as System.
A useful tool in the armoury here is PsExec from the Sysinternals PsTools suite.
Launch a command prompt and run PsExec as follows:
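A typical invocation (accept the Sysinternals EULA on first run) is:

psexec.exe -i -s cmd.exe

The -i switch makes the session interactive on your desktop and -s runs it as the Local System account; running whoami in the resulting prompt should return ‘nt authority\system’, and you can then launch your install from there in the same context ConfigMgr would use.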
Recently I was working for a client on a Surface Pro 3 project with a very clean foundation platform, as I like to do for a dynamic build. One disadvantage (which would fill a blog of opinions on its own) is the Office install, which can often take a long time.
I had a scenario whereby, during this step, my Surface Pros would drift off to sleep and would simply ‘sit’ in the build process until an action on the keyboard or mouse woke them. They would then carry on.
After a bit of digging, it turns out that during the OSD process the Balanced power scheme is applied by default from Windows 8.1 onwards, which includes the 10 minute sleep timeout. As the Office install takes longer than this, it impacted the devices whether plugged in or not.
To combat this issue I placed a conditional step in the Task Sequence to remedy the situation:
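A typical way to do this is a ‘Run Command Line’ step that switches the device to the High Performance plan, which has no sleep timeout on mains power (the GUID below is the built-in High Performance scheme):

powercfg.exe /setactive 8c5e7fda-e8bf-4a96-9a85-a6e23a8c635c

Alternatively, powercfg.exe /change standby-timeout-ac 0 simply disables the sleep timeout on whichever plan is active.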
This will allow the build process to continue uninterrupted, and the normal GPP power settings will apply once you get into Windows, whatever your power policies are.
One new feature introduced from ConfigMgr 2012 SP1 onwards is a Task Sequence variable known as SMSTSPostAction.
This variable is used to store one (yes, just one) post-action command, which runs after the Task Sequence completes and so bridges the Task Sequence and the Windows environment.
The variable should be set during the Task Sequence (perhaps near the end).
The scenario in which I have found it most useful is a simple restart command, which will cause Group Policy to apply (particularly useful for GPPs), as this acts differently to the ‘Restart Computer’ action initiated by ConfigMgr.
Example of use:
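A sketch of the step, using a ‘Set Task Sequence Variable’ action (the 60-second delay is just an example and can be adjusted):

Task Sequence Variable: SMSTSPostAction
Value: cmd.exe /c shutdown.exe /r /t 60 /f

The command runs once the Task Sequence has finished and cleaned up, so the restart happens in the full OS rather than inside the Task Sequence environment.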
The example I have used it for is simply to restart the machine, allowing my Group Policies and GPPs to apply after the Windows 8.1 Task Sequence has finished, saving the support team from initiating a restart.
Hope this helps, and I am sure it can provide a multitude of uses.
A look through a number of my blogs would give the impression I am none too fond of the ConfigMgr 2012 Application model 🙂
Whilst I accept its positives, the negatives are also there for all to see.
Now to the point in hand.
If you are attempting to install an application which by all accounts appears to install, but the below error displays in Software Center:
This error is essentially ConfigMgr 2012’s way of telling you that, although the application installed successfully, it is unable to validate the detection method, resulting in the failure. More evidence is shown in AppEnforce.log
To remediate the issue you will need to find a detection method that is acceptable. This mainly affects non-MSI based products; in my example, RSAT for Windows 8.1 and Server 2012 is installed via an MSU, so I needed a better way than my first attempt (a registry key 🙂) to get this operational.
For this example I used the software distribution path in C:\Windows as my validation, but if you imprint the machine after an install (a file or registry tattoo, for example) that would also be a viable way to pick up the install.
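As an illustration, you can confirm from a command prompt that the MSU has actually landed by querying the installed hotfixes, which also hints at what a script-based detection method could check (KB2693643 is the RSAT for Windows 8.1 update; verify the KB number against your own package):

wmic qfe where "HotFixID='KB2693643'" get HotFixID,InstalledOn

If the update is present it is listed; if not, WMIC reports that no instances are available.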
There are many, many ways to achieve the above, but the scenario where I find this fits in most is with ConfigMgr 2012’s new Application model, which, while it has its pros, certainly has a long way to go before it can completely replace packages.
The main issue I have seen on client sites has been around their use in a Task Sequence after a reboot. After the restart the machine comes online again and the agent re-establishes a connection. Particularly on machines with an SSD, everything happens very quickly and sometimes the build ends up failing with a (405) error, which is simply the Task Sequence trying to run before it can walk, so to speak.
To allow it time to get up to speed, a pause works great. The disadvantage of using a script is that it is yet another package, and if the client is having trouble communicating, downloading another package will likely hit the same issue.
This is a simple command line step utilizing PING which is right out of the box 🙂
Simply create and amend as you wish, but 1 minute is usually more than enough:
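A minimal example of the command line, using ping’s default one-second interval (61 echoes gives roughly a 60-second pause; adjust -n to suit):

cmd.exe /c ping 127.0.0.1 -n 61 > nul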
Since the release of ConfigMgr 2012, Global Conditions have helped support the Application model by letting us present the same application to users while limiting, based on criteria, which deployment type is applied.
This does work well; however, like everything in its first revision, it can be limited at times, not helped by the limited guidance on what Global Conditions are actually for.
Below illustrates a way I have used a Global Condition to tidy up a very complex deployment targeted at machines, though it can equally be used for users if adapted.
The Scenario:
I am working at a legal client who have a piece of software that uses the same base MSI, but a number of MSTs (4 in fact) produce the end result. Now, I could create 4 different applications, each linked to a collection via a deployment, and that would work perfectly fine. In this scenario, though, there are a number of components that make up this application (drivers, optional components etc.), so I am looking for a way to trim down the number of Applications/Deployments and keep it tidy whilst utilizing a sophisticated method.
Problem:
I have 1 MSI with 4 MSTs; however, the nature of how Applications work is that when AppDiscovery.log runs, the first deployment type whose criteria are satisfied is the one that will run. So how do I have all four together in 1 application, in the same way that we could have multiple programs in the same legacy package?
Potential Solution: Global Condition
Global Conditions are there to determine ‘if’ a deployment type should run, as shown below in an example for Primary Device:
So if we have our flavours of deployment delivered via AD groups, a Global Condition could be used to say ‘if’ the machine/user is a member of an AD group, it satisfies the requirement!
Great Right?????
The Problem:
At present, the official stance coming out of Microsoft is that Global Conditions are not designed for use in this way, and that they are a real-time check on whether a deployment should take place. Personally I do not see why this scenario should be any different, but please proceed with your own due diligence on this first.
The Solution:
This is formed of 2 parts. First we need to create a Global Condition which effectively runs a script to pull out what we need. In this case, I want a string of all the groups a PC is a member of:
Create a Global Condition named accordingly; set it as a Script returning a String, and make it a VBScript (you could re-write this in PowerShell of course!).
Now the script needs to be entered. This can be done from a store of Global Condition scripts if you keep one, or simply by writing directly into the screen:
The full script is listed below. You will need to add your domain short prefix where indicated (INPUTHERE):
*********CODE START*****************
'On Error Resume Next
' List the AD groups this computer is a member of
Set network = WScript.CreateObject("WScript.Network")
computername = network.ComputerName
getDNvar = GetDN(computername)
Set objGroup = GetObject("LDAP://" & getDNvar)
objGroup.GetInfo
arrMembersOf = objGroup.GetEx("memberOf")
'WScript.Echo "MembersOf:"
grplist = "MembersOf:" & vbcr
For Each strMemberOf In arrMembersOf
    ' Bind to each group DN and append its SAM account name to the list
    Set objGroup = GetObject("LDAP://" & strMemberOf)
    grplist = grplist & objGroup.SAMAccountName & vbcr
    ' WScript.Echo objGroup.SAMAccountName
    ' If Err.Number <> 0 Then
    '     WScript.Echo "No Groups Defined"
    ' End If
    'On Error GoTo 0
Next
WScript.Echo grplist

Function GetDN(strName)
    ' Constants for the NameTranslate object
    Const ADS_NAME_INITTYPE_GC = 3
    Const ADS_NAME_TYPE_NT4 = 3
    Const ADS_NAME_TYPE_1779 = 1
    ' Use the NameTranslate object to convert the NT computer name to the
    ' Distinguished Name required for the LDAP provider
    Set objTrans = CreateObject("NameTranslate")
    ' Initialise NameTranslate by locating the Global Catalog
    objTrans.Init ADS_NAME_INITTYPE_GC, ""
    ' Use the Set method to specify the NT format of the object name
    ' (INPUTHERE is your domain short prefix; the trailing $ denotes a computer account)
    objTrans.Set ADS_NAME_TYPE_NT4, "INPUTHERE\" & strName & "$"
    ' Use the Get method to retrieve the RFC 1779 Distinguished Name
    GetDN = objTrans.Get(ADS_NAME_TYPE_1779)
End Function
********* CODE END*****************
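To test the script locally before adding it to the Global Condition (the filename here is just an example), run it with cscript from a command prompt and check the group list it echoes back:

cscript.exe //nologo PCGroupMembership.vbs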
Once this has been created (and tested, as above), you can now reference it in deployment types, as in the example below:
An important note is that you MUST use ‘Contains’! This is because we are searching the returned string, and the object may well be a member of more than one group. My value is simply the name of the AD group the machine is a member of.
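As an illustration (the group name is made up for this example), the requirement on the deployment type ends up reading along the lines of:

[Your Global Condition name] Contains ‘APP-Legal-DMS-Variant1’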
As I have said above, the ‘official’ statement is that Global Conditions are not for this purpose, but I feel this adds a very useful weapon to the Application armoury as the migration away from packages occurs.
This morning I was working on an item for a client when I had to amend the IIS Request Filtering options to allow an application to deploy. After amending the config file, instead of simply restarting the site in IIS, autopilot took over and I ran the IISRESET command-line utility instead.
Anyway, fast forward an hour, and when trying to view a web report I was getting an authentication popup which, strangely, worked fine on the server itself using the same account and the same link.
After a bit of digging it turned out the following items had reset themselves and needed to be changed to make the access denied box go away:
Within IIS\[Your Web Site]\Reporting Site
Authentication
Ensure that under Windows Authentication > Advanced Settings, ‘Enable Kernel-mode authentication’ is ticked (see the command-line sketch after this list)
Also check under Providers (just to the right, in the Actions pane)
Ensure the providers are configured as shown:
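If you prefer to make the kernel-mode change from a command prompt rather than the GUI, an appcmd sketch along these lines would do it (swap ‘Default Web Site/Reports’ for your own site and reporting application path):

%windir%\system32\inetsrv\appcmd.exe set config "Default Web Site/Reports" -section:system.webServer/security/authentication/windowsAuthentication /useKernelMode:true /commit:apphost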
I’ll make sure I use the website restart next time 🙂