Integrating Azure AD and AWS – Part 4

We've reached the end of the road for my series on integrating Azure Active Directory (Azure AD) and Amazon Web Services (AWS) for single sign-on and role management. In part 1 I walked through the many reasons the integration is worth looking at if your organization is consuming both clouds. In part 2 I described the lab I used for this series, described the different ways application identities (service accounts for those of you in the Microsoft space) are handled in Active Directory Domain Services versus Azure AD, and walked through what a typical application identity looks like in Azure AD. In part 3 I walked through a portion of the configuration steps, did a deep dive into the Azure AD and AWS federation metadata, examined a SAML assertion, and configured the AWS end of the federated trust through the AWS Management Console. This included creation of an identity provider representing the Azure AD tenant and creation of a new IAM role for users within the Azure AD tenant to assert.

In this final post I'll cover the remainder of the configuration, describe the "provisioning" capabilities of Azure AD in this integration, and point out some of the issues with the recommended steps in the Microsoft tutorial.

Before I continue with the configuration, let me cover what I’ve done so far.

  • Part 2
    • Added the AWS application from the Azure AD Application Gallery through the Azure Portal.
  • Part 3
    • Assigned an Azure Active Directory user to the application through the Azure Portal.
    • Configured Azure AD to pass the Role and RoleSessionName claims through the Azure Portal.
    • Created the SAML identity provider representing Azure AD in the AWS Management Console.
    • Created an AWS IAM Role and associated it with the identity provider representing Azure AD in the AWS Management Console.

At this point JoG users can assert their identity to their heart's content, but we don't have a listing of the AWS IAM roles stored in Azure AD for our users to assert.  So how do we assert a role from Azure AD if the listing of the roles exists in AWS?  The wonderful concept of application programming interfaces (APIs) swoops in and saves the day.  Don't get me wrong, if you hate yourself you can certainly provision them manually by modifying the application manifest file every time a new role is created or deleted.  However, there is an easier route of having Azure AD pick up those roles directly from AWS on an automated schedule.  How does this work?  Well, nothing explains it better than demonstrating how the roles can be queried from the AWS API.

The AWS SDK for .NET makes querying the API incredibly easy.  We're not stuck worrying about assembling the request and signing it.  As you can see below, the script is six lines of code in PowerShell.

Script.png
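
Since the screenshot doesn't reproduce well here, below is a rough sketch of what such a script can look like using the AWS SDK for .NET from PowerShell. Treat it as illustrative only; the assembly paths, region, and credential values are placeholders rather than the exact ones from my lab.

# Load the AWS SDK for .NET assemblies (paths are assumptions; point them at your local install)
Add-Type -Path "C:\Program Files (x86)\AWS SDK for .NET\bin\Net45\AWSSDK.Core.dll"
Add-Type -Path "C:\Program Files (x86)\AWS SDK for .NET\bin\Net45\AWSSDK.IdentityManagement.dll"

# Build an IAM client using an access key that has read-only IAM rights (placeholder values)
$credentials = New-Object Amazon.Runtime.BasicAWSCredentials("ACCESS_KEY_ID", "SECRET_ACCESS_KEY")
$iamClient = New-Object Amazon.IdentityManagement.AmazonIdentityManagementServiceClient($credentials, [Amazon.RegionEndpoint]::USEast1)

# ListRoles returns every IAM role in the account, including the AzureADEC2Admins role
$iamClient.ListRoles().Roles | Select-Object RoleName, Arn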

The result is a listing of the roles configured in AWS which includes the AzureADEC2Admins role I created earlier.  This example demonstrates the power a robust API brings to the table when integrating cloud services.

2

When Microsoft speaks of provisioning in regards to the AWS integration, they are talking about provisioning the roles defined in AWS into the application manifest file in Azure AD.  This provides us with the ability to assign the roles from within the Azure Portal as we'll see later.  This differs from many of the Azure AD integrations I've observed in the past, where Azure AD provisions a record for the user into the software as a service (SaaS) offering.  Below is a simple diagram of the provisioning process.

3

To support provisioning we need to navigate to the AWS Management Console, open the Services menu, and select IAM.  We then select Users and hit the Add User button.  I named the user AzureAD, gave it the programmatic access type, and attached the IAMReadOnlyAccess policy.  AWS then presented me with the access key ID and secret access key I'll need to provide to Azure AD.  Yes, we are going to follow security best practices and give the account the minimum rights and permissions it needs to provide the functionality.  The Microsoft tutorial instructs you to generate the credentials under the context of the AWS administrator, effectively giving the application full rights to the AWS account.  No Microsoft, just no.
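
If you'd rather script that step, the sketch below shows roughly how it can be done with the AWS Tools for PowerShell. The cmdlet names and the managed policy ARN are to the best of my knowledge, so double check them against the AWS documentation before running anything.

# Create the service account Azure AD will use for provisioning
New-IAMUser -UserName AzureAD

# Attach the AWS-managed read-only IAM policy (least privilege for reading the role definitions)
Register-IAMUserPolicy -UserName AzureAD -PolicyArn "arn:aws:iam::aws:policy/IAMReadOnlyAccess"

# Generate the access key ID and secret access key to hand to Azure AD
New-IAMAccessKey -UserName AzureAD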

I next bounce back to the Azure portal and to the AWS application configuration.  From here I select the Provisioning option, switch the drop-down box to Automatic, and plug the access key ID into the clientsecret field and the secret access key into the secret token field.  A quick test connection shows success and I then save the configuration.  Note that you must first save the configuration before you can turn on the synchronization.

4

After the screen refreshes I move down to the Settings section and turn the Provisioning Status to On and set the Scope to Sync only assigned users and groups (kind of a moot point for this, but oh well).  I then Save the configuration once again and give it about 10 minutes to pull down the roles.

I then navigate back to the Users and Groups section and edit the Rick Sanchez assignment.  Hitting the role option now shows me the AzureADEC2Admins role I configured in AWS IAM.


5.png

Let's take another look at the service principal representing the AWS application in PowerShell.  Using the Azure AD PowerShell cmdlets I referenced in part 2, we connect to Azure AD and run the Get-AzureADServicePrincipal cmdlet, which shows the manifest has been updated to include the newly synchronized application role.

6
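
If you want to poke at it yourself, something along these lines will do it (the search string is an assumption; use whatever display name the AWS application has in your tenant):

# Connect and locate the service principal for the AWS gallery application
Connect-AzureAD
$awsSp = Get-AzureADServicePrincipal -SearchString "Amazon Web Services"

# Each synchronized AWS IAM role appears as an AppRole; the Value carries the role information used for the Role claim
$awsSp.AppRoles | Select-Object DisplayName, Value, Id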

We've configured the SAML trust on both ends, defined the necessary attributes, set up synchronization, and assigned Rick Sanchez an IAM role. In a moment we'll demonstrate all of the pieces coming together.

Before I wrap it up, I want to quickly mention a few issues I ran into with this integration that seemed to resolve themselves without any intervention.

  1. Until a few nights ago I was unable to get the provisioning piece working.  I'm not putting it past user error (this is me we're talking about), but I tried numerous times without success before it finally worked a few nights ago.  I also noticed from some recent comments in the Microsoft tutorial that people are reporting similar errors.  Maybe something broke for a bit?
  2. The value of the audience attribute in the audienceRestriction section of the SAML assertion generated by Azure AD doesn't match the identifier within the AWS federation metadata.  Azure AD inserts a garbage-looking audience value by default, which was causing the assertions to be rejected by AWS.  After setting the identifier to the value of urn:amazon:webservices as referenced in the AWS federation metadata, the assertion was consumed without issue.  I saw similar complaints in the Microsoft tutorial, so I'm fairly confident this wasn't just my issue.  The story gets a bit stranger.  I wanted to demonstrate the behavior for this series by removing the identifier I had previously added.  Oddly enough, the assertion was consumed without issue by AWS.  I verified using Fiddler that the audience value was populated with that garbage entry.  Either way, I would err on the side of caution and recommend populating the identifier with the entry referenced in the AWS metadata as seen below.

    7.png

The last thing I want to point out is the Microsoft tutorial states that you are required to create the users in AWS prior to asserting their identity.  This is inaccurate, as AWS does not require a user record to be pre-created.  This is different from a majority (if not all) of the SaaS integrations I've done in the past, so this surprised me as well.  Either way, it's not required, which is a nice benefit if you've ever had to deal with the challenge of managing the identity lifecycle across cloud offerings.

Let's wrap up this series by having Rick Sanchez log into the AWS Management Console and shut down an EC2 instance.  Here I have logged into the Windows 10 machine named CLIENT running in Azure.  We navigate to https://myapps.microsoft.com and log into Azure AD as Rick Sanchez.  We then hit the Amazon Web Services icon and are seamlessly logged into the AWS Management Console.

8.png

Examining the assertion in Fiddler shows  the Role and RoleSessionName claims in the assertion.

9.png

Navigating to the EC2 Dashboard displays the instance I prepared earlier using my primary account.  Rick has full rights over administration of the instance for activities such as starting and stopping the instance.  After successfully terminating the instance I log into the AWS Management Console as my primary AWS account, go to CloudTrail, and see the log entries recording the activities of Rick Sanchez.

10.png

With that let’s cover some key pieces of information to draw from the series.

  1. The Azure AD and AWS integration differs from most SaaS integrations I've done when it comes to user provisioning.  Most of the time a user record must exist prior to the user authenticating.  There are a growing number of SaaS providers provisioning upon successful authentication as provisioning challenges hamper further consumption of cloud services, but they are still few and far between.  AWS does a solid job of eliminating the pain of pre-provisioning users.
  2. The concept of associating roles with specific identity providers is really neat on Amazon’s part.  It allows the customer to manage permissions and associate those permissions with roles in AWS, but delegate the right on a per identity provider basis to assert a specific set of roles.
  3. Microsoft’s definition of provisioning in this integration is pulling a listing of roles from AWS and making them configurable in the Azure Portal.
  4. The AWS API is solid and quite easy to leverage when using the AWS SDKs. I would like to see AWS switch from what seems to be a proprietary method of application access to OAuth to become more aligned with the rest of the industry.
  5. Don’t trust vendors to make everything point and click. Take the time to understand what’s going on in the background. In a SAML integration such as this, a quick review of the metadata can save you a lot of headaches when troubleshooting issues.

I learned a ton about AWS over these past few weeks and also got some good deep dive time into Azure AD which I haven’t had time for in a while.  Hopefully you found this series valuable and learned a thing or two yourself.

In my next series I plan on writing a simple application to consume the Cognito service offered by AWS.  For those of you more familiar with the Microsoft side of the fence, it's similar to Azure AD B2C but with some unique features Microsoft hasn't put in place yet, making it a great option to solve those B2C identity woes.

Thanks and have a wonderful holiday!

Deep dive into AD FS and MS WAP – User Certificate Authentication through a WAP

Hi everyone,

Today I continue my series of posts that cover a behind the scenes look at how Active Directory Federation Services (AD FS) and the Microsoft Web Application Proxy (WAP) interact.  In my first post I explained the business cases that would call for the usage of a WAP.  In my second post I did a deep dive into the WAP registration process (MS refers to this as the trust establishment between the WAP and AD FS).  In this post I decided to cover how user certificate authentication is achieved when the AD FS server is placed behind the WAP.

AD FS offers a few different options to authenticate users to the service including Integrated Windows Authentication (IWA), forms-based authentication, and certificate authentication.  Readers who work in environments with sensitive data where assurance of a user’s identity is important should be familiar with certificate authentication in the Microsoft world.  If you’re unfamiliar with it I recommend you take a read through this Microsoft article.

With the recent release of the National Institute of Standards and Technology (NIST) Digital Identity Guidelines 800-63, which rework the authenticator assurance levels (AAL) and relegate passwords to AAL1 only, organizations will be looking for other authenticator options.  Given the maturity of authenticators that make use of certificates, such as the traditional smart card, it's likely many organizations will look at opportunities for how the existing equipment and infrastructure can be further utilized.  So it's all the more important that we understand how AD FS certificate authentication works.

I’ll be using the lab I described in my first post.  I made the following modifications/additions to the lab:

  • Configured the Active Directory Certificate Services (AD CS) certificate authority (CA) to include a certificate revocation list (CRL) distribution point (CDP).  The CRLs will be served up via an IIS instance with the address crl.journeyofthegeek.com.  This is the only CDP listed in the certificates.  Certificates created during my original lab setup that are installed within the infrastructure do not include a CDP.
  • Added a non-domain-joined Windows 10 computer which will be used as the endpoint the test user accesses the federation service from.

Tool-wise I used ProcMon, Fiddler, API Monitor, and WireShark.

So what did I discover?

Prior to doing any type of user interaction, I set up the tools I would be using moving forward.  On the WAP I started ProcMon as an Administrator and configured my filters to capture only TCP Send and TCP Receive operations.  I also set up WireShark using a filter of ip.addr==192.168.100.10 && tcp.port==80.  The IP address is the IP of the web server hosting my CRLs.  This would ensure I'd see the name of the process making the connection to the CDP as well as the conversation between the two nodes.

pic1

** Note that the machine will cache the CRLs after they are successfully downloaded from the CDP.  It will not make any further calls until the CRLs expire.  To get around this behavior while I was testing I ran the command certutil -setreg chain\ChainCacheResyncFiletime @now as outlined in this article.   This forces the machine to pull the CRLs again from the CDP regardless of whether or not they are expired.  I ran the command as the LOCAL SYSTEM security principal using psexec.
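
For reference, this is roughly what that looks like; psexec's -s switch runs the command under the LOCAL SYSTEM context.

psexec -s certutil -setreg chain\ChainCacheResyncFiletime @now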

The final step was to start Fiddler as the NETWORK SERVICE security principal using the command psexec -i -u “NT AUTHORITY\Network Service” “C:\Program Files (x86)\Fiddler2\Fiddler.exe”.  Remember that Fiddler needs the public key certificate in the appropriate file location as I outlined in my last post.  Recall that the Web Application Proxy Service and the Active Directory Federation Service running on the WAP both run as that security principal.

Once all the tools were in place I logged into the non-domain-joined Windows 10 box, opened up Microsoft Edge, and popped the username of my test user into the username field of the Office 365 login page.

pic2.png

After home realm discovery occurred within Azure AD, I received the forms-based login page of my AD FS instance.

 

pic3.png

Let’s take a look at what’s happened on the WAP so far.

In the initial HTTP CONNECT session the WAP makes to the AD FS farm, we see the ClientHello of the TLS handshake occur, where the WAP presents its client certificate to authenticate itself to the AD FS server as described in my last post.

pic4.png

Once the secure session is established the WAP passes the HTTP GET request to the AD FS server.  It adds a number of headers to the request which AD FS consumes to identify that the client is coming through the WAP.  This information is used for a number of AD FS features such as enforcing additional authentication policies for Extranet access.

pic5.png

The WAP also passes a number of query strings.  There are a few interesting query strings here.  The first is the client-request-id, which is a unique identifier for the session that AD FS uses to correlate event log errors with the session.  The username is obvious and shows the user's user principal name that was entered in the username field at the O365 login page.  The wa query string shows a value of wsignin1.0 indicating the usage of WS-Federation.  The wtrealm indicates the relying party identifier of the application, in this case Azure AD.

pic6
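
To make the shape of that request concrete, here is a reconstructed (not captured) example; the GUID and username are invented, and the wtrealm value shown is the urn:federation:MicrosoftOnline identifier Azure AD presents as a relying party.

GET /adfs/ls/?client-request-id=<GUID>&username=rick.sanchez%40journeyofthegeek.com&wa=wsignin1.0&wtrealm=urn%3Afederation%3AMicrosoftOnline&wctx=LoginOptions%3D3%26estsredirect%3D...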

The wctx query string is quite interesting and needs to be parsed a bit on its own.  Breaking down the value in the parameter we come across three unique parameters.

LoginOptions=3 indicates that the user has not selected the “Keep me signed in” option.  If the user had selected that checkbox a value of 1 would have been passed and AD FS would create a persistent cookie which would exist even after the browser closes.  This option is sometimes preferable for customers when opening documents from SharePoint Online so the user does not have to authenticate over and over.

The estsredirect contains the encoded and signed authentication request from O365.  I stared at API monitor for a few hours going API call by API call trying to identify what this looks like once it’s decoded, but was unsuccessful.  If you know how to decode it, I’d love to know.  I’m very curious as to its contents.

The WAP next makes another HTTP GET to the AD FS server, this time including the additional query string of pullStatus which is set equal to 0.  I'm clueless as to the function of this; I couldn't find anything on it.  The only other thing that changes is the referer.

My best guess on the above two sessions is that the first session is where AD FS performs home realm discovery and maybe some processing to determine if there are any special configurations for the WAP such as limited or expanded authentication options (device authN, cert authN only).  The second session is simply the AD FS server presenting the authentication methods configured for Extranet users.

The user then chooses the "Sign in with an X.509 certificate" option (I'm not using SNI to host both forms and cert authN on the same port) and the WAP then performs another HTTP CONNECT to port 49443, which is the certificate authentication endpoint on the AD FS server.  It again authenticates to the AD FS server with its client certificate prior to establishing the secure tunnel.

In the third session we see an HTTP POST to the AD FS server with the same query parameters as our previous request, but this time providing a JSON object with a key of AuthMethod and the key value combination of AuthMethod=CertificateAuthentication in the body.

pic7

The next session is another HTTP POST with the same JSON object content and the key value pairs of AuthMethod=CertificateAuthentication and RetrieveCertificate=1 in the body.  The AD FS server sends a 307 Temporary Redirect to the /adfs/backendproxytls/ endpoint on the AD FS server.

Prior to the redirect completing successfully, we see the calls to the CDP endpoint for the full and delta CRLs.

pic8.png

pic9

I was curious as to which process was pulling the CRLs and identified it was LSASS.EXE from the ProcMon capture.

pic10

At the /adfs/backendproxytls/ endpoint the WAP performs another HTTP POST this time posting a JSON object with a number of key value combinations.

pic11.png

The interesting key value types included in the JSON object are the nested JSON object for Headers, which contains all the WAP headers I covered earlier; the query string JSON object, which contains all the query strings I covered earlier; and the SerializedClientCertificate, which contains the certificate the user provided after selecting to use certificate authentication.  The AD FS server then sends back a cookie to the WAP.  This cookie represents the user's authentication to the AD FS server as detailed in this link.

pic12.png

The WAP then performs a final HTTP GET back at the /adfs/ls/ endpoint, including the previously described headers and query strings as well as providing the cookie it just received.  The AD FS server responds by providing the assertion requested by Microsoft along with the MSISAuthenticated, MSISSignOut, and MSISLoopDetectionCookie cookies which are described in the link above.

What did we learn?

  1. The certificate is checked at both the WAP and the AD FS server to ensure it is valid and issued from a trusted certificate authority.  Remember to verify you trust the certificate chain of any user certificates on both the AD FS servers and WAPs.
  2. CRL revocation checking is enabled by default and is performed on both the AD FS server and the WAP.  Remember to verify the locations in your CDP are available to both devices.
  3. The AD FS servers use the LSALogonUser function in the secur32.dll library to perform standard certificate authentication to Active Directory Domain Services.  I didn’t include this, but I captured this by running API monitor on the AD FS server.

In short, if you’re going to use device authentication or user certificate authentication make sure you have your PKI components in order.

See you next post!

Deep dive into AD FS and MS WAP – WAP Registration

Hi everyone,

In today’s blog entry I’ll be doing a deep dive into how the Microsoft Web Application Proxy (WAP) established a trust with the Active Directory Federation Service (AD FS) (I’ll be referring to this as registration) in order to act as a reverse proxy for AD FS.  In my first entry into this series I covered the business use cases that would call for such an integration as well as providing an overview of the lab environment I’ll be using for the series.  So what does registration mean?  Well, the best way to describe it is to see it in action.

Figuring out how to capture the conversation took some trial and error.  This is where Sysinternals Process Explorer comes into play.  I went through the process of registering the WAP with AD FS using the Remote Access Management Console configuration utility and monitored the running processes with Process Explorer.  Upon reviewing the TCP/IP activity of the Remote Access Management Console process (RAMgmtUI.exe) I observed TCP connectivity to the AD FS farm.

RemoteReg

The process is running as the logged-in user, in my case the administrator account I've configured.  This meant I would need to run Fiddler using the logged-in user context rather than having to do something funky like running it as SYSTEM or another security principal using PSEXEC.

I started up Fiddler and configured it to intercept HTTPS traffic as per the configuration below.  Ensure that you’ve trusted the Fiddler root certificate so Fiddler can establish a man-in-the-middle (MITM) scenario.

fiddlerconfig.png

I next ran the Remote Access Management Console and initiated the Web Application Proxy Configuration wizard.  Here I ran the wizard a few different times specifying invalid credentials on the AD FS server to generate some web requests.  The web conversation below popped up in Fiddler.

failedlog.png

Digging into the third session shows an HTTP POST to sts.journeyofthegeek.com/adfs/Proxy/EstablishTrust with a return code of 401 Unauthorized which we would expect given our application doesn’t know if authentication is required yet and didn’t specify an Authorization header.

estab1

Session four shows another HTTP POST to the same URL this time with an Authorization header specifying Basic authentication with our credentials Base64 encoded.  We receive another 401 because we have invalid credentials which again is expected.

succlog.png

What's interesting is the JSON object being posted to the URL.  It includes a key named SerializedTrustCertificate with a Base64-encoded public-key certificate as the value.

json.png

Copying and pasting the encoded value into notepad and saving the file with a CER extension yields the certificate below, of which the WAP has both the public and private key pairs.  The certificate is a 2048-bit key length self-signed certificate.

cert.png
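
If you'd rather skip the notepad step, a small PowerShell sketch accomplishes the same decode; $serializedTrustCertificate is a placeholder for the Base64 string copied out of the JSON body.

# Convert the Base64 value from the SerializedTrustCertificate key into a DER-encoded .cer file
[System.IO.File]::WriteAllBytes("C:\temp\WAPTrustCertificate.cer", [System.Convert]::FromBase64String($serializedTrustCertificate))

# Inspect the certificate properties without opening the GUI
New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("C:\temp\WAPTrustCertificate.cer")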

At this point the WAP will attempt numerous connections to the /adfs/Proxy/GetConfiguration URL with a query string of api-version=2 as seen in the screenshot below.  It will receive a 401 back because Fiddler needs a copy of the client certificate to provide to the AD FS server.  At this point I let it time out and eventually the setup finished.

getconfig.png

So what does the configuration information look like from AD FS when it's successfully retrieved?  To see that, we now have to pay attention to the Microsoft.IdentityServer.ProxyService.exe process which runs as the Active Directory Federation Services service (adfssrv).

adfservice.png

Since the process runs as Network Service I needed to get a bit creative in how I captured the conversation with Fiddler.  The first step is to export the public-key certificate for the self-signed certificate generated by the WAP, name it ClientCertificate.cer, and store it in the Network Service profile folder at C:\Windows\ServiceProfiles\NetworkService\Documents\Fiddler2.  By doing this Fiddler will use that certificate for any website requiring client certificate authentication.
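
Here's a hedged sketch of that export in PowerShell. The subject filter is an assumption based on the naming the WAP uses for its proxy trust certificate, so confirm you've grabbed the right certificate from the machine store before exporting.

# Locate the WAP's self-signed proxy trust certificate in the machine store (subject filter is an assumption)
$proxyCert = Get-ChildItem Cert:\LocalMachine\My | Where-Object { $_.Subject -like "*ADFS ProxyTrust*" } | Select-Object -First 1

# Drop the public key into the Network Service profile so Fiddler presents it for client certificate authentication
$fiddlerPath = "C:\Windows\ServiceProfiles\NetworkService\Documents\Fiddler2"
New-Item -ItemType Directory -Path $fiddlerPath -Force | Out-Null
Export-Certificate -Cert $proxyCert -FilePath (Join-Path $fiddlerPath "ClientCertificate.cer")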

The next step was to start Fiddler as the Network Service security principal.  To do this I used PSEXEC with the following options:

psexec -i -u "NT AUTHORITY\Network Service" "C:\Program Files (x86)\Fiddler2\Fiddler.exe"

I then restarted the Active Directory Federation Services service on the WAP and boom, there is our successful GET from the AD FS server at the /adfs/Proxy/GetConfiguration URL.

getconfigsuc.png

The WAP receives back a JSON object with all the configuration information for the AD FS server as seen below.  Much of this is information about endpoints the AD FS server is supporting.  Beyond that we get information about the AD FS service configuration.  The WAP uses this configuration to set up its bindings with the HTTP.SYS kernel mode driver.  Yes, the WAP uses HTTP.SYS in the same way AD FS uses it.

config1.png

config2.png

So what did we learn?  When establishing the trust with the AD FS server (I’m branding this registration 🙂 ) the WAP does the following:

  1. Generates a 2048-bit self-signed certificate
  2. Opens an HTTPS connection with an AD FS server
  3. Performs a POST on /adfs/Proxy/EstablishTrust providing a JSON object containing the public key certificate and authenticating to the AD FS server with the credentials provided to the wizard using Basic authentication.  If the authentication is successful the AD FS server establishes the trust.  (I'll dig into this piece in the next post)
  4. Performs a GET on /adfs/Proxy/GetConfiguration using the self-signed certificate to authenticate itself to the AD FS server.
  5. Consumes the configuration information and configures the appropriate endpoints with calls to HTTP.SYS.

So that’s the WAP side of the fence for establishing the trust.  In my next post I’ll briefly cover what goes on with the AD FS server as well as examining the LDAP calls (if any) to AD DS during the registration process.

See you next time!

Azure AD Pass-through Authentication – How does it work? Part 2

Welcome back. Today I will be wrapping up my deep dive into Azure AD Pass-through authentication. If you haven’t already, take a read through part 1 for a background into the feature. Now let’s get to the good stuff.

I used a variety of tools to dig into the feature. Many of you will be familiar with the Sysinternals tools, WireShark, and Fiddler. One you may not know is the Rohitab API Monitor. This tool is extremely fun if you enjoy digging into the weeds of the libraries a process uses, the methods it calls, and the inputs and outputs. It's a bit buggy, but check it out and give it a go.

As per usual, I built up a small lab in Azure with two Windows Server 2016 servers, one running AD DS and one running Azure AD Connect. When I installed Azure AD Connect I configured it to use pass-through authentication and to not synchronize the password. Selecting this option installs the MS Azure Active Directory Application Proxy connector on the server. A client certificate is also issued to the server and stored in the Computer Certificate store.

In order to capture the conversations and the API calls from the MS Azure Active Directory Application Proxy (ApplicationProxyConnectorService.exe) I set the service to run as SYSTEM. I then used PSEXEC to start both Fiddler and the API Monitor as SYSTEM as well. Keep in mind there is mutual authentication occurring during some of these steps between the ApplicationProxyConnectorService.exe and Azure, so the public-key client certificate will need to be copied to the following directories:

  • C:\Windows\SysWOW64\config\systemprofile\Documents\Fiddler2
  • C:\Windows\System32\config\systemprofile\Documents\Fiddler2

So with the basics of the configuration outlined, let’s cover what happens when the ApplicationProxyConnectorService.exe process is started.

  1. Using WireShark I observed the following DNS queries looking for an IP in order to connect to an endpoint for the bootstrap process of the MS AAD Application Proxy.
    DNS Query for TENANT_ID.bootstrap.msappproxy.net
    DNS Response with CNAME of cwap-nam1-runtime.msappproxy.net
    DNS Response with CNAME of cwap-nam1-runtime-main-new.trafficmanager.net
    DNS Response with CNAME of cwap-cu-2.cloudapp.net
    DNS Response with A record of an IP
  2. Within Fiddler I observed the MS AAD Application Proxy establishing a connection to TENANT_ID.bootstrap.msappproxy.net over port 8080. It sets up a TLS 1.0 (yes TLS 1.0, tsk tsk Microsoft) session with mutual authentication. The client certificate that is used for authentication of the MS AAD Application Proxy is the certificate I mentioned above.
  3. Fiddler next displayed the MS AAD Application Proxy doing an HTTP POST of the XML content below to the ConnectorBootstrap URI. The key pieces of information provided here are the ConnectorID, MachineName, and SubscriptionID information. My best guess is MS consumes this information to determine which URI to redirect the connector to and consumes some of the response information for telemetry purposes.
    Screen Shot 2017-04-05 at 9.37.04 PM
  4. Fiddler continues to provide details into the bootstrapping process. The MS AAD Application Proxy receives back the XML content provided below and an HTTP 307 Redirect to bootstrap.his.msappproxy.net:8080. My guess here is the process consumes this information to configure itself in preparation for interaction with the Azure Service Bus.
    Screen Shot 2017-04-05 at 9.37.48 PM
  5. WireShark captured the DNS queries below resolving the IP for the host the process was redirected to in the previous step.
    DNS Query for bootstrap.his.msappproxy.net
    DNS Response with CNAME of his-nam1-runtime-main.trafficmanager.net
    DNS Response with CNAME of his-eus-1.cloudapp.net
    DNS Response with A record of 104.211.32.215
  6. Back to Fiddler I observed the connection to bootstrap.his.msappproxy.net over port 8080 and setup of a TLS 1.0 session with mutual authentication using the client certificate again. The process does an HTTP POST of the XML content  below to the URI of ConnectorBootstrap?his_su=NAM1. More than likely this his_su variable was determined from the initial bootstrap to the tenant ID endpoint. The key pieces of information here are the ConnectorID, SubscriptionID, and telemetry information.
    Screen-Shot-2017-04-05-at-9.35.52-PM
  7. The next session capture shows the process received back the XML response below. The key pieces of content relevant here are within the SignalingListenerEndpointSettings section. Interesting pieces of information here are:
    • Name – his-nam1-eus1/TENANTID_CONNECTORID
    • Namespace – his-nam1-eus1
    • ServicePath – TENANTID_UNIQUEIDENTIFIER
    • SharedAccessKey

    This information is used by the MS AAD Application Proxy to establish listeners to two separate service endpoints at the Azure Service Bus. The proxy uses the SharedAccessKeys to authenticate to the endpoints. It will eventually use the relay service offered by the Azure Service Bus.

    Screen Shot 2017-04-05 at 9.34.43 PM

  8. WireShark captured the DNS queries below resolving the IP for the service bus endpoint provided above. This query is performed twice in order to set up the two separate tunnels to receive relays.
    DNS Query for his-nam1-eus1.servicebus.windows.net
    DNS Response with CNAME of ns-sb2-prod-bl3-009.cloudapp.net
    DNS Response with IP

    DNS Query for his-nam1-eus1.servicebus.windows.net
    DNS Response with CNAME of ns-sb2-prod-dm2-009.cloudapp.net
    DNS Response with different IP

  9. The MS AAD Application Proxy establishes connections with the two IPs received from above. These connections are established to port 5671. These two connections establish the MS AAD Application Proxy as a listener service with the Azure Service Bus to consume the relay services.
  10. At this point the MS AAD Application Proxy has connected to the Azure Service Bus to the his-nam1-eus1 namespace as a listener and is in the listen state. It's prepared to receive requests from Azure AD (the sender) for verification of authentication. We'll cover this conversation a bit in the next few steps.

    When a synchronized user in the journeyofthegeek.com tenant accesses the Azure login screen and plugs in a set of credentials, Azure AD (the sender) connects to the relay and submits the authentication request. Like the initial MS AAD Application Proxy connection to the Azure Relay service, I was unable to capture the transactions in Fiddler. However, I was able to capture some of the conversation with API Monitor.

    I pieced this conversation together by reviewing API calls to ncryptsslp.dll and looking at the output for the BCryptDecrypt method and the input for the BCryptEncrypt method. While the data is ugly and the listeners have already been set up, we're able to observe some of the conversation that occurs when the sender (Azure AD) sends messages to the listener (MS AAD Application Proxy) via the service (Azure Relay). Based upon what I was able to decrypt, it seems like one-way asynchronous communication where the MS AAD Application Proxy listens for and receives messages from Azure AD.

    Screen Shot 2017-04-05 at 9.38.40 PM

  11. The LogonUserW method is called from CLR.DLL and the user's account name, domain, and password are plugged in. Upon a successful return indicating the authentication is valid, the MS AAD Application Proxy initiates an HTTP POST to
    his-eus-1.connector.his.msappproxy.net:10101/subscriber/connection?requestId=UNIQUEREQUESTID. The post contains a base64 encoded JWT with the information below. Unfortunately I was unable to decode the bytestream, so I can only guess what's contained in there.

    {"JsonBytes":[bytestream],"PrimarySignature":[bytestream],"SecondarySignature":null}

So what did we learn? Well, we know that Azure AD Pass-through Authentication uses multiple Microsoft components including the MS AAD Application Proxy, Azure Service Bus (Relay), Azure AD, and Active Directory Domain Services. The authentication request is exchanged between Azure AD and the ApplicationProxyConnectorService.exe process running on the on-premises server via the relay function of the Azure Service Bus.

The ApplicationProxyConnectorService.exe process authenticates to the URI where the bootstrap process occurs using a client certificate. After bootstrap the ApplicationProxyConnectorService.exe process obtains the shared access keys it will use to establish itself as a listener to the appropriate namespace in the Azure Relay. The process then establishes connection with the relay as a listener and waits for messages from Azure AD. When these messages are received, at the least the user’s password is encrypted with the public key of the client certificate (other data may be as well but I didn’t observe that).

When the messages are decrypted, the username, domain, and password are extracted and used to authenticate against AD DS. If the authentication is successful, a message is delivered back to Azure AD via the MS AAD Application Proxy service running in Azure.

Neato right? There are lots of moving parts behind this solution, but the finesse with which they're integrated and executed makes them practically invisible to the consumer. This is a solid out of the box solution and I can see why Microsoft markets it the way it does. I do have concerns that the solution is a bit of a black box and that companies leveraging it may not understand how to troubleshoot issues that occur with it, but I guess that's what Premier Services and Consulting Services are for, right Microsoft? 🙂

Azure AD User Provisioning – Part 5

Hi everyone. I apologize for the delay in publishing the last post in this series. The past few months have been hectic. For this last post of the year I will be completing the series on provisioning in Azure AD.

As I’ve covered in earlier posts, there are a lot of options when provisioning to Azure AD including multiple GUIs and programmatic options. I’ve covered provisioning in the Azure Management Portal, Azure Portal, Office 365 Admin Center, and Azure Active Directory PowerShell v1 and v2. In this final post I will cover provisioning via the Graph API using a simple ASP .NET web application.

I was originally going to leverage the graph API directly via PowerShell using the .NET ADAL libraries and Invoke-WebRequest cmdlets. I’ve been playing around a lot with Visual Studio creating simple applications like the Azure B2B provisioning app. I decided to challenge myself by adding additional functionality to the ASP .NET web application I assembled in my previous post. I enjoyed the hell out of the process, learned a bunch more about .NET, C#, ASP .NET web apps, and applications built using the MVC architecture. Let’s get to it shall we?

Before we dive into the code and the methodologies I used to put together the application, let’s take a look at it in action. The application starts by requiring authentication to Azure AD.

1

After successful authentication, the main page for the website loads. You’ll notice from the interface that I used the sample ASP .NET MVC Web Application available in Visual Studio but added a new navigation link on the right hand side named Create User.

2

After clicking the Create User link, the user is redirected to a simple (i.e. ugly) web form where information about the new user is collected.

3

After the user hits submit, the new user is created in Azure AD and the information from the returned JWT is parsed and displayed in a table.

4

When we navigate to the Azure AD blade in the Azure Portal we see that Homer has been created and added to the system.

clearme

So you're probably asking the question as to how complicated it was to put this application together? The answer may surprise you. It was incredibly simple. The most difficult part of the process was learning my way around C# and how MVC web apps are put together. For a skilled developer, this would have taken an hour versus the days it took me.

The first thing I did was do some reading into the Graph API, specifically around managing users. Microsoft has a number of great instructions located here and here. After getting familiar with the required inputs and the outputs, I built a new model in my application that would be used for the user form input.

6
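
For context, the raw Azure AD Graph API call that the model and controller ultimately drive looks roughly like the sketch below. I've shown it in PowerShell since that was the alternative route I mentioned earlier; the tenant, the user values, and the already-acquired access token are all placeholders.

# A minimal sketch of the underlying Graph API request (assumes $accessToken already holds a valid OAuth bearer token)
$newUser = @{
    accountEnabled    = $true
    displayName       = "Homer Simpson"
    mailNickname      = "hsimpson"
    userPrincipalName = "hsimpson@sometenant.com"
    passwordProfile   = @{ password = "TempP@ssw0rd!"; forceChangePasswordNextLogin = $true }
} | ConvertTo-Json

Invoke-RestMethod -Method Post -Uri "https://graph.windows.net/sometenant.com/users?api-version=1.6" -Headers @{ Authorization = "Bearer $accessToken" } -ContentType "application/json" -Body $newUser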

Once I had my new model assembled, I then created two new views under a new folder named Create User. The view named Index is the view that takes the user input and the view named Results is the view that spits back some of the content from the JWT returned from Azure after the user is successfully created. Here is the code for the Index view.

7

And the code for the Results view.

8

After the new views were created, I then moved on to creating the guts of the new functionality with a new controller named CreateUserController. I was able to reuse some of the code from the UserProfileController to obtain the necessary OAuth access token to delegate the rights to the application to create the new user.

9

The remaining code in my controller came from a crash course in programming and MVC web apps. The first section of code calls the task to obtain the access token.

10

The next section of code creates a new instance of the user class and populates the properties with information collected from the form.

11.png

The final section of code attempts to create the new user and displays the results page with information about the user such as objectID and userPrincipalName. If the application is unable to create the user, the exception is caught, and an error page is displayed.

12

But wait… what is missing? I’ll give you a hint, it’s not code.

The answer is the appropriate delegated permissions. Even if the user is a global admin, the application can’t perform the actions of a global admin unless we allow it to. To make this happen, we’ll log into the Azure Portal, access the Azure AD blade, and grant the application the delegated permission to Access the directory as the signed-in user.

13

Simple right? The Azure Active Directory Graph Client libraries make the whole process incredibly easy, doing a whole lot with very little code. Imagine integrating this functionality into an existing service catalog. Let's say you have a business user who needs access to Dynamics CRM Online. The user could navigate to the service catalog and request access. After their manager approves, the application powering the service catalog could provision the new user, assign the license for Dynamics CRM Online, and drop the user into the appropriate groups. All of this could happen without having to involve IT. This is the value of a simple API with a wonderful set of libraries.

Well folks, that wraps up my last post of the year. I'll return next year with a series of deep dives exploring Microsoft's newly announced Azure AD Pass-through authentication and SSO features. Have a happy holiday!

Azure AD User Provisioning – Part 4

Today I will continue my series in Azure AD User Provisioning. In previous posts I looked at the GUI methods of provisioning users. I’ll now begin digging into the methods that provide opportunities for programmatic management of a user’s identity management lifecycle. For this post we’ll cover every IT Professional’s favorite Microsoft topic, PowerShell.

Microsoft has specific PowerShell modules for administration of Azure Active Directory. Microsoft is in the process of transitioning from the MSOnline (Azure Active Directory PowerShell v1) to the AzureAD module (Azure Active Directory PowerShell v2). At this time the AzureAD PowerShell module is in public preview with plans to migrate all the functionality from the MSOnline module to the AzureAD module. Until that point there will be some activities you’ll need to do in the MSOnline cmdlets.
Creating a new user using the MSOnline module (lovingly referred to as the "msol" cmdlets) is a quick and simple line of code:

New-MsolUser -userPrincipalName user@sometenant.com -DisplayName "Some User"

The user can then be modified by using cmdlets such as Set-MsolUser, Set-MsolUserLicense, Set-MsolUserPrincipalName, Set-MsolUserPassword, and Remove-MsolUser. Using this set of cmdlets you can assign and un-assign user licenses and manage the identity lifecycle of the user account.
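
As a quick hedged example of that lifecycle (the AccountSkuId below is a placeholder; run Get-MsolAccountSku to see the real SKUs in your tenant):

# Assign an Office 365 license (AccountSkuId is a placeholder; the user's UsageLocation must be set first)
Set-MsolUserLicense -UserPrincipalName user@sometenant.com -AddLicenses "sometenant:ENTERPRISEPACK"

# Reset the password and force a change at next sign-in
Set-MsolUserPassword -UserPrincipalName user@sometenant.com -NewPassword "TempP@ssw0rd!" -ForceChangePassword $true

# Remove the user at the end of the lifecycle
Remove-MsolUser -UserPrincipalName user@sometenant.com -Force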

There are some subtle differences in the MSOnline and AzureAD cmdlets. These differences arise due to Microsoft’s drive to make the experience using PowerShell in the Azure AD module similar to the experience of using the Graph API. The AzureAD cmdlets are much more full featured in comparison to the MSOnline cmdlets. Close to every aspect of Azure AD can be managed across the board beyond just users. Here is an example of how you would create a user with the new cmdlets:

New-AzureADUser -UserPrincipalName user@sometenant.com -AccountEnabled $False -DisplayName "user" -PasswordProfile $UserPasswordProfile

You'll notice a few differences when we compare the minimum options required when creating a user with the MSOnline cmdlets. As you'll notice above, there are not just more required options, but one that may look unfamiliar called PasswordProfile. The PasswordProfile option configures the user's password and whether or not the user is going to be forced to change the password at next login. I struggled a bit with getting the cmdlet to accept the PasswordProfile, but the documentation on the Azure Graph API and a Microsoft blog provided some helpful information. It can be set with the following lines of code:

$UserPasswordProfile = "" | Select-Object password,forceChangePasswordNextLogin
$UserPasswordProfile.forceChangePasswordNextLogin = $true
$UserPasswordProfile.password = "some password"

So what is the big difference between the MSOnline and AzureAD cmdlets? Well it comes down to the API each uses to interact with Azure AD. The MSOnline module (aka Azure AD PowerShell v1) uses the old SOAP API. Oddly enough it uses the same SOAP API that Azure AD Connect uses. If you've read my Azure AD Connect – Behind the Scenes series, you'll notice the similarities in the Fiddler capture below.

pica-1

The AzureAD cmdlets (aka Azure AD PowerShell v2) use the Graph API. In the Fiddler capture below you can see the authentication to Azure AD, the obtaining of the authorization code, the exchange for a bearer token, and the delivery of the bearer token to the Graph API to retrieve the user information on Dr. Frankenstein. Check out the screenshot from Fiddler below:

picb
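
A call as simple as the following produces that kind of traffic; the user principal name below is a placeholder.

# Authenticate interactively, then retrieve a user via the Graph API-backed v2 cmdlets
Connect-AzureAD
Get-AzureADUser -ObjectId "drfrankenstein@sometenant.com" | Select-Object DisplayName, UserPrincipalName, ObjectId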

As you can see, PowerShell presents a powerful method of managing Azure AD and its resources. Given how familiar IT professionals are with PowerShell, it presents a ton of opportunities for automation and standardization without a significant learning curve. Microsoft’s move to having PowerShell leverage the power of the Graph API will help IT professionals leverage the power of the Graph API without having to break out Visual Studio.

In my next post I’ll write a simple application in Visual Studio to demonstrate the simplicity of integration with the Graph API and the opportunities that could be presented to integrating with existing toolsets.

Azure AD User Provisioning – Part 3

In this entry I'm going to look at how provisioning users differs between the Azure Management Portal and the Azure Portal. The Azure Management Portal was used heavily for all Azure administration prior to the introduction of the Azure Resource Manager deployment model a year or so ago. To my knowledge there isn't much functionality that hasn't been migrated to the Azure Portal, excepting management of Azure Active Directory. This remaining piece is in the process of being moved to the Azure Portal and is currently in public preview with some limitations. This means that if you're administering Azure AD you're going to need to use the Management Portal for a while longer.

Unlike the Office 365 portal, the Management portal feels very dated. The initial dashboard that appears after authentication will list any classic deployment model resources and directories the authenticated user has control over.

pic10

First I will select one of the directories and dig into the interface. Immediately you’ll notice a number of configuration options available. Since I’m focused on user provisioning, I’ll very briefly describe the purpose of the other sections.

  • Groups – Used to manage the group lifecycle of Azure AD groups
  • Applications – Used to add new applications from the application gallery and register custom and third party applications
  • Domains – Used to manage additional DNS domains that have been associated with the tenant
  • Directory Integration – Used to configure support for synchronization using a tool such as Azure AD Connect
  • Configure – Used to manage the configuration of Azure AD including password reset policies, MFA, device authentication options, group management, who can invite guests, and the like
  • Reports – Used to run the many reports available with standard Azure AD and Azure AD Premium
  • Licenses – Used to assign Azure AD Premium licenses; I'm not sure if it handles any licenses beyond that, but it does not seem capable of handling O365 licenses.

Now let’s get back to user provisioning. Next up I’ll head to the Users section. Here there is a listing of all members and guests within the directory.

pic11

To create a new user I’ll click the Add User icon at the bottom of the page which will bring up the window below where I can configure the user name.

pic12

In the next window I will add a first name, last name, display name, and pick a role. Notice anything different? Here the only options are to configure the Azure AD roles as described here. There are no Office 365 roles to choose from. Additionally the user can be enabled for Azure MFA (the checkbox is hidden under the listing of roles).

pic13

In the last window I'm prompted to create an auto-generated temporary password for the user. Notice the options to set a custom password and enforce a password change at first sign-in aren't there like they are in O365? After the Create button is hit a password will be automatically generated and will need to be delivered to the user out of band. Quite basic when compared to the Office 365 Admin Center, isn't it?

pic14

After the user is created the user can be modified in the Profile and Work Info sections. Profile is for your basic information and configuration, while Work Info is similar to the contact information section in the Office 365 Admin Center with some additional options to configure the user's authentication phone number and email address. The Devices and Activity sections provide reporting on the user's activities.

pic15

Let's now ditch the old and embrace the new by examining provisioning in the Azure Portal. Prior to a few months back, only some of the Azure AD functionality could be administered in the Azure Portal, including Azure AD Privileged Identity Management, Azure AD Identity Protection, Azure AD Connect Health, Azure AD Cloud App Discovery, and Azure AD B2C (which is even mixed with the Management Portal). Microsoft has recently begun to migrate the administration of Azure AD to the Azure Portal to centralize administration of Azure resources.

The Azure Portal is accessible by navigating to this link. After authentication the dashboard will load up displaying any resources that have been pinned. Click on the Azure Active Directory blade as highlighted in red in the screenshot below.

pic16

You'll notice right off the bat that the interface is very slick, is intended for power users, and provides some useful summary analytics. I hadn't poked around the new blade in a while and it looks like they've improved the functionality quite a bit. There doesn't seem to be much missing beyond the ability to create new directories, assign licenses, and review the holistic audit logs. One item I did observe which is worth calling out is that the app registration interface has been refined and made more slick. This is a big improvement over the similar interface in the Management Portal.

pic17.png

By navigating to the All Users blade and clicking the Add button a new user can be created. This brings up a new blade that allows for basic configuration of key pieces of information like username, first name, last name, job title, description, group membership, and Azure AD roles. The experience is quite similar to the Management Portal experience. Notice that the password again is pre-generated and there is no option to set a custom password or to turn off the enforcement of a password change at first login.

pic18

After the user is created it can be modified by clicking on the user, which opens a new blade. This new blade allows for contact information about the user to be edited, assignment of Azure AD roles, and assignment of Azure AD group memberships. One neat feature is the Azure Resources option. This opens up a new blade that enumerates the user's effective access to various Azure resources. Providing reporting on a user's effective access is one thing Microsoft has never done effectively on-premises, so a feature like this is nice to see, especially with the additional complexity the scale of the cloud introduces. Finally, you're provided some options to review the audit logs and sign-in reports for the user (another neat feature). Like the Azure Management Portal, there is no quick and easy GUI-based functionality to restore deleted users in the Azure Portal at this time.

pic19

Well folks that is the overview of three out of four of the GUI provisioning methods. The fourth option is to provision natively through an on-premises Active Directory and synchronize those users to the cloud with a synchronization tool such as Azure AD Connect. There is plenty of documentation on what that process looks like already available. If you’re hungry for more, you can check out my previous series Azure Active Directory Connect – Behind the Scenes.

Let’s take a moment to summarize what we’ve learned:

Office 365 Admin Console

  • Simple and ideal for business users and Tier 1 support
  • Limited in its ability to administer Azure AD
  • Only GUI option for assigning Office 365 licenses
  • Only GUI option for assigning Office 365 roles
  • No B2B or B2C support
  • Bulk user creation capabilities
  • Best option for restoration of a deleted user

Azure Management Portal

  • Legacy portal being replaced by the Azure Portal
  • Only GUI option for creating additional standard and B2C directories
  • Only GUI option for adding B2B users
  • No support for bulk user creation or restoration of deleted users
  • Support of legacy Azure AD configuration items; no support for configuration of B2C policies, Identity Protection, Privileged Identity Management

Azure Portal

  • Future one-stop shop for Azure AD administration
  • Seems to support all functionality of the Azure Management Portal except creation of new directories
  • GUI options for B2C policies, management of Identity Protection, Privileged Identity Management, Azure AD Connect Health, and Azure AD Proxy
  • No GUI option for adding B2B users
  • No support for bulk user creation or restoration of deleted users
  • Analytics built into administrative tools
  • Robust application registration features

So what does this all mean? Well it means that if you need to administer identity functionality via the GUI, you’re going to need to use a combination of the Office 365 Admin Console, the Azure Management Portal, and the Azure Portal. I expect within the next 3-6 months the remaining functionality in Azure Management Portal will be completely migrated to the Azure Portal. Businesses should focus their Tier 1 and business staff on learning the Office 365 Admin Console while Tier 2 and Tier 3 staff should focus on learning the Azure Portal.

Now that I’ve dug into the GUI options, I’ll next explore how the APIs and PowerShell provide opportunities for automation and integration with 3rd party and custom identity management solutions that may already exist on premises. See you then!