Python Sample Web App and API for Azure AD B2C

Hello again folks.

I’ve recently had a number of inquiries on Microsoft’s AAD (Azure Active Directory) B2C (Business-To-Consumer) offering. For those infrastructure folks who have had to manage customer identities in the past, you know the pain of managing these identities with legacy solutions such as LDAP (Lightweight Directory Access Protocol) servers or even a collection of Windows AD (Active Directory) forests. Developers have suffered along with us, carrying the burden of securely implementing these technologies in their code.

AAD B2C exists to make the process easier by providing a modern IDaaS (identity-as-a-service) offering complete with a modern directory accessible over a RESTful API, support for modern authentication and authorization protocols such as SAML, OpenID Connect, and OAuth, advanced features such as step-up authentication, and a ton of other bells and whistles. Along with these features, Microsoft also provides a great library in the form of the Microsoft Authentication Library (MSAL).

It had been just about 4 years since I last experimented with AAD B2C, so I was due for a refresher. Like many people, I learn best from reading and doing. For the doing step, I needed an application I could experiment with. My first stop was the samples Microsoft provides. The Python pickings are very slim. There is a basic web application Ray Lou put together which does a great job demonstrating basic authentication (and which I used as a base for my web app). However, I wanted to test additional features like step-up authentication and securing a custom-built API with AAD B2C.

So began my journey to create the web app and web API I’ll be walking through setting up in this post. Over the past few weeks I spent time diving into the Flask web framework and putting my subpar Python skills to work. After many late nights and long weekends spent reading documentation and troubleshooting with Fiddler, I finished the solution, which consists of a web app and web API.

Solution Design

The solution is quite simple. It is intended to simulate a scenario where a financial services institution is providing a customer access to their insurance policy information. The insurance policy information is pulled from a legacy system that has been wrapped in a modern web-based API. The customer can view their account information and change the beneficiary on the policy.

AAD B2C provides the authentication to the application via OpenID Connect. Delegated access to the user’s policy information in the web API is handled through OAuth via an access token obtained by the web app from AAD B2C. Finally, when changing the beneficiary, the user is prompted for MFA, which demonstrates the step-up authentication capabilities of AAD B2C.

The solution uses four User Flows: a profile editing user flow which allows the user to change information stored about them in the AAD B2C directory, a password reset flow to allow the user to change the password for their local AAD B2C identity, and two sign-up / sign-in flows. One sign-up / sign-in flow is enabled for MFA (multi-factor authentication) and one is not. The non-MFA enabled flow is kicked off at login to the web app, while the MFA enabled flow is used when the user attempts to change the beneficiary.
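
To make the step-up mechanics concrete, here is a minimal sketch (not the sample’s actual code) of how a Flask web app using the MSAL Python library can target the two sign-up / sign-in flows. Each B2C user flow gets its own authority URL, and step-up is simply a second authorization code flow run against the MFA-enabled flow. The tenant name and the MFA policy name below are placeholders.

import msal

TENANT = "myb2c"                       # placeholder B2C tenant short name
SIGNIN_POLICY = "B2C_1_signupsignin1"  # non-MFA sign-up / sign-in user flow
MFA_POLICY = "B2C_1_signupsignin_mfa"  # MFA-enabled user flow (name assumed)

def msal_app_for(policy):
    # Each user flow (policy) is addressed through its own authority URL.
    authority = f"https://{TENANT}.b2clogin.com/{TENANT}.onmicrosoft.com/{policy}"
    return msal.ConfidentialClientApplication(
        client_id="<web app client id>",
        client_credential="<web app client secret>",
        authority=authority,
    )

# Normal login uses the non-MFA flow; changing the beneficiary repeats the
# authorization code flow against the MFA-enabled flow to force step-up.
login_app = msal_app_for(SIGNIN_POLICY)
stepup_app = msal_app_for(MFA_POLICY)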

With the basics of the solution explained, let’s jump into how to set it up. Keep in mind I’ll be referring to public documentation where it makes sense to avoid reinventing the wheel. For prerequisites you’ll need Python 3 installed and running on your machine, a code editor (personally I use Visual Studio Code), and an Azure AD tenant and Azure subscription.

The first thing you’ll want to do is set up an AAD B2C directory as per these instructions. Once that is complete, you’ll want to register the web app and web API. To do this, switch to your B2C directory and select the App registrations link. On the next screen select the New registration link.

In the Register an application screen you’ll need to fill in some information. You can name the application whatever you’d like. Leave the Who can use this application or access this API set to the option in the screenshot since this will be a B2C app. In the redirect URI put in http://localhost:5000/getAToken. This is the URI AAD B2C will redirect the user to after the user successfully authenticates and obtains an ID token and access token. Leave the Grant admin consent to openid and offline_access permissions option checked. Once complete hit the Register button. This process creates an identity for the application in the B2C directory and authorizes it to obtain ID tokens and access tokens from B2C.

Register an application

Next we need to gather some information from the newly registered application. If you navigate back to App registrations you’ll see your new app. Click on it to open up the application-specific settings. On the next screen you get the Application ID (client ID in OAuth terms). Record this because you’ll need it later.

App registration

So you have a client ID which identifies the app to B2C, but you need a credential to authenticate with. For that, go to the Certificates & secrets link. Click the New client secret button to generate a new credential and save it for later.

You will need to register one additional redirect URI. This redirect URI is used when the user authenticates with MFA during the step-up process. Go back to the Overview and click on the Redirect URIs menu item on the top section.

Register additional URI

Once the new page loads, add a redirect URI which is found under the web section. The URI you will need to add is http://localhost:5000/getATokenMFA. Make sure to save your changes by hitting the Save button.

Now that your web app is registered, you need to register the web API. Once again, click the Register an application link under the Application view. You’ll want to select the same options except you do not need to provide a redirect URI. Click the Register button to complete the registration.

Once the app is registered, go into the application settings and click on the Expose an API link. Here you will define two scopes which are included in the access token and authorize the app to perform specific functions in the API, such as reading and writing account information. Click the Add a scope button and you’ll be prompted to set an Application ID URI, which you need to set to api. Once you’ve set it, hit the Save and continue button.

Application ID URI

The screen will refresh and you’ll be able to add your first scope. I have defined two scopes within the web API: one called Accounts.Read which grants access to read account information and one called Accounts.Write which grants access to edit account information. Create the scope for Accounts.Read and repeat the process for Accounts.Write.
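
For reference, when the web app later requests these scopes it has to use their full URIs. Assuming the Application ID URI resolved to https://myb2c.onmicrosoft.com/api (B2C builds it from your tenant name and the value entered above), the request from the earlier MSAL sketch might look like the snippet below; the URIs are placeholders for whatever your tenant actually generates.

# Full scope URIs assumed from the Application ID URI of "api" set above.
API_SCOPES = [
    "https://myb2c.onmicrosoft.com/api/Accounts.Read",
    "https://myb2c.onmicrosoft.com/api/Accounts.Write",
]

# Start the authorization code flow asking for both delegated scopes.
flow = login_app.initiate_auth_code_flow(
    scopes=API_SCOPES,
    redirect_uri="http://localhost:5000/getAToken",
)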

By default B2C grants applications registered with it the offline_access and openid permissions for Microsoft Graph. Since our API won’t be authenticating the user and will simply be verifying the access token passed by the web app, we could remove those permissions if we wanted. You can do this through the API permissions link which is located on the application settings page.
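
Since the web API’s only job is verifying that bearer token, its logic boils down to checking the token’s signature against the signing keys published for the B2C policy and validating the audience and scopes. The sketch below uses the PyJWT library and is purely illustrative; the sample may implement this differently, and the JWKS URL is assembled from placeholder tenant and policy names.

import jwt  # PyJWT

TENANT = "myb2c"
POLICY = "B2C_1_signupsignin1"
API_CLIENT_ID = "<web API client id>"

# B2C publishes per-policy signing keys at a discovery endpoint.
JWKS_URL = (
    f"https://{TENANT}.b2clogin.com/{TENANT}.onmicrosoft.com/"
    f"{POLICY}/discovery/v2.0/keys"
)
jwks_client = jwt.PyJWKClient(JWKS_URL)

def validate_access_token(token):
    # Find the signing key referenced in the token header, then verify the
    # signature and that the token was issued for this API.
    signing_key = jwks_client.get_signing_key_from_jwt(token)
    claims = jwt.decode(
        token,
        signing_key.key,
        algorithms=["RS256"],
        audience=API_CLIENT_ID,
    )
    # Delegated permissions show up in the space-separated "scp" claim.
    if "Accounts.Read" not in claims.get("scp", "").split():
        raise PermissionError("Token is missing the Accounts.Read scope")
    return claims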

The last step you have in the B2C portion of Azure is to grant your web app the permission to request an access token for the Accounts.Read and Accounts.Write scopes used by the web API. To do this you need to go back into the application settings for the web app and go to the API permissions link. Click the Add a permission button. In the Request API permissions window, select My APIs option and select the web API you registered. From there select the two permissions and click the Add permissions button.

To finish up with the permissions piece, you’ll grant admin consent to the permissions. In the API permissions window, click the Grant admin consent for <YOUR TENANT NAME> button.

Admin consent to permissions

At this point we’ve registered the web app and web API with Azure B2C. We now need to enable some user flows. Azure B2C has an insanely powerful policy framework that powers the behavior of B2C behind the scenes and allows you to do pretty much whatever you can think of. User flows are predefined policies that handle common tasks without you having to go and learn a new policy framework.

For this solution you will need to set up four user flows. You’ll need to create a sign-up and sign-in flow with MFA disabled, another with MFA enabled, a password reset flow, and a profile editing flow. Name the flows as listed in the image below.

User flow list

The sign-up and sign-in flow can be configured to collect and return specific claim values in the user’s ID token. You configure this by clicking on the user flow you want to modify and navigating to the user attributes and application claims sections as seen in the image below.

Sign in / Sign up claims/attributes

For this solution, ensure you are collecting and returning at least the Display Name, Email Address, Given Name, and Surname claims. Configure this in both the MFA and non-MFA enabled sign-up and sign-in policies.

Now it’s time to get the apps up and running. For this section I’ll be assuming you’re using Visual Studio Code.

Open up an instance of Visual Studio Code and clone the repository https://github.com/mattfeltonma/python-b2c-sample. For lazy people like myself, I use CTRL+SHIFT+P to open up the command palette in Visual Studio Code, search for the Git: Clone command, and type in the repository URL.

Once the repo has cloned, you’ll want to open two terminal instances. You can do this with the CTRL+SHIFT+` hotkey. In your first terminal, navigate to the python-b2c-web-app directory. Create a new directory named flask_session. This directory will serve as the store for server-side sessions for the Flask-Session module.

In the same directory, create a Python virtual environment in a new directory named env. Repeat the process in the other terminal under the python-simple-web-api directory.

You do this using the command:

python -m venv env

Once the virtual environments are created, you will need to activate them. You will need to do this for both the web app and web API. Do this using the following command while you are in the relevant python-b2c-web-app or python-simple-web-api directory:

env\Scripts\activate

Next, load the required libraries. Do this using the command below. Repeat it again for both the web app and web api.

pip install -r requirements.txt

The environments are now ready to go. Next up you need to set some user variables. Within the terminal for the b2c-web-app create user variables for the following:

  • CLIENT_ID set to the client ID (Application ID) of the web app. This is available in the B2C portal.
  • CLIENT_SECRET set to the secret of the web app. If you didn’t record this when you generated it you will need to create a new credential.
  • B2C_DIR set to the name of your B2C tenant. This should be a single label name such as myb2c
  • FLASK_APP set to app.py. This tells Flask what file to execute.

Within the terminal for the simple-web-api create user variables for the following:

  • TENANT_NAME set to the name of your B2C tenant. This should be a single label name such as myb2c
  • TENANT_ID set to the tenant ID of your B2C tenant.
  • B2C_POLICY set to the name of your non-MFA B2C policy. For example, B2C_1_signupsignin1.
  • CLIENT_ID set to the client ID of the web API. This is available in the B2C portal
  • FLASK_APP set to app.py. This tells Flask what file to execute.

In Windows you can set these variables by using the following command:

set FLASK_APP=app.py
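
The applications read these values from the environment at startup. Here is a minimal sketch of how the web app side might consume them (the sample’s actual configuration code may differ):

import os

# Values come from the user variables set above; B2C_DIR is the single
# label tenant name such as "myb2c".
CLIENT_ID = os.environ["CLIENT_ID"]
CLIENT_SECRET = os.environ["CLIENT_SECRET"]
B2C_DIR = os.environ["B2C_DIR"]

AUTHORITY = (
    f"https://{B2C_DIR}.b2clogin.com/"
    f"{B2C_DIR}.onmicrosoft.com/B2C_1_signupsignin1"
)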

The last step you need to take is to modify the accounts.json file in the python-simple-web-api directory. You will need to add sample records for each user identity you want to test. The email value of the account object must match the email address being presented in the access token.
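
The exact schema of accounts.json comes from the sample itself, so treat the record below as a hypothetical illustration; the only requirement called out here is that the email value matches the email claim presented in the access token.

# Hypothetical record shape; only the "email" value must match the email
# address in the access token, the other fields are illustrative.
sample_account = {
    "email": "jane.doe@example.com",
    "name": "Jane Doe",
    "policy_number": "POL-0001",
    "beneficiary": "John Doe",
}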

Now you need to start up the web servers. To do this you’ll use the flask command. In the terminal you set up for the python-b2c-web-app, run the following command:

flask run -h localhost -p 5000

Then in the terminal for the python-simple-web-api, run the following command:

flask run -h localhost -p 5001

You’re now ready to test the app! Open up a web browser and go to http://localhost:5000. You’ll be redirected to the login page seen below.

Clicking the Sign-In button will open up the B2C sign-in page. Here you can sign-in with an existing B2C account or create a new one.

After successfully logging in you’ll be presented with a simple home page. The Test API link will bring you to the public endpoint of the web API validating that the API is reachable and running. The Edit Profile link will redirect you to the B2C Edit Profile experience. Clicking the My Claims link will display the claims in your ID token as seen below.

Clicking the My Account link calls the web API, which queries the accounts.json file for a record for the user. The contents of the record are then displayed as seen below.

Clicking on the Change Beneficiary button will kick off the second MFA-enabled sign-in and sign-up user flow prompting the user for MFA. After successful MFA, the user is redirected to a page where they make the change to the record. Clicking the submit button modifies the accounts.json file with the new value and the user is redirected to the My Account page.

Well folks, that’s it! Experiment, enjoy, and improve it. Over the next few weeks I’m planning on putting some newly practiced Docker skills to the test by containerizing the applications.

You can get the solution here.

Thanks everyone!

Deep Dive into Azure AD and AWS SSO Integration – Part 4

Today we continue exploring the new integration between Microsoft’s Azure AD (Azure Active Directory) and AWS (Amazon Web Services) SSO (Single Sign-On). Over the past three posts I’ve covered the high level concepts of both platforms, the challenges the integration seeks to solve, and how to enable the federated trust which facilitates the single sign-on experience. If you haven’t read through those posts, I recommend you do so before you dive into this one. In this post I’ll be covering the neatest feature of the new integration, which is the support for automated provisioning.

If you’ve ever worked in the identity realm before, you know the pains that come with managing the life cycle of an identity, from initial provisioning, to changes to the identity such as department and position changes, to the often forgotten stage of de-provisioning. On-premises, these problems were solved by cobbled-together scripts or complex identity management solutions such as SailPoint IdentityIQ or Microsoft Identity Manager. While these tools were challenging to implement and operate, they did their job in the world of Windows Active Directory, LDAP, SQL databases, and the like.

Then came the cloud, and all bets were off. Identity data stores skyrocketed from less than a hundred to hundreds and sometimes thousands (B2C has exploded far beyond even that). Each new cloud service introduced into the enterprise introduced yet another identity management challenge. While some of these offerings have APIs that support identity management operations, most do not, and those that do are proprietary in nature. Writing custom code for each of the APIs is a huge challenge that most enterprises can’t keep up with. The result is often manual management of the identity life cycle, through uploading exported CSV files or some poor soul pointing and clicking a thousand times in a vendor portal.

Wouldn’t it be great if there were some mythical standard out there that would help to solve this problem, use a standard REST API, and support the JSON format? Turns out there is, and that standard is SCIM (System for Cross-domain Identity Management). You may be surprised to know the standard has been around for a while now (technically 2011). I recall hearing about it at a Gartner conference many, many years ago. Unfortunately, it’s taken a long time to catch on, but support is steadily increasing.

Thankfully for us, Microsoft has baked SCIM support into Azure AD and AWS recognized the value and took advantage of the feature. By doing this, the identity life cycle challenges of managing an Azure AD and AWS integration have been heavily remediated and our lives made easier.

Azure AD Provisioning - Example

Let’s take a look at how to set it up, shall we?

The first place you’ll need to go is into the AWS account which is the master for the organization and into the AWS SSO Settings.  In Settings you’ll see the provisioning option which is initially set as manual.  Select to enable automatic provisioning.

AWS SSO Settings - Provisioning

Once complete, a SCIM endpoint will be created. This is the endpoint in AWS (referred to as the SCIM service provider in the SCIM standard) that the SCIM service in Azure AD (referred to as the client in the SCIM standard) will interact with to search for, create, modify, and delete AWS users and groups. To interact with this endpoint, Azure AD must authenticate to it, which it does with a bearer access token that is issued by AWS SSO. Be aware that the access token has a one year life span, so ensure you set some type of reminder. A quick search through the boto3 API doesn’t show a way to query for issued access tokens (yes, you can issue more than one at a time), so you won’t be able to automate the process as of yet.
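
Since SCIM is just REST plus JSON, you can poke at the endpoint yourself with that bearer token. Here is a rough sketch using the standard SCIM /Users resource and a filter query; the endpoint URL and user name are placeholders, and I haven’t verified every filter expression AWS SSO supports.

import requests

SCIM_ENDPOINT = "https://scim.<region>.amazonaws.com/<instance>/scim/v2"  # from AWS SSO settings
ACCESS_TOKEN = "<bearer token issued by AWS SSO>"

# Standard SCIM query: list users, optionally filtered by userName.
resp = requests.get(
    f"{SCIM_ENDPOINT}/Users",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    params={"filter": 'userName eq "<user principal name>"'},
)
resp.raise_for_status()
for user in resp.json().get("Resources", []):
    print(user["id"], user["userName"], user.get("active"))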

awssso-scimendpoint.png

After SCIM is enabled, AWS SSO Settings for provisioning now reports SCIM in use.

awssso-scimenabled.png

Next you’ll need to bounce over to Azure AD and go into the enterprise app you created (refer to my third post for this process).   There you’ll navigate to the Provisioning blade and select Automatic as the provisioning method.

azuread-scimprov.png

You’ll then need to configure the URL and access token you collected from AWS and test the connection.  This will cause Azure AD to test querying the endpoint for a random user and group to validate functionality.

azuread-scimtest.png

If your test is successful you can then save the settings.

azuread-scimtestsucccess.PNG

You’re not done yet. Next you have to configure mappings which map attributes in Azure AD to the resources and attributes in the SCIM schema. Yes folks, SCIM does have a schema for attributes and resources (like users and groups). You can extend it as needed, but in this integration it looks to be using the default user and group resources.

azuread-scimmappings

Let’s take a look at what the group mappings look like.

azuread-scimgroupmappings.PNG

The attribute names on the left are the attributes in Azure AD, and the names on the right are the attributes in AWS SSO that Azure AD will write those values to. Nothing too surprising here.

How about the user mappings?

azuread-scimusermappings1azuread-scimusermappings2

Lots more attributes in the user mappings by default. Now, I’m not sure how many of these attributes AWS SSO supports. According to the SCIM standard, a client can attempt to write whatever it wants and any attributes the service provider doesn’t understand are simply discarded. The best list of attributes I could find was located here, and it’s nowhere near this number. I can’t speak to what the minimum required attributes are to make AWS work, because their official instructions on this integration don’t say. I know some of the product team sometimes reads the blog, so maybe we’ll luck out and someone will respond with that answer.

The one tweak you’ll need to make here is to delete the mailNickName mapping and replace it with a mapping of objectId to externalId.  After you make the change, click the save icon.

I don’t know why AWS requires this so I can only theorize.  Maybe they’re using this attribute as a primary key in the back end database or perhaps they’re using it to map the users to the groups?  I’m not sure how Azure AD is writing the members attribute over to AWS.  Maybe in the future I’ll throw together a basic app to visualize what the service provider end looks like.

newmapping.PNG

Now you need to decide what users and groups you want to sync to AWS SSO. Towards the bottom of the provisioning blade, you’ll see the option to toggle the provisioning status. The scope drop-down box has an option to sync all users and groups or to sync only assigned users and groups. Best practice here is basic security: only sync what you need to sync, so leave the option on sync only assigned users and groups.

The assigned users and groups refers to users that have been assigned to the enterprise application in Azure AD.  This is configured on the Users and Groups blade for the enterprise app.  I tested a few different scenarios using an Azure AD dynamic group, standard group, and a group synchronized from Windows AD.  All worked successfully and synchronized the relevant users over.

Once you’re happy with your settings, toggle the provisioning status and save the changes.  It may take some time depending on how much you’re syncing.

syncsuccess.PNG

If the sync is successful, you’ll be able to hop back over to AWS SSO and you’ll see your users and groups.

awssyncedusersawssyncedgroup

Microsoft’s official documentation does a great job explaining the end to end cycle. The short of it is there’s an initial cycle which grabs all users and groups from Azure AD, then filters the list down to the users and groups assigned to the application. From there it queries the target system for each object using the matching attribute; if the object isn’t found it creates it, and if it is found and needs updating, it updates it.
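
Conceptually the cycle boils down to the reconciliation loop sketched below. This is my own summary of the documented behavior using made-up helper names, not Microsoft’s implementation.

def provisioning_cycle(azure_ad, scim, assigned_only=True):
    # 1. Pull users and groups from Azure AD, then filter down to the
    #    objects assigned to the enterprise application.
    candidates = azure_ad.get_users_and_groups()
    if assigned_only:
        candidates = [obj for obj in candidates if obj.assigned_to_app]

    # 2. For each object, look it up in the target (AWS SSO) using the
    #    matching attribute; create it if absent, update it if it drifted.
    for obj in candidates:
        existing = scim.find_by_matching_attribute(obj.matching_value)
        if existing is None:
            scim.create(obj)
        elif existing.differs_from(obj):
            scim.update(existing, obj)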

Incremental cycles run every 40 minutes from that point forward. I couldn’t find any documentation on how to adjust the synchronization frequency. Be aware of that 40 minute sync and consider the end to end synchronization time if you’re sourcing from Windows Active Directory. In that case, making changes in Windows AD could take just over an hour (assuming you’re using the 30 minute sync interval in Azure AD Connect) to fully synchronize.

awsssotime.PNG

As I described in my third post, I have a lab environment setup where a Windows Active Directory domain is syncing to Azure AD.  I used that environment to play out a few scenarios.

In the first scenario I disabled Marge Simpson’s account.  After waiting some time for changes to synchronize across both platforms, I saw in AWS SSO that Marge Simpson was now disabled.

margedisabled.PNG

For another scenario, I removed Barney Gumble from the Network Operators Active Directory group. After waiting for the sync to complete, the Network Operators group is now empty, reflecting Barney’s removal from the group.

networkoperators.PNG

Recall that I assigned four groups to the app in Azure AD, Network Operators, Security Admins, Security Auditors, and Systems Operators.  These are the four groups syncing to AWS SSO.  Barney Gumble was only a member of the Network Operators group, which means removing him put him out of scope for the app assignment.  In AWS SSO, he now reports as being disabled.

barneydisabled.PNG

For our final scenario, let’s look at what happens when I deleted Barney Gumble from Windows Active Directory.  After waiting the required replication time, Barney Gumble’s user account was still present in AWS SSO, but set as disabled.  While Barney wouldn’t be able to login to AWS SSO, there would still be cleanup that would need to happen on the AWS SSO directory to remove the stale identity records.

barneydisabled.PNG

The last thing I want to cover is the logging capabilities of the SCIM service in Azure AD. There are two separate logs you can reference. The first is the Provisioning Logs, which are currently in preview. These logs are going to be your go-to for troubleshooting issues with the provisioning process. They’re available with an Azure AD P1 or above license and are kept for 30 days. Supposedly they’re kept for free for 7 days, but the documentation isn’t clear on whether or not you have the ability to consume them. I also couldn’t find any documentation on whether it’s possible to pull the logs from an API for longer term retention or analysis in Log Analytics or a 3rd party logging solution.

If you’ve ever used Azure AD, you’ll be familiar with the second source of logs.  In the Azure AD Audit logs, you get additional information, which while useful, is more catered to tracking the process vs troubleshooting the process like the provisioning logs.

Before I wrap up, let’s cover a few key findings:

  • The access token used to access the SCIM endpoint in AWS SSO has a one year lifetime.  There doesn’t seem to be a way to query what tokens have been issued by AWS SSO at this time, so you’ll need to manage the life cycle in another manner until the capability is introduced.
  • Users that are removed from the scope of the sync, either by unassigning them from the app or deleting their user object, become disabled in AWS SSO.  The records will need to be cleaned up via another process.
  • If synchronizing changes from a Windows AD the end to end synchronization process can take over an hour (30 minutes from Windows AD to Azure AD and 40 minutes from Azure AD to AWS SSO).

That will wrap up this post. In my opinion the SCIM service available in Azure AD is extremely underutilized. SCIM is a great specification that needs more love. While there is growing adoption from large enterprise software vendors, there is a real opportunity for your organization to take advantage of the features it offers in the same way AWS has. It can greatly ease the pain your customers and enterprise users experience having to manage the life cycle of an identity, and it makes for a nice belt and suspenders to modern identity capabilities in an application.

In the last post of my series I’ll demonstrate a few scenarios showing how simple the end to end experience is for users.  I’ll include some examples of how you can incorporate some of the advanced security features of Azure AD to help protect your multi-cloud experience.

See you next post!

 

Deep Dive into Azure AD and AWS SSO Integration – Part 3

Back for more are you?

Over the past few posts I’ve been covering the new integration between Azure AD and AWS SSO.  The first post covered high level concepts of both platforms and some of the problems with the initial integration which used the AWS app in the Azure Marketplace.  In the second post I provided a deep dive into the traditional integration with AWS using a non-Azure AD security token service like AD FS (Active Directory Federation Services), what the challenges were, how the new integration between Azure AD and AWS SSO addresses those challenges, and the components that make up both the traditional and the new solution.  If you haven’t read the prior posts, I highly recommend you at least read through the second post.

Azure AD and AWS SSO Integration

New Azure AD and AWS SSO Integration

In this post I’m going to get my hands dirty and step through the implementation steps to establish the SAML trust between the two platforms. I’ve set up a fairly simple lab environment in Azure. The lab environment consists of a single VNet (virtual network) with four virtual machines with the following functions:

  • dc1 – Windows Active Directory domain controller for jogcloud.com domain
  • adcs – Active Directory Certificate Services
  • aadc1 – Azure Active Directory Connect (AADC)
  • adfs1 – Active Directory Federation Services

AADC has been configured to synchronize to the jogcloud.com Azure Active Directory tenant.  I’ve configured federated authentication in Azure AD with the AD FS server acting as an identity provider and Windows Active Directory as the credential services provider.

visio of lab environment

Lab Environment

On the AWS side I have three AWS accounts setup associated with an AWS Organization.  AWS SSO has not yet been setup in the master account.

Let’s set it up, shall we?

The first thing you’ll need to do is log into the AWS Organization master account with a user that has the appropriate permissions to enable AWS SSO for the organization. If you’ve never enabled AWS SSO before, you’ll be greeted by the following screen.

1.png

Click the Enable AWS SSO button and let the magic happen in the background.  That magic is provisioning of a service-linked role for AWS SSO in each AWS account in the organization.  This role has a set of permissions which include the permission to write to the AWS IAM instance in the child account.  This is used to push the permission sets configured in AWS SSO to IAM roles in the accounts.

Screenshot of AWS SSO IAM Role

AWS SSO Service-Linked IAM Role

After about a minute (this could differ depending on how many AWS accounts you have associated with your organization), AWS SSO is enabled and you’re redirected to the page below.

Screenshot of AWS SSO successfully enabled page

AWS SSO Successfully Enabled

Now that AWS SSO has been configured, it’s time to hop over to the Azure Portal.  You’ll need to log into the portal as a user with sufficient permissions to register new enterprise applications.  Once logged in, go into the Azure Active Directory blade and select the Enterprise Applications option.

Register new Enterprise Application

Once the new blade opens select the New Application option.

Register new application

Choose the Non-gallery application option since we don’t want to use the AWS app in the Azure Marketplace due to the issues I covered in the first post.

Choose Non-gallery application

Name the application whatever you want, I went with AWS SSO to keep it simple.  The registration process will take a minute or two.

Registering application

Once the process is complete, you’ll want to open the new application, go to the Single sign-on menu item, and select the SAML option. This is the menu where you will configure the federated trust between your Azure AD tenant and AWS SSO on the Azure AD end.

SAML Configuration Menu

At this point you need to collect the federation metadata containing all the information necessary to register Azure AD with AWS SSO.  To make it easy, Azure AD provides you with a link to directly download the metadata.

Download federation metadata

Now that the new application is registered in Azure AD and you’ve gotten a copy of the federation metadata, you need to hop back over to AWS SSO.  Here you’ll need to go to Settings.  In the settings menu you can adjust the identity source, authentication, and provisioning methods for AWS SSO.  By default AWS SSO is set to use its own local directory as an identity source and itself for the other two options.

AWS SSO Settings

Next up, you select the Change option next to the identity source.  As seen in the screenshot below, AWS SSO can use its own local directory, an instance of Managed AD or BYOAD using the AD Connector, or an external identity provider (the new option).  Selecting the External Identity Provider option opens up the option to configure a SAML trust with AWS SSO.

Like any good authentication expert, you know that you need to configure the federated trust on both the identity provider and the service provider. To do this we need to get the federation metadata from AWS SSO, which AWS has been lovely enough to provide via a simple download link. Use it to grab a copy of the metadata, which we’ll later import into Azure AD.

Now you’ll need to upload the federation metadata you downloaded from Azure AD in the Identity provider metadata section.  This establishes the trust in AWS SSO for assertions created from Azure AD.  Click the Next: Review button and complete the process.

AWS SSO Identity Sources

Configure SAML trust

You’ll be asked to confirm changing the identity source. There are a few key points I want to call out on the confirmation page.

  • AWS SSO will preserve your existing users and assignments -> If you have created existing AWS SSO users in the local directory and permission sets to go along with them, they will remain even after you switch, but those users will no longer be able to log in.
  • All existing MFA configurations will be deleted when customer switches from AWS SSO to IdP. MFA policy controls will be managed on IdP -> Yes folks, you’ll now need to handle MFA. Thankfully you’re using Azure AD, so you have plenty of options there.
  • All items about provisioning – You have the option to manually provision identities into AWS SSO or use the SCIM endpoint to automatically provision accounts. I won’t be covering it, but I tested manual provisioning and the single sign-on aspect worked flawlessly. Know it’s an option if you opt to use another IdP that isn’t as fully featured as Azure AD.

Confirmation prompt

Because I had to, I popped open the federation metadata to see what AWS is requiring in the way of claims in the SAML assertion. In the screenshot below we see it is requesting a single NameID in the emailaddress format. The value of this claim will be used to map the user to the relevant identity in AWS SSO.
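
If you want to do the same, the metadata is plain SAML 2.0 XML and the requested NameID formats are easy to pull out with the standard library; the file name below is a placeholder for wherever you saved the AWS SSO metadata.

import xml.etree.ElementTree as ET

MD_NS = "{urn:oasis:names:tc:SAML:2.0:metadata}"

# Parse the downloaded AWS SSO service provider metadata and list the
# NameID formats it asks for.
tree = ET.parse("aws-sso-saml-metadata.xml")  # placeholder path
for nameid in tree.getroot().iter(f"{MD_NS}NameIDFormat"):
    print(nameid.text)
# Expect the email address format:
# urn:oasis:names:tc:SAML:1.1:nameid-format:emailAddress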

AWS SSO Metadata

Back to the Azure Portal once again where you’ll want to hop back to Single sign-on blade of the application you registered.  Here you’ll click the Upload metadata file button and upload the AWS metadata.

Uploading AWS federation metadata

After the upload is successful you’ll receive a confirmation screen. You can simply hit the Save button here and move on.

Confirming SAML

At this stage you’ve now registered your Azure AD tenant as an identity provider to AWS SSO. If you were using a non-Azure AD security token service, you could now manually provision your users into AWS SSO, create the necessary groups and permission sets, and administer away.

I’ll wrap up there and cover the SCIM provisioning in the next post.  To sum it up, in this post we configured AWS SSO in the AWS Organization and established the SAML federated trust between the Azure AD tenant and AWS SSO.

See you next post!

Deep Dive into Azure AD and AWS SSO Integration – Part 2

Welcome back folks.

Today I’ll be continuing my series on the new integration between Azure AD and AWS SSO.  In my last post I covered the challenges with the prior integration between the two platforms, core AWS concepts needed to understand the new integration, and how the new integration addresses the challenges of the prior integration.

In this post I’m going to give some more context to the challenges covered in the first post and then provide an overview of what the old and new patterns look like. This will help clarify the value proposition of the integration for those of you who may still not be convinced.

The two challenges I want to focus on are:

  1. The AWS app was designed to synchronize identity data between AWS and Azure AD for a single AWS account
  2. The SAML trust between Azure AD and an AWS account had to be established separately for each AWS account.

Challenge 1 was unique to the Azure Marketplace AWS app because it was attempting to solve the identity lifecycle management problem. Your security token service (STS) needs to pass a SAML assertion which includes the AWS IAM roles you are asserting for the user. Those roles need to be mapped to the user somewhere for your STS to tap into them. This is a problem you’re going to feel no matter what STS you use, so I give the team that put the AWS app together credit for trying.

The folks over at AWS came up with an elegant solution requiring some transformation of the claims passed in the SAML token and another solution to store the roles in commonly unused attributes in Active Directory. However, both solutions suffered the same problem in that you’re forced to maintain that mapping workaround, which becomes considerably more difficult as you begin to scale to hundreds of AWS accounts.

Challenge 2 plagues all STSs because the SAML trust needs to be created for each and every AWS account.  Again, something that begins to get challenging as you scale.

AWS Past Integration

In the image above, we see an example of how some enterprises addressed these problems.  We see that there is some STS in use acting as an identity provider (idP) (could be Azure AD, Okta, Ping, AD FS, whatever) that has a SAML trust with each AWS account.  The user to AWS IAM role mappings are included in an attribute of the user’s Active Directory user account.  When the user attempts to access AWS, the STS queries Active Directory for the information.  There is a custom process (manual or automated) that queries each AWS account for a list of AWS IAM Roles that are associated with the IdP in the AWS account.  These roles are then populated in the attribute for each relevant user account.  Lastly, CloudFormation is used to push IAM Roles to each AWS account.  This could be pushed through a manual process or a CI/CD pipeline.
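
For context on what that mapping actually carries, the standard AWS role claim (https://aws.amazon.com/SAML/Attributes/Role) holds comma-separated pairs of IAM role ARN and SAML provider ARN, one pair per role the user can assume. Here is a tiny sketch of building those values from a stored mapping; the account ID, role, and provider names are placeholders.

ROLE_CLAIM = "https://aws.amazon.com/SAML/Attributes/Role"

def build_role_claim_values(mappings):
    # mappings: (account_id, role_name, provider_name) tuples pulled from
    # wherever the user-to-role mapping is stored (e.g. an AD attribute).
    return [
        f"arn:aws:iam::{account}:role/{role},"
        f"arn:aws:iam::{account}:saml-provider/{provider}"
        for account, role, provider in mappings
    ]

print(build_role_claim_values([("123456789012", "SecurityAuditors", "AzureAD")]))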

Yeah this works, but who wants all that overhead?  Let’s look at the new method.

Azure AD and AWS SSO Integration

In the new integration where we use Azure AD and AWS SSO together, we now only need to establish a single SAML trust with AWS SSO.  Since AWS SSO is integrated with AWS Organizations it can be used as a centralized identity source for all AWS accounts within the organization.  Additionally, we can now leverage Azure AD to manage the synchronization of identity data (users and groups) from Azure AD to AWS SSO.  We then map our users or groups to permission sets (collections of IAM policies) in AWS SSO which are then provisioned as IAM roles in the relevant AWS accounts.  If we want to add a user to a role in AWS IAM, we can add that user to the relevant group in Azure AD and wait for the synchronization process to occur.  Once it’s complete, that user will have access to that IAM role in the relevant accounts.  A lot less work, right?

Let’s sum up what changes here:

  • We can use existing processes already in place to move users in and out of groups either on-premises in Windows AD (that is syncing to Azure AD with Azure AD Connect) or directly in Azure AD (if we’re not syncing from Windows AD).
  • Group to role mappings are now controlled in AWS SSO
  • Permission sets (or IAM policies for the IAM roles) are now centralized in AWS SSO
  • We no longer have to provision the IAM roles individually into each AWS account, we can centrally control it in AWS SSO

Cool right?

In my next few posts I’ll begin walking through the integration and demonstrating some of the solution.

Thanks!