Part Three: Enterprise Authorization Scenarios with James McGovern

Here is the third installment in a series of conversations I have had with James McGovern, enterprise architect extraordinaire. In this post, we expand the scope from insurance scenarios to include some broader enterprise contexts for externalized authorization.

JM: Over the last couple of years, I have had lots of fascinating conversations with architects at Fortune enterprises regarding their identity management deployments, and several common themes have emerged. One is that while they could do basic provisioning of an account in Active Directory, they couldn’t successfully provision many enterprise applications due to challenges that go beyond simplistic identity. Can XACML help get them to the next stage of maturity?

GG: Your question reminds me of the latest round of commentary regarding the futility of a provisioning approach to identity management. At the latest Gartner IAM summit in particular, speakers were lamenting the state of the provisioning market and how little progress has been made over the last 10 years. At the heart of the problem is the fact that, in the vast majority of deployments, provisioning tools just don’t have visibility into application privileges and entitlements. Instead, provisioning deployments tend to “skim the surface” by managing userIDs and passwords, but defer deep entitlement settings to the target application or platform. The most difficult applications to manage are those that don’t properly externalize identity management functions – making provisioning deployments more expensive as well as less than optimal.

Enter the “pull” model espoused by my former colleague, Bob Blakley. The basic premise of the pull model is that identity data is resolved at runtime by calling the appropriate service. If a user accesses an application before authenticating, redirect them to an authentication service. If a user accesses the application with a known token, redirect to the token service for proper handling. When the user attempts to perform a protected function, an authorization service should be called for evaluation.
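The pull model described above can be sketched as a thin dispatch layer that resolves identity at runtime by delegating to external services. This is a minimal illustration only; the service objects and their method names are hypothetical stand-ins, not a prescribed API.

```python
# Illustrative sketch of the "pull" model: the application holds no
# identity data itself and resolves everything at runtime by calling
# external services. All service interfaces here are hypothetical.

def handle_request(request, authn_service, token_service, authz_service):
    """Dispatch an incoming application request per the pull model."""
    if request.get("token") is None:
        # No credential yet: hand the user off to the authentication service.
        return authn_service.redirect_to_login(request)

    # A token is present: let the token service validate and translate it
    # into a subject the application can reason about.
    subject = token_service.resolve(request["token"])

    # Protected function: call the authorization service for a decision.
    if authz_service.is_permitted(subject, request["action"], request["resource"]):
        return {"status": "allowed", "subject": subject}
    return {"status": "denied", "subject": subject}
```

The point of the sketch is that nothing identity-related is provisioned into the application itself; swap any of the three services and the application code is untouched.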

As the reader may have surmised, the more an application externalizes identity – the less provisioning is required. Instead of provisioning accounts and entitlements to every application, a smaller number of authoritative identity service points are provisioned that can be leveraged by many applications. COTS applications would come preconfigured with policies for entitlements and authorization, instead of using a proprietary, embedded approach. To extend this further, access controls for COTS applications from different vendors can be implemented consistently – without excess access “leaks” – if they share a centralized access control model.

Therefore the ability to centrally describe the authorization model of an enterprise application would help. The challenge of identity management would change significantly in a number of ways. For example, enterprises would need to establish which identity services they would provision for the purposes of authentication – and which established, external identity providers they would consume. Authoritative attribute sources would fall into the same category. Finally, authorization policy modeling and management skills would become more prominent so that a normalized view could be attained across the enterprise.

JM: I remember conversations with my former boss at The Hartford where he asked me to explain the value proposition of identity management. He didn’t understand the value of spending millions of dollars for a system to tell him that James McGovern is still an employee. After all, he would know whether he fired me or not. What he wanted to know is what James McGovern could access if he decided to fire me. More importantly, even being in the role of Chief Security Architect, I couldn’t always figure out what I had access to.

GG: Sure, your boss would know whether he fired you or not – but what about all those independent insurance agents we’ve discussed in previous scenarios? Dealing with hundreds, thousands or millions of users and managing what they have access to is what drives organizations to spend significant sums on identity management. That said, there is often a budget imbalance because internal systems are more complex and expensive to operate than the applications serving external constituencies.

Determining what resources a particular user has access to, or who has access rights to a resource are questions that auditors, as well as system administrators, want answers to. Administrators need to know this detail so they can properly set up access for a new employee, contractor, customer, etc. Of course they also need to know this information so de-provisioning can also occur when the relationship is terminated. Auditors and regulators are responsible for ensuring that the organization is following internal business and security policies, as well as regulations or laws they may be subject to.

Current practices, where identity and policy are embedded in each business application, have proven to be very inefficient when attempting to audit the environment. It is not unusual for large organizations to have several hundred or a few thousand applications – imagine trying to audit such an environment on a regular basis when the identity information and policy are not externalized. The situation can be utterly insane if each application has its own store of user data, because then you also have a synchronization and reconciliation challenge. Herein lies one of the main value propositions of externalizing authorization: auditability and accountability are much easier to accomplish because you have a central place where policies are defined and maintained (although the enforcement of those policies can certainly be distributed). Further, when you combine externalized authorization with identity governance, you can achieve even more visibility and transparency into access controls.

JM: Many Architects have enterprise initiatives to reduce the number of signons to enterprise applications a user has to provide in any given day. Once they have implemented the solution to carry a token that provides seamless access between systems, they now discover that they have an authorization problem. What role should XACML play in an identity strategy?

GG: Sounds like that famous arcade game, whack-a-mole. As soon as you think you have solved one problem, a new one appears… The scenario you describe can occur if the architects have not fully mapped out their strategy or understood the full consequences (intended or otherwise) of the architecture. If you move to a tokenized authentication approach (such as SAML), then you have accomplished two worthy goals: reduced signon for users and fewer systems to provision credentials to.

However, as you point out, the application still needs to do authorization in some way. This could be accomplished if the application retains some kind of user store and keeps entitlement or personalization attributes about the user – at least the application is not storing or managing a credential. Thinking back to the issue of hundreds or thousands of applications, this doesn’t sound like a good solution for a number of reasons.

The preferred approach, if you have externalized authentication, is to also externalize authorization and utilize an XACML system. When the user presents their SSO token of choice to the application, it can call out to an XACML policy engine for an authorization decision (the engine can also be embedded in the application for the lowest latency). This is the approach we see more and more organizations taking.
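To make the callout concrete, here is a sketch of the request a PEP might assemble for the policy engine, shaped along the lines of XACML's attribute categories (subject, action, resource). The exact wire format and PDP endpoint are deployment-specific; treat this structure as illustrative rather than as the definitive XACML encoding.

```python
import json

def build_xacml_request(subject_id, action_id, resource_id):
    """Assemble an authorization request grouped into XACML-style
    attribute categories. The AttributeId URNs are the standard
    XACML identifiers; the overall JSON shape is illustrative."""
    return {
        "Request": {
            "AccessSubject": {"Attribute": [{
                "AttributeId": "urn:oasis:names:tc:xacml:1.0:subject:subject-id",
                "Value": subject_id}]},
            "Action": {"Attribute": [{
                "AttributeId": "urn:oasis:names:tc:xacml:1.0:action:action-id",
                "Value": action_id}]},
            "Resource": {"Attribute": [{
                "AttributeId": "urn:oasis:names:tc:xacml:1.0:resource:resource-id",
                "Value": resource_id}]},
        }
    }

# A PEP would send this to the PDP (transport and URL are
# deployment-specific) and act on the returned decision:
# Permit, Deny, NotApplicable, or Indeterminate.
request_body = json.dumps(build_xacml_request("alice", "view", "claim-4711"))
```

The subject identifier would typically be extracted from the validated SSO token rather than supplied directly, which is what keeps the application itself free of a user store.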

JM: The average Fortune insurance enterprise may have hundreds of enterprise applications where maybe only 20% are commercial off-the-shelf (COTS) products. Vendors such as Oracle, IBM and SAP are providing out-of-the-box integration with XACML in future releases of their ERP, CRM, BPM, ECM and Portal products. Other large enterprise vendors however seem to be missing in action. How do I fill in the gaps?

GG: This is where you need to rely on your authorization vendor to provide PEPs for COTS applications that don’t directly support XACML. In some cases, you can use a proxy, custom PEP code or an XML gateway (such as from Layer 7, Intel or Vordel) to intercept calls to the application. In other cases, a hybrid approach is necessary because the application cannot operate unless users are provisioned into certain roles or groups.
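An intercepting PEP of the proxy or gateway variety can be sketched as a thin middleware layer that consults a decision point before letting any call through. The sketch below uses the WSGI middleware pattern; the `pdp` callable is a hypothetical stand-in for a real XACML client, not a vendor API.

```python
# Sketch of a gateway-style PEP: WSGI middleware that intercepts every
# request and asks a policy decision point (PDP) before forwarding it.
# The `pdp` callable is a hypothetical stand-in for an XACML client.

def pep_middleware(app, pdp):
    def wrapped(environ, start_response):
        subject = environ.get("REMOTE_USER", "anonymous")
        decision = pdp(subject, environ["REQUEST_METHOD"], environ["PATH_INFO"])
        if decision != "Permit":
            # Anything other than an explicit Permit is refused.
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Denied by policy"]
        # Permitted: pass the call through to the protected application.
        return app(environ, start_response)
    return wrapped
```

Because the enforcement sits in front of the application, this pattern works even for COTS products that expose no authorization hooks of their own, which is exactly the gap an XML gateway fills.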

Ultimately application customers have a lot of influence with their vendors on what standards should be supported. Enterprises should use what leverage they have to encourage XACML adoption where appropriate – that leverage could come in the form of willingness to buy the application if standards are supported vs. building it internally if the required standards are not included.

JM: Many corporations have SoX controls not only around IT systems but also physical security. Does XACML have a value here?

GG: There is definitely a use case where XACML authorization is the policy engine for converged physical/logical access systems. We are seeing some interest in this capability in certain defense scenarios and are working with a physical access vendor on some prototypes. The idea is that access decisions are determined not only by typical logical access rules, but also by where you are located. For example, the batch job for printing insurance claim checks will only be released once I have badged into the print room.
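A converged rule like the print-room example combines a logical attribute (the operator's role) with a physical one (the last badge-in location). A minimal sketch, with entirely hypothetical attribute names, might look like:

```python
# Illustrative converged physical/logical access rule: the check-printing
# job is released only when the operator holds the right role AND their
# badge shows them physically in the print room. Attribute names are
# hypothetical, not drawn from any real policy.

def release_print_job(attributes):
    """Return True when both the logical and physical conditions hold."""
    return (attributes.get("role") == "claims-operator"
            and attributes.get("badge-location") == "print-room")
```

In an XACML deployment the badge location would arrive as just another attribute from a policy information point, which is what makes the physical and logical worlds composable in one policy.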

JM: So far, the conversation around identity has dominated many of the industry conferences and analyst research. The marketplace at large is blissfully ignorant of the challenges of managing entitlements within an enterprise context. What do you think will be the catalyst for this topic getting more airtime in front of IT executives?

GG: I think there are significant challenges that are forcing the issue of entitlements to the surface.

  1. The need to share: an organization’s most valuable and sensitive data is precisely the data that partners, customers, and suppliers want access to. The business imperative is to share this data securely.
  2. Overexposure of data: The counterpoint to the first item is that too much data is exposed – that is, sharing of sensitive data must be very granular so that the proper access is granted, but no more.
  3. Sensitivity of data: We are in an era of seemingly continuous incidents of personal data release – either accidentally or due to poor security controls. Insurance companies collect lots of personal data such as what car you drive, where you work, what valuables you have in your home, medical information, insurance policy data, workers compensation data, etc. All this data needs to be protected from improper disclosure.
  4. Moving workloads to the cloud: Regardless of all the hype around cloud computing, there is a strong drive to utilize the capabilities of this latest computing movement. What’s particular to entitlements surfaces in at least two areas. First, it is almost impossible to move workloads out of their traditional data center if entitlements and other IdM functions are “hard wired” into the application, because the application will cease to function. Second, once applications are moved to the cloud, you need to have a consistent way to enforce access – regardless of where the applications and data are hosted. This cries out for a common entitlement and authorization model that can be applied to all resources.

JM: I have some really wild scenarios in the back of my head on how XACML could enable better protections in relational databases, be used for implementing user privacy rights in enterprise applications, and even provide a way to do digital rights management. What are some of the more novel uses of XACML on your radar that you think the information security community should be thinking about?

GG: The XACML policy language and architectural model are incredibly flexible and applicable to many business scenarios. Databases pose a particular challenge, but there are certainly creative ways to address this, and it would be great to explore some of your ideas. Privacy scenarios have their own challenges because you can have legal restrictions on PII as well as user preferences to accommodate. At Axiomatics, we always welcome input from potential customers on their most challenging authorization scenarios to see how we can meet their requirements.

Biographies for James and Gerry:

James McGovern

James McGovern is currently employed by a leading consultancy and is responsible for defining next-generation solutions for Fortune enterprises. Most recently he was employed as an Enterprise Architect for The Hartford. Throughout his career, James has been responsible for leading key innovation initiatives. He is known as an author of several books focused on Enterprise Architecture, Service Oriented Architectures and Java Software Development. He is deeply passionate about topics including web application security, social media and agile software development.

James is a fanatic champion of work/life balance, corporate social responsibility and helping make poverty history. James heads the Hartford Chapter of the Open Web Application Security Project (OWASP) and contributes his information security expertise to many underserved non-profits. When not blogging or twittering, he spends time with his two kids (six and nine) who are currently training to be world champions in BMX and Jiu-Jitsu.

Gerry Gebel, President Axiomatics Americas

As president, Gerry Gebel is responsible for sales, customer support, marketing, and business development for the Americas region. In addition, he contributes to product strategy and manages partner relationships for Axiomatics.

Prior to joining Axiomatics, Gebel was vice president and service director for Burton Group’s identity management practice. Gebel authored or contributed to more than 70 reports and articles on topics such as authorization, federation, identity and access governance, user provisioning and other IdM topics. Gebel has also been instrumental in advancing the state of identity-based interoperability by leading demonstration projects for federation, entitlement management, and user-centric standards and specifications. In 2007, Gebel facilitated the first ever XACML interoperability demonstration at the Burton Group Catalyst conference.

In addition, Gebel has nearly 15 years’ experience in the financial services industry, including architecture development, engineering, integration, and support of Internet, distributed, and mainframe systems.
