Weighing in on Pull vs. Push

Posted August 20, 2010 by ggebel
Categories: Authorization

Bob Blakley certainly hit a nerve with his keynote presentation at Catalyst this year. He had been working on the concepts for his “Pull” identity architecture for some time and it was well received by the audience, sparking a lot of discussion and debate. Since the conference, we’ve witnessed a terrific continuation of the debate through the excellent posts by Nishant Kaushik and Ben Goodman. Nishant has argued in favor of “Pull” here and here, while Ben has taken an opposing view here and here.

This type of discussion often takes place when we speak to enterprises about adopting externalized authorization managers instead of relying on historical approaches – you rarely open up legacy applications unless there is a specific business reason to do so. However, as Nishant points out, enterprises realize the value and opportunity of moving forward with a “Pull” based approach. Existing models, while workable in many situations, may not be flexible enough for modern business organizations that need to operate in a more dynamic fashion while maintaining security and regulatory compliance.

A final point to make is this: for every application that adopts the “Pull” model, you have one less application that requires provisioning or data synchronization. I refer to this type of application as stateless, from an identity perspective. In this case, users don’t authenticate to the application; they authenticate via a service that may be hosted by the enterprise or an external entity – no extra accounts or credentials needed. For access control, the application calls out to an externalized authorization manager (EAM), where policies define what the user can do within the application. If additional attributes are needed, they can be loaded from existing authoritative sources by the EAM – no extra data synchronization or user provisioning is needed. This model will not work for every application or every scenario, but it is implementable today and many in the industry are enthusiastically adopting it. For applications that still require a monolithic approach, I agree with Ben that your IdM tools must indeed be very intelligent.
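To make the pattern concrete, here is a minimal sketch of what an identity-stateless application can look like. The AuthorizationService facade, attribute names, and identifiers below are illustrative inventions, not any particular product’s API; in a real deployment the PEP would marshal this call into a XACML request for the EAM’s policy decision point.

```java
import java.util.Map;

// Sketch of an identity-"stateless" application: no local accounts, no local
// role tables – every access decision is delegated to the EAM.
interface AuthorizationService {
    boolean isPermitted(String subjectId, String action, String resourceId,
                        Map<String, String> environment);
}

class ReportController {
    private final AuthorizationService authz;

    ReportController(AuthorizationService authz) {
        this.authz = authz;
    }

    // The subject identifier arrives from the external authentication service
    // (for example, a federation token); the application stores no credentials.
    String viewReport(String subjectId, String reportId) {
        boolean permit = authz.isPermitted(subjectId, "view", reportId,
                Map.of("channel", "web")); // further attributes are resolved by the EAM itself
        if (!permit) {
            throw new SecurityException("Denied by authorization policy");
        }
        return "contents of " + reportId;
    }

    public static void main(String[] args) {
        // Stub decision service that permits everything – a stand-in for a real PDP.
        AuthorizationService stub = (subject, action, resource, env) -> true;
        System.out.println(new ReportController(stub).viewReport("alice", "report-42"));
    }
}
```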


Instigating Again – XACML 3.0 Interop

Posted August 4, 2010 by ggebel
Categories: Standards, XACML

One of the points I made during my vendor lightning round session at Burton Catalyst last week was that the industry should be looking ahead to an XACML 3.0 interop in 2011, perhaps at the next Catalyst conference. Catalyst was the site of the first ever XACML interop demonstration back in 2007 and would be a great venue again next year. More vendors are expected to adopt version 3.0 once OASIS completes formal standardization (the specification is currently a committee draft and will shortly be voted on as a committee standard).

There are some basic usage scenarios that can be tested, such as taking policies authored in one vendor’s policy administration point (PAP) and evaluating them in another vendor’s policy decision point (PDP). Another frequently mentioned scenario is integrating one vendor’s PDP with other vendors’ policy enforcement points (PEPs). What scenarios are most important to you?
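These scenarios are testable largely because XACML 3.0 pins down the request and response formats that every conformant PDP must accept. As a rough illustration (wrapped in Java only for convenience), the sketch below prints the kind of minimal XACML 3.0 request one vendor’s PEP could submit to another vendor’s PDP; the subject, resource, and action values are made up, while the category and attribute identifiers come from the specification.

```java
// Prints a minimal XACML 3.0 request: who (subject) wants to do what (action)
// to which resource. Any conformant PDP should evaluate it against the
// deployed policies and return Permit, Deny, NotApplicable, or Indeterminate.
public class XacmlRequestExample {
    public static void main(String[] args) {
        String request =
            "<Request xmlns=\"urn:oasis:names:tc:xacml:3.0:core:schema:wd-17\"\n" +
            "         CombinedDecision=\"false\" ReturnPolicyIdList=\"false\">\n" +
            "  <Attributes Category=\"urn:oasis:names:tc:xacml:1.0:subject-category:access-subject\">\n" +
            "    <Attribute AttributeId=\"urn:oasis:names:tc:xacml:1.0:subject:subject-id\" IncludeInResult=\"false\">\n" +
            "      <AttributeValue DataType=\"http://www.w3.org/2001/XMLSchema#string\">alice</AttributeValue>\n" +
            "    </Attribute>\n" +
            "  </Attributes>\n" +
            "  <Attributes Category=\"urn:oasis:names:tc:xacml:3.0:attribute-category:resource\">\n" +
            "    <Attribute AttributeId=\"urn:oasis:names:tc:xacml:1.0:resource:resource-id\" IncludeInResult=\"false\">\n" +
            "      <AttributeValue DataType=\"http://www.w3.org/2001/XMLSchema#string\">purchase-order-123</AttributeValue>\n" +
            "    </Attribute>\n" +
            "  </Attributes>\n" +
            "  <Attributes Category=\"urn:oasis:names:tc:xacml:3.0:attribute-category:action\">\n" +
            "    <Attribute AttributeId=\"urn:oasis:names:tc:xacml:1.0:action:action-id\" IncludeInResult=\"false\">\n" +
            "      <AttributeValue DataType=\"http://www.w3.org/2001/XMLSchema#string\">approve</AttributeValue>\n" +
            "    </Attribute>\n" +
            "  </Attributes>\n" +
            "</Request>\n";
        System.out.println(request);
    }
}
```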

Another item to consider is whether the industry needs certified conformance testing of XACML products. This capability has been very valuable to the federation market, but there is a lot of ambiguity today for externalized authorization manager products. If vendor products were certified as conformant by an independent party, would that be valuable to you?

Finally, interoperability and standards conformance are more important than ever for the externalized authorization manager market. Demand is increasing from enterprises, SaaS vendors, cloud service providers, and others. These prospective implementers of XACML-based solutions must be confident of the functionality supported in commercial products and they should have a clear understanding of interoperability capabilities. That is why we are calling on other XACML vendors to join us in planning for the next interop event and also to seriously consider sponsoring a certification process.

image credit: http://instigate.co.uk/InstigateCop.jpg

Return to Catalyst

Posted August 2, 2010 by ggebel
Categories: Conferences

Last week marked my first visit to a Catalyst conference since departing from Burton Group earlier this year. Let’s just say it is a LOT more relaxing to be there as an attendee and speaker than as part of the production team!!

I found the latest Catalyst to be informative and entertaining, and it exuded a high level of energy – just what you want in a conference. In the identity management sessions, I appreciated the focus on externalized authorization, virtual directories, and federation. The Concordia workshop on authorization was well attended and showcased progress made in a number of areas in the recent past. The workshop also highlighted several areas where the industry can focus its energy going forward.

Burton Group has a great formula for the Catalyst Conference, and apparently Gartner agrees, since Catalyst 2011 in San Diego was announced last Thursday. I plan to be there – how about you?

Diagramming XACML Performance

Posted July 14, 2010 by ggebel
Categories: Authorization, Performance, XACML

In a previous post discussing XACML performance myth-busting, I described several areas in an XACML authorization system where performance issues can be addressed. Since then, my colleague David Brossard created the diagram below to illustrate potential performance bottlenecks.

To refresh your memory, here is the issue for each numbered item in the diagram (see the previous post for explanations); a small decision-caching sketch follows the list:

  1. Policy Retrieval
  2. Policy Matching
  3. Attribute Retrieval
  4. Decision Caching
  5. Multiple Requests
  6. PDP – PEP Interaction
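Of these, decision caching (item 4) is often the quickest win because every cache hit removes an entire PEP–PDP round trip (item 6). Below is a minimal sketch of a time-bounded decision cache at the PEP; the evaluate() placeholder stands in for the real XACML exchange, and the cache key and TTL are illustrative choices rather than a vendor recommendation.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Minimal sketch of item 4, decision caching at the PEP.
public class DecisionCache {
    private record Entry(boolean permit, long expiresAtMillis) {}

    private final Map<String, Entry> cache = new ConcurrentHashMap<>();
    private final long ttlMillis;

    public DecisionCache(long ttlMillis) {
        this.ttlMillis = ttlMillis;
    }

    public boolean isPermitted(String subject, String action, String resource) {
        String key = subject + "|" + action + "|" + resource;
        long now = System.currentTimeMillis();
        Entry hit = cache.get(key);
        if (hit != null && hit.expiresAtMillis() > now) {
            return hit.permit();                               // cached decision, no PDP round trip
        }
        boolean permit = evaluate(subject, action, resource);  // remote PDP call (item 6)
        cache.put(key, new Entry(permit, now + ttlMillis));
        return permit;
    }

    // Placeholder for the actual XACML request/response exchange with the PDP.
    private boolean evaluate(String subject, String action, String resource) {
        return false;
    }

    public static void main(String[] args) {
        DecisionCache pepCache = new DecisionCache(30_000);    // 30-second TTL
        System.out.println(pepCache.isPermitted("alice", "view", "report-42"));
        System.out.println(pepCache.isPermitted("alice", "view", "report-42")); // served from cache
    }
}
```

The obvious trade-off is freshness: a cached Permit can mask a policy change until the entry expires, so the TTL has to reflect how quickly policy updates must take effect.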

Concordia hosts Authorization Standards Workshop

Posted July 9, 2010 by ggebel
Categories: Authorization, Standards, Workshop

The Concordia Discussion Group is planning another workshop at Burton Catalyst North America, continuing a trend of providing timely and informative events. I have had the pleasure of participating in the past and will provide an update on what is new in XACML 3.0 this time around. XACML 3.0 is edging ever closer to formal standardization – and it contains several useful enhancements that are important for leading-edge as well as legacy application environments.

Information on the workshop can be found here. Admission is free – you just need to register with Dervla O’Reilly to attend. Hope to see you there!

The Anywhere Application Architecture

Posted June 8, 2010 by ggebel
Categories: Authorization

With this post, the long tail of commentary from the European Identity Conference continues. I came up with the term Anywhere Application Architecture while preparing my EIC keynote, as it captures a number of principles that architects must consider when deploying applications. Today, applications and the supporting infrastructure must be able to run anywhere – and be mobile enough to move between on-premises data centers, private clouds, and public clouds. Below is a graphic that represents this approach, from an authorization perspective.

Here’s how it works: a typical authorization service is depicted in the lower part of the graphic; it comprises a policy administration point (PAP), a policy decision point (PDP), and policy enforcement points (PEPs). In addition, authorization policies represent the business and security rules to be enforced, and the necessary attributes are retrieved through a policy information point (PIP) interface.

The traditional deployment of an authorization infrastructure is to install it on premises, alongside applications in your enterprise’s data center. A PEP would typically be integrated with your application (App A in this case). If your enterprise utilizes a private cloud hosting a web services application, then an XML gateway can serve as a super PEP to secure access to web services (in this case Service A).

If you are running workloads in the public cloud, the same authorization infrastructure can be extended. In our example, the XML gateway can protect publicly hosted web services (Service B) or you can choose to implement a PEP in the cloud (Service C). Finally, you may also choose to run part or all of your authorization infrastructure in the cloud – depending on the usage scenarios or requirements of your applications and users.
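One simple way to get that mobility is to treat the PDP location purely as configuration, so the application and PEP code stay the same whether the authorization service runs in your data center, a private cloud, or a public cloud. The sketch below assumes a hypothetical pdp.endpoint property and leaves the actual XACML exchange as a placeholder.

```java
import java.net.URI;

// Sketch of a location-independent PEP: the only thing that changes when the
// authorization service moves between environments is the configured PDP
// endpoint. The property name and URLs are illustrative, not a product's API.
public class ConfigurablePep {
    private final URI pdpEndpoint;

    public ConfigurablePep(URI pdpEndpoint) {
        this.pdpEndpoint = pdpEndpoint;
    }

    public boolean isPermitted(String subject, String action, String resource) {
        // In a real deployment this would POST a XACML request to pdpEndpoint
        // and parse the Permit/Deny response; here it is left as a placeholder.
        System.out.printf("Asking PDP at %s: may %s %s %s?%n",
                pdpEndpoint, subject, action, resource);
        return false;
    }

    public static void main(String[] args) {
        // e.g. run with -Dpdp.endpoint=https://authz.example.com/pdp (hypothetical)
        URI endpoint = URI.create(System.getProperty("pdp.endpoint",
                "https://pdp.internal.example.com/authorize"));
        new ConfigurablePep(endpoint).isPermitted("alice", "view", "purchase-order-123");
    }
}
```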

To reiterate, security architects and application planners must prepare for workloads that can run in the data center, in private clouds, or in public clouds – and they must be able to accommodate moving workloads between these environments. Therefore, your IdM infrastructure must have the same flexibility. In this example, we’ve shown how an XACML-based authorization system fits the bill. By the way, this approach integrates extremely well with federation as the authentication model – then you can also accommodate users that are located anywhere!

Why are security phrases a bad idea?

Posted May 19, 2010 by ggebel
Categories: KBA

The title of this post was the security pass phrase question I chose when registering for online access to a financial services firm. I called the aforementioned financial services firm today and, when prompted for the answer by the customer service representative, I responded “because you can’t remember the questions or answers.” Unfortunately, that was not the correct response to this knowledge based authentication (KBA) question. However, this experience is another example of why static KBA systems are a bad idea – usability. I registered for access so long ago that I can’t remember the response (I usually store these answers in my password safe, but did not in this instance). Being an identity-privacy zealot, I did not choose one of the usual KBA questions, like “What was the first car you owned?” or “Where did you go to high school?”

Why are static KBA systems still in use? They are a very weak link in the security chain, yet they are used by so many web sites, including supposedly security-conscious banking sites. In the age of Google and Facebook, the number of good KBA questions is effectively zero. I will be happy when my account with this financial institution is closed!

Small Vendor “Risk”

Posted May 6, 2010 by ggebel
Categories: IdM Market

What is the real risk of choosing a small, innovative vendor for your critical IT projects? On the one hand, you can purchase a product built by a vendor that specializes in a specific functional area, is passionate about customer success, and adjusts priorities to meet your particular business requirements. On the other hand, you can purchase an average product built by a vendor that has a product list bigger than the national debt, is passionate about its own profitability, and pressures customers to buy into a grand vision.

What choice do you make? Which option will ensure the success of your business objectives?

Stand Up For Standards

Posted May 6, 2010 by ggebel
Categories: Standards

Andre Durand’s keynote yesterday at the EIC conference contained many quotable quotes, but the one that stood out for me was, “Enterprises must stand up for standards.” Andre said this while describing the role pure play vendors have in the greater IT community – keeping the large vendors honest regarding their commitment to standards.

IdM standards offer the promise of interoperability, independence from vendor lock-in, and future-resilient systems. Does your vendor have the same perspective?

Enterprises can let vendors off the hook if they don’t inquire, specifically, as to how a product implements a particular standard. Do you ask, in advance, detailed questions about how completely a standard is supported? Does your vendor avoid directly answering these questions, or are they forthcoming?

During product evaluations, there are times when purchasers can compromise on any number of issues. Let’s face it: there are no perfect products available, and no vendor has anticipated every use case. Deadlines are looming, and we have to make a decision.

Ultimately, enterprises hold the power, and that influence lies in their budgets. Are vendors serious about their commitment to identity management standards? The answer can be found in your purchase order.

Some Business Stakeholders Might Want Your IT Project to Fail

Posted May 6, 2010 by ggebel
Categories: IdM Projects

What?! That was my response when hearing Dr. Rainer Janssen, CIO of Munich Re, deliver his keynote presentation at the EIC conference this week. Dr. Janssen offered many perspectives that I had not heard or thought of before. Many an article has been written, and many a presentation offered, that discusses “alignment” of business and IT. Most of these supposed tenets were turned upside down or inside out during a very interesting presentation.

Some business stakeholders might want your IT project to fail? Really? I thought we invited stakeholders to IT projects to make them succeed. The lesson here is: make sure you know the motivations of the stakeholders you invite to the project. The worst case is not knowing they might be pulling against you – until it’s too late.