Customer Portals Research Findings

Refining Our Evaluation Framework

May 4, 2006

We have completed the product evaluation phase of our research on customer portals, having evaluated seven portal platforms against the criteria of our framework. Along the way, we have refined our framework to match customer requirements, to reflect portal platform capabilities, and to improve the consistency, accuracy, effectiveness, and usability of our evaluations. The refinements summarize the key findings of our research and identify the key differentiators among the products. We present and discuss those refinements in this report.

NETTING IT OUT

We have completed the product evaluation phase of our research on the use of portal technology platforms for customer portals. Specifically, we have evaluated these platforms against our framework:

* BEA WebLogic Portal
* IBM WebSphere Portal
* Liferay Portal
* Microsoft SharePoint Services and SharePoint Portal Server
* Oracle Application Server Portal
* SAP NetWeaver Portal
* Vignette Portal

Along the way, we have refined our framework to match customer requirements, to reflect portal platform capabilities, and to improve the consistency, accuracy, effectiveness, and usability of our evaluations.

Before we begin the next phase of our research, comparing the platforms, we’d like to present and explain the refinements to our evaluation framework and to discuss the rationale behind them. The refinements summarize the key findings of our research and identify the key differentiators among the products.

RESEARCH ON CUSTOMER PORTALS

For the past several months, we’ve been busy with research on customer portals. As is typical of our software product research, we follow these phases:

* Specification of evaluation framework
* Product evaluations
* Product comparisons
* Best practices and case studies

We published our evaluation framework just about a year ago, on May 12, 2005. Since then, we’ve published our evaluations of these seven portal technology platforms against that framework:

* BEA WebLogic Portal
* IBM WebSphere Portal
* Liferay Portal
* Microsoft SharePoint Services and SharePoint Portal Server
* Oracle Application Server Portal
* SAP NetWeaver Portal
* Vignette Portal

Refining Our Evaluation Framework

Now, we’re just about ready to compare the platforms. But before we do, we want to re-publish the evaluation framework because, as we’ve conducted our product research, we’ve refined it. We have been true to the framework in our evaluations: each product evaluation includes our assessment against every criterion in the framework. To compare the products most effectively, however, we’d like to make a few refinements to those criteria, to spell out exactly what those refinements are, and to explain why we’re making them. We believe these refinements will make our comparisons easier to understand and more actionable. Briefly, listed below are the top-level evaluation criteria of the framework and the refinements that we’ve made to them to facilitate product comparisons:

* PORTLETS. Change emphasis on architecture evaluation.

* CUSTOMER PROFILE MANAGEMENT. Drop. This criterion doesn’t differentiate.

* PROCESS MANAGEMENT SERVICES. Drop. This criterion doesn’t differentiate.

* CONTENT MANAGEMENT SERVICES. Expand to treat UI content and document content separately and to include evaluation of taxonomy.

* SEARCH. Expand to include evaluation of taxonomy. Move evaluation of navigation to the evaluation of UI content.

* PERSONALIZATION SERVICES. No change.

* COLLABORATION SERVICES. No change.

* ANALYTIC CAPABILITIES. Contract to evaluate all analytics within a single evaluation criterion.

* ARCHITECTURE. No change.

* PRODUCT VIABILITY. No change.

* COMPANY VIABILITY. Drop. This criterion doesn’t differentiate.

Refinement is an apt term to describe the changes that we’ve made. Our initial proposition that portals are essentially content and services has been validated. Portlets and content management services were initially our top two evaluation criteria, and they remain so after refining the framework. The most significant changes are in the areas of customer profile management, analytic functionality, and company viability.

A Continuous Feedback-Loop Process

We never view our initial specification of a framework as inviolate. We expect to change our frameworks over time as technology evolves, as requirements for technology adoption and application change, and as our understanding of these factors improves. Just as installations follow a continuous feedback-loop process to analyze and refine their technology implementations, we continuously analyze and refine our evaluation frameworks.

Refinement Details Summarize Research and Analyze Key Differentiators

The eleven tables below, one for each of the top-level evaluation criteria, present the details of the refinements to our evaluation framework for customer portals. In each table, we describe the criterion and its sub-criteria, discuss our research findings, and describe the refinements resulting from the research. The research findings provide useful information in their own right. They’re summaries of our research and analyses of key differentiators.

Please download the PDF version to see the tables.