PSGroup Bull's-Eye: Ecommerce Search Solutions

Featuring Celebros, Endeca, Fredhopper, Mercado, and SLI Systems

July 24, 2008

Our detailed review of five leading ecommerce search offerings from Celebros, Endeca, Fredhopper, Mercado, and SLI Systems has produced both a side-by-side comparison of the products against our 230 criteria and a ranking of the solutions based on their performance against those criteria. The real bottom line is that each of these solutions is well worth considering. All five products performed very strongly.

NETTING IT OUT

How do the five leading ecommerce search engines stack up? Between December 2005 and March 2008, our search and findability expert, Susan Aldrich, reviewed five leading ecommerce search solutions, from Celebros, Endeca, Fredhopper, Mercado, and SLI Systems, using her in-depth Enterprise Search Planning and Evaluation Framework. In this Bull's-Eye comparative ranking report we evaluate and rate the five products according to how well they addressed the 230 criteria in our framework. What surprised us most is how well all five solutions address all of Sue's criteria. There were only about 65 criteria where the solutions differed.

The report detailing our findings and presenting our enterprise search evaluation framework, now in Version 3, is available at no charge at our Web site. The lone category where criteria were missed by all products was Marketing Management. None of the products supports defining product bundles via the merchandiser interface. In the area of integrated marketing, none of the products provides integrated reporting of end-to-end search activity, conversion, and ROI. Only one vendor, SLI Systems, provides tools to manage paid search campaigns based on site search activity.

We apply our Customer Scenario ® methodology to determine the requirements on which we base our evaluation criteria. We see two distinct groups of people whose success depends on search capabilities. The first consists of seekers, those people who are your customers, partners, employees, and other stakeholders. The second group consists of information owners who have a variety of goals, from quality of customer experience to cost cutting to relationship deepening. We address the goals and metrics of these stakeholders in our analysis of the planning and evaluation criteria for enterprise search solutions. The detailed evaluations of each product are available for free to subscribers and can be purchased and downloaded by non-subscribers.

The comparison spreadsheet containing our evaluations of these five products against all 230 criteria is also available to subscribers or available for purchase and download. We provide this comparison in editable form so that you can add your own evaluations and/or assess other products (or your own home-grown solution) against our criteria.

The solutions themselves vary in age from 4Q2005 to 1Q2008. Not surprisingly, the oldest solution performed most poorly. Surprisingly, the newest did not rank the highest. Fredhopper Access Server V6 and Endeca IAP V.1 tied for first place. Mercado 4 and SLI Systems Learning Search January 2008 tied for second place. Celebros Salesperson V4 came in fifth. But, to be fair, the Celebros version we reviewed is not the latest version, which would have ranked better. We hope to update our oldest in-depth evaluations before the end of the year and will re-rank these products as appropriate. Also, bear in mind that SLI Systems is a SaaS offering, while the other four products are available in both in-house installable and hosted versions, so there is an apples-and-oranges aspect to some of the criteria.

An important caveat to remember in reviewing this report is that two of the most important criteria are beyond our capacity to measure, since there are no acceptable benchmarks: the effectiveness of the retrieval and ranking functions and the scalability of the solution. So you’ll need to run your own trials using your own e-commerce content to test and to tune search effectiveness and scalability.

OVERVIEW OF THE EVALUATION PROCESS

During the past 18 months, I evaluated five ecommerce search solutions using my Enterprise Search Evaluation Framework. This report analyzes a detailed, side-by-side comparison of 230 framework criteria for each of five products. The five products are: Celebros Salesperson V4, Endeca IAP, Fredhopper Access Server V6, Mercado 4, and SLI Systems Learning Search (January 2008 release).

It took me too long to wrap up this project. There were a few delays beyond my control, a few delays I reluctantly agreed to, and a few that were my own fault. Today I am ruefully facing the music: because of the elapsed time, two of the products on this list have a new version number and the comparison is therefore somewhat dated. But I think it’s time to publish and be damned. Literally, perhaps, in this case, as people involved in developing the products I review here will no doubt have a few choice words for me, including suggestions for my future residence. It is my goal to update the product reviews and publish a new comparison by year end, mollifying the currently enraged but enflaming the currently mollified. I hope I’m not too optimistic: achieving my target requires the commitment of the vendors involved, and their current good intentions must survive the pressures of year-end sales cycles.

Keep in mind as you read through this report that the age of the products reviewed spans two years, a very long time in the rapidly moving search space. Table A contrasts the product versions I reviewed against the current versions.

Version Summary
Please download the PDF to see the table.
Table A. This table presents the versions that are reviewed in this report, compared to the current and planned releases.

Step 1: Reviews of Each Product

Using my evaluation framework, over the past 18 months I produced five detailed product reviews that assess how each solution addresses each of the framework criteria. The individual, detailed reports analyzing each product are available on our Web site.

Step 2: Side-by-Side Comparison

My esteemed colleague, Fanny Wong, pulled the tables from each of the individual reports to create an Excel spreadsheet with a side-by-side comparison of the 230 criteria for all five products.
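The assembly step amounts to pivoting five per-product criterion tables into one table keyed by criterion. The sketch below is purely illustrative (the actual comparison was built in Excel); the function name and the sample criterion entries are invented, and only the product names come from the report.

```python
# Hypothetical sketch of the comparison-assembly step: pivot per-product
# {criterion: evaluation} tables into one {criterion: {product: evaluation}}
# side-by-side table, leaving a blank where a product's table has no entry.

def build_comparison(per_product_tables):
    """per_product_tables: {product: {criterion: evaluation}}.
    Returns {criterion: {product: evaluation}}, '' for missing entries."""
    products = list(per_product_tables)
    criteria = sorted({c for t in per_product_tables.values() for c in t})
    return {
        crit: {p: per_product_tables[p].get(crit, "") for p in products}
        for crit in criteria
    }

if __name__ == "__main__":
    # Sample data for illustration only; not the report's actual evaluations.
    tables = {
        "Endeca": {"Faceted navigation": "Yes", "Spell correction": "Yes"},
        "SLI Systems": {"Faceted navigation": "Yes", "Paid-search tools": "Yes"},
    }
    for criterion, row in build_comparison(tables).items():
        print(criterion, row)
```

In a real workflow each inner dictionary would be loaded from one product's review tables; the pivot then makes every criterion a single row across all five products, which is exactly the shape a side-by-side spreadsheet needs.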

Performance by Product

© 2008 Patricia Seybold Group Inc.

Illustration 2. Each solution’s performance in the five major technical categories is presented in this chart. The total points possible in each technical category are provided next to the category name.

Step 3: Scoring and Ranking

I used a simple system of allotting one point per criterion, where possible. Not all criteria are designed such that they can be met or measured. For example, "What is the pricing model for the product?" is a framework element that is not scored. The products with more points have the higher ranking...

 


