Underpinning conjoint analysis design is the use of a fractional factorial design, which reduces the number of profiles that need to be shown while maximising the amount of statistical information that can be collected. Design efficiency (D-efficiency) is one way of assessing the quality of a design, but researchers should also look at the design from the respondent's point of view.

A fractional design is needed because a 'full-factorial' design becomes extremely large as the number of attributes and levels increases. A design with 4 attributes, each with 3 levels, requires 3x3x3x3 = 81 profiles for a full factorial design. A fractional factorial design can reduce this to just 9 profiles.

The initial theory of conjoint analysis used a method called an 'orthogonal array' to determine the most efficient design. The benefit of an efficient design is that you get better estimates of the parameters (i.e. part-worths, betas or utilities) from the smallest possible sample. In other words, a better design reduces the sample size required for a given level of accuracy.
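The 81-to-9 reduction above corresponds to the classic L9 orthogonal array for four 3-level attributes. As a minimal sketch, it can be constructed with modular arithmetic (the attribute names and levels are placeholders; real studies would map these to product features):

```python
from itertools import product

# Classic L9 orthogonal array: 9 profiles for 4 attributes with 3 levels
# each, instead of the 3**4 = 81 profiles of the full factorial.
# Rows are indexed by (a, b); columns are a, b, (a+b) % 3, (a+2b) % 3.
def l9_orthogonal_array():
    return [(a, b, (a + b) % 3, (a + 2 * b) % 3)
            for a, b in product(range(3), repeat=2)]

profiles = l9_orthogonal_array()
print(len(profiles))  # 9 profiles rather than 81

# Orthogonality check: for every pair of attributes, each of the 9
# possible level combinations appears exactly once across the design.
for i in range(4):
    for j in range(i + 1, 4):
        pairs = sorted((row[i], row[j]) for row in profiles)
        assert pairs == sorted(product(range(3), repeat=2))
```

The pairwise-balance property checked at the end is what makes the array orthogonal: every attribute level is seen equally often against every level of every other attribute, so main effects can be estimated without confounding.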

The choice of fractional-factorial method directly affects the potential efficiency of the design (via the design's information matrix). Note that orthogonality and efficiency are not the same thing: a design does not have to be orthogonal to be efficient, and some highly efficient designs are not orthogonal.
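To make the link between the information matrix and efficiency concrete, here is a minimal sketch of the usual D-efficiency calculation for a coded design matrix (the effects coding and the small example design are illustrative assumptions, not from the original text):

```python
import numpy as np

# D-efficiency of a coded design matrix X, using the common normalisation
# D-eff = det(X'X / N) ** (1/p), where N is the number of runs and p the
# number of model parameters (columns of X).
def d_efficiency(X):
    n, p = X.shape
    info = X.T @ X / n              # scaled information matrix
    return np.linalg.det(info) ** (1.0 / p)

# An orthogonal, balanced 2x2 full factorial (effects coded, with an
# intercept column) achieves the maximum D-efficiency of 1.
X = np.array([[1,  1,  1],
              [1,  1, -1],
              [1, -1,  1],
              [1, -1, -1]], dtype=float)
eff = d_efficiency(X)   # approximately 1.0 for this orthogonal design
```

A design with correlated columns would have an information matrix with a smaller determinant, and hence a D-efficiency below 1, which is how non-orthogonal designs can still be compared on a common scale.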

However, two competing schools of thought have emerged about overall design. From the economics field, researchers who use 'stated preference' or 'discrete choice' experiments (alternative names for forms of choice-based conjoint) have been experimenting with different ways of generating designs. For these researchers, conjoint data is typically analysed through Maximum Likelihood Estimation (MLE), usually a logit model across the sample as a whole, and so the design is extremely important in determining the parameter estimates. These researchers have developed algorithms to create what are known as D-efficient designs. However, evidence suggests that even though a D-efficient design can in theory recover the parameters from just a handful of respondents (under 10), in practice at least 20-30 observations are required.
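The algorithms these researchers use vary, but many are exchange-type searches. As a hedged illustration only (not any specific published algorithm), here is a sketch of a Fedorov-style random exchange: start from a random fraction of the 3^4 full factorial from the earlier example and greedily swap runs while the log-determinant of the information matrix improves. The effects coding, run count of 12, and iteration budget are all assumptions for the sketch:

```python
import numpy as np
from itertools import product

# Effects coding for one 3-level attribute: levels 0 and 1 get an
# indicator column each; the last level is coded -1 on both columns.
def effects_code(level, n_levels=3):
    row = [0.0] * (n_levels - 1)
    if level == n_levels - 1:
        row = [-1.0] * (n_levels - 1)
    else:
        row[level] = 1.0
    return row

def model_row(profile):
    cols = [1.0]                      # intercept
    for lvl in profile:
        cols += effects_code(lvl)
    return cols

# 81 candidate profiles, 1 + 4*2 = 9 model parameters.
X_all = np.array([model_row(p) for p in product(range(3), repeat=4)])

def log_det(idx):
    X = X_all[idx]
    sign, ld = np.linalg.slogdet(X.T @ X)
    return ld if sign > 0 else -np.inf  # singular designs score -inf

rng = np.random.default_rng(0)
idx = list(rng.choice(len(X_all), size=12, replace=False))
start = log_det(idx)
for _ in range(500):                  # random single-run exchanges
    trial = idx.copy()
    trial[rng.integers(len(idx))] = rng.integers(len(X_all))
    if log_det(trial) > log_det(idx): # keep only strict improvements
        idx = trial
```

Because only strictly improving swaps are accepted, the final design is never worse than the starting one; production tools use more systematic candidate-set exchanges, but the objective (maximising the determinant of the information matrix) is the same.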

The second stream of thought comes from the market research community, typically led by Sawtooth. With the advent of online research, it has become much easier to run studies with larger samples and multiple designs. Market researchers have also observed that some efficient designs are not 'respondent-friendly', as each task is unconnected with the answers to any other task. They are also increasingly using hierarchical Bayes as an estimation method, in order to estimate individual-level values (e.g. for use in segmentation or drill-down analysis).

As a result, the market research community is looking at designs which are not necessarily efficient, but may be adaptive. For instance, allowing some overlap of levels between alternatives appears to improve respondent decision making (reducing shortcuts and heuristics). Designs with overlap are naturally not statistically efficient, but if they improve the quality of respondents' answers then they may well be preferred. Inefficient designs are normally penalised because they require larger sample sizes for estimation, but online, larger samples are easier and much more cost-effective to achieve. Market researchers also use many more design versions than the economists (who typically use a single design, or a handful of designs, across the whole sample). Using multiple design versions allows more combinations to be included and therefore covers a greater range of the combinatorial space.
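A per-respondent randomised design with level overlap can be sketched as follows. This is a minimal, assumption-laden illustration (the attribute structure reuses the 4-attributes-by-3-levels example from earlier; the task and alternative counts are placeholders), not Sawtooth's or any vendor's actual generator:

```python
import random

# One respondent's design: tasks of 3 alternatives drawn independently,
# so two alternatives in a task may share a level on an attribute
# (deliberate overlap), and each respondent sees a different design.
N_ATTRIBUTES, N_LEVELS = 4, 3

def random_profile(rng):
    return tuple(rng.randrange(N_LEVELS) for _ in range(N_ATTRIBUTES))

def choice_task(rng, n_alternatives=3):
    return [random_profile(rng) for _ in range(n_alternatives)]

def overlap_count(task):
    # number of attributes on which at least two alternatives share a level
    return sum(len({alt[a] for alt in task}) < len(task)
               for a in range(N_ATTRIBUTES))

rng = random.Random(42)
design = [choice_task(rng) for _ in range(10)]   # 10 tasks per respondent
```

Generating a fresh randomised design per respondent is what lets the sample as a whole cover far more of the combinatorial space than a single fixed fraction, at the cost of per-design efficiency, which is exactly the trade-off described above.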

The choice of design method will depend on the theoretical underpinnings, but most practitioners now use larger samples and many more design versions. However, in specialist markets (e.g. pharma), where samples are typically smaller, more care may be needed over the D-efficiency of the design. For more information on Conjoint Analysis see our main site www.dobney.com.
