Conjoint analysis (also known as discrete choice estimation or stated preference research) broadly has four main components:

- the attributes and levels that make up the product or service we want to test;
- a statistical design that chooses combinations of attributes and levels, converting them into product profiles that reflect the decision space;
- a choice method – usually direct choice, but it could include an estimation of volume (eg number of prescriptions for medical subjects), a ranking, a multiple selection (eg of items into a consideration set), or, in a more old-fashioned conjoint, a rating;
- a method of analysis and modelling – normally Hierarchical Bayes or maximum likelihood estimation.

Of these, the element that is most troublesome is the statistical design. With large numbers of attributes and levels, it is impossible to test all combinations, so we have to choose a subset (a fractional factorial design). This design process is a little opaque if you try to follow the literature or work out what is going on. In general, most academic papers, particularly those from earlier in the history of conjoint analysis, were interested in small designs: researchers would typically field only a handful of designs (sometimes just one) and wanted each design to be as small and efficient as possible.

This small number of designs would be taken from the statistics of experimental design – what are known as orthogonal designs or orthogonal arrays. The aim was to have the smallest manageable combination of potential profiles to test with respondents: rather than showing thousands of combinations, a much smaller number – typically fewer than 30 – could be shown, knowing that the statistical analysis at the end would be able to separate out the main effects from the design.

What researchers were looking for was a design that was balanced – in other words, each level in each attribute appears the same number of times – and one which was orthogonal – that is, if you take a pair of levels, one from one attribute and one from another attribute, each such pair appears the same number of times in the design.

For example, for a 2 x 3 design (one attribute with two levels, one with three) the possible pairs are:

- 1-1
- 1-2
- 1-3
- 2-1
- 2-2
- 2-3

Each of these pairs should then appear the same number of times in the design.
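These two properties are easy to verify mechanically. A minimal sketch (the 2 x 3 design and the helper functions are illustrative, not from any particular library): count level frequencies for balance, and pair frequencies for orthogonality.

```python
from collections import Counter

# Candidate design: each row is a profile (level of attribute A, level of attribute B).
# Hypothetical 2 x 3 example -- every A/B pair appears exactly once.
design = [
    (1, 1), (1, 2), (1, 3),
    (2, 1), (2, 2), (2, 3),
]

def is_balanced(design, col, n_levels):
    """Each level of the given attribute appears the same number of times."""
    counts = Counter(row[col] for row in design)
    return len(counts) == n_levels and len(set(counts.values())) == 1

def is_orthogonal(design, col_a, col_b):
    """Each pair of levels across the two attributes appears equally often."""
    counts = Counter((row[col_a], row[col_b]) for row in design)
    return len(set(counts.values())) == 1

print(is_balanced(design, 0, 2), is_balanced(design, 1, 3))  # True True
print(is_orthogonal(design, 0, 1))                           # True
```

With more attributes, `is_orthogonal` would be checked over every pair of columns.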

These designs are relatively complex to construct and have been the subject of a lot of academic work. There are large libraries of orthogonal designs available from SAS, and a number of algorithms for generating them.

One problem with orthogonal designs for conjoint is that in conjoint analysis we are typically looking at attributes with 4-5 levels each. Many published orthogonal designs were developed for smaller-scale industrial and lab research where attributes have 2, or possibly 3, levels, so many published designs are based on some combination of 2^n factors. These don't work so well for a broader marketing study – though there are academics who would prefer to use only 2- or 3-level attributes.

Where you have attributes with several levels, the orthogonality criterion tends to increase the size of the orthogonal set very rapidly. In a design with one attribute of 2 levels and a second of 3 levels, we must have at least 2 x 3 = 6 profiles in order to show each combination of the first and second attributes. If we add a third attribute with 2 levels, the smallest number of profiles becomes the lowest common multiple of 2 x 3 = 6 (first and second attributes), 3 x 2 = 6 (second and third) and 2 x 2 = 4 (first and third). In other words 12, the smallest number that is a multiple of both 6 and 4, is the minimum possible orthogonal design size.

For attributes with larger numbers of levels, this means the orthogonal set grows large quite quickly and may, at times, be the same size as the full factorial. Eg a 3 x 3 x 4 design needs a size divisible by both 3 x 4 = 12 and 3 x 3 = 9, giving 36 – the same size as the full factorial option. This is in contrast to, say, an apparently larger 3 x 3 x 3 x 3 design, where the minimum design size is potentially (and actually) only 9.
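This lower bound – the smallest N divisible by every pairwise product of level counts – can be computed directly. A small sketch (the function name is mine) reproducing the three examples above:

```python
from itertools import combinations
from math import lcm

def orthogonal_lower_bound(levels):
    """Smallest N divisible by every pairwise product of level counts --
    a necessary (not sufficient) size for a balanced orthogonal design."""
    return lcm(*(a * b for a, b in combinations(levels, 2)))

print(orthogonal_lower_bound([2, 3, 2]))     # 12
print(orthogonal_lower_bound([3, 3, 4]))     # 36 -- equals the full factorial
print(orthogonal_lower_bound([3, 3, 3, 3]))  # 9  -- far below the 81-run full factorial
```

Note this is only a necessary condition: an orthogonal array of that size must also actually exist.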

For a conjoint designer it can be complicated to come up with one perfect design. In addition, a single design may be too large for a single respondent. Orthogonality is also not the only criterion we might use: researchers often look at D-optimal designs, which aren't completely orthogonal but are high quality and maximise the information obtainable from the design. (And for designs aimed specifically at choice modelling using logit analysis, orthogonality may not be the best design criterion – the S-design is one alternative.)
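To make "D-optimal" concrete, one common measure of design quality is relative D-efficiency, often written as 100 × |X'X|^(1/p) / N for a coded design matrix X with N runs and p columns; it reaches 100 when the coded design is perfectly orthogonal. A minimal sketch (the 2^2 example and coding are illustrative):

```python
import numpy as np

def d_efficiency(X):
    """Relative D-efficiency of a coded design matrix X (N runs x p columns):
    100 * det(X'X)^(1/p) / N.  Equals 100 when X'X = N * I."""
    n, p = X.shape
    return 100 * np.linalg.det(X.T @ X) ** (1 / p) / n

# Hypothetical 2 x 2 full factorial with effects (+1/-1) coding plus intercept.
full = np.array([
    [1, -1, -1],
    [1, -1,  1],
    [1,  1, -1],
    [1,  1,  1],
])
print(round(d_efficiency(full), 1))  # 100.0 -- the full factorial is fully efficient
```

Dropping a run from `full` gives a smaller but less efficient design, which is exactly the trade-off D-optimal search algorithms negotiate.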

Typically in choice-based conjoint a respondent answers in the region of 10-12 choice tasks. But if the design actually contains 60+ profiles that need to be tested, the researcher handles this by spreading the 60+ profiles across a number of respondents. Effectively each individual sees part of the full orthogonal set – in other words, for an individual respondent the choices they are offered often aren't orthogonal.

One of the big differences, though, between conjoint market research and standard experimental design is that in survey research we are able to make many more observations. A scientist in a lab wants the minimum design because he or she only wants to carry out the experiment once, in the minimum amount of time (and at minimum cost). In survey research, though, we potentially have thousands of respondents, each of whom can see a different profile set. Obviously we want to minimise the workload per respondent and avoid potential bias, but this means we aren't limited to fielding a single design – we can use multiple designs and much larger design sets as a result.

We still want balance and orthogonality, because it's important that the profiles we show don't confound the design – for instance by always showing one level together with another level which is very popular. For this reason some design strategies, like those used by Sawtooth, are based more on taking subsets of the full-factorial design (complete enumeration) and creating questionnaire versions that span the range of possible options. No single version is perfectly orthogonal – they're typically too small – but as a set across all respondents the design is orthogonal or near-orthogonal by virtue of spanning the entire full-factorial space. The use of large samples also allows the number of choice tasks per respondent to be minimised – for instance asking only 4 tasks per respondent to make the questionnaire much more manageable (the downside being that Hierarchical Bayes estimation of individualised utilities becomes harder).
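The idea of pooling coverage across respondents can be sketched very simply. This is a deliberately simplified illustration of the versioning strategy described above, not Sawtooth's actual algorithm: enumerate the full factorial, shuffle it, and deal profiles out into questionnaire versions so that together the versions cover the whole space.

```python
import random
from itertools import product

# Hypothetical 3 x 4 x 5 attribute structure -> 60 profiles in the full factorial.
levels = [3, 4, 5]
full_factorial = list(product(*(range(n) for n in levels)))

random.seed(1)
random.shuffle(full_factorial)

# Deal the shuffled profiles into versions of 12 profiles each.
profiles_per_version = 12
versions = [full_factorial[i:i + profiles_per_version]
            for i in range(0, len(full_factorial), profiles_per_version)]

print(len(versions))  # 5 versions; no single one is orthogonal,
                      # but pooled they cover every profile exactly once
```

A production system would also balance level counts within each version and rotate profiles into choice sets, but the pooling principle is the same.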

One of the more challenging directions for conjoint, though, is creating more dynamic and larger choice tasks. Conjoint is typically limited to a smaller number of attributes (typically fewer than 8); the problem with larger designs lies both in the potential number of combinations to be shown and in information overload for the respondent.

However, if you look online at sites such as insurance comparison shopping sites, there are often many more of what we would call attributes and levels than would normally be considered feasible for conjoint analysis. A dynamic conjoint task is one that allows the respondent to use tools like search and filters to build consideration sets and ultimately select preferred options.

As we can't tell in advance what the respondent will use to search and filter, the profiles need to be created more dynamically, allowing attributes or levels to disappear or be recombined to match the broad set of options we'd like to show, making the study as realistic as possible. A pre-prepared orthogonal design is optimal and efficient, but if the number of attributes and levels changes (eg because the respondent filters or sorts out attributes or levels, or adds a criterion), an original fixed design will break. For bigger and more flexible conjoint tasks, the strategy of creating dynamic designs that respond to changes in the parameters becomes important.
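One way to think about such dynamic generation is to build each profile on the fly from whatever levels the respondent's filters leave active, tracking how often each level has been shown to keep the evolving design near-balanced. The attribute names, data structures and least-shown heuristic below are all my own illustrative assumptions, not a published algorithm:

```python
import random

# Hypothetical attribute structure for a comparison-site style task.
attribute_levels = {
    "price": ["low", "mid", "high"],
    "brand": ["A", "B", "C", "D"],
    "warranty": ["1yr", "2yr"],
}
# Running count of how often each level has been shown so far.
shown = {attr: {lv: 0 for lv in lvs} for attr, lvs in attribute_levels.items()}

def next_profile(active_filters):
    """Build one profile from the levels the filters still allow, preferring
    the least-shown level of each attribute (random tie-break) for balance."""
    profile = {}
    for attr, levels in attribute_levels.items():
        allowed = [lv for lv in levels if lv not in active_filters.get(attr, ())]
        if not allowed:  # attribute filtered out entirely -> drop it
            continue
        level = min(allowed, key=lambda lv: (shown[attr][lv], random.random()))
        shown[attr][level] += 1
        profile[attr] = level
    return profile

random.seed(0)
print(next_profile({"brand": {"C", "D"}}))  # only brands A or B can appear
```

Because balance is maintained incrementally rather than pre-computed, the design degrades gracefully when the respondent changes their filters mid-task.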
