Correlated Component Regression
Our CCR methodology is revolutionising predictive model development, especially on challenging datasets with many correlated predictors and small samples.
Correlated Component Regression is a form of constrained (regularised) regression. It was originally designed for analysing data with too many highly correlated predictors and too few observations, a situation that is becoming increasingly common in survey research.
Although it was designed to cope with such challenging datasets, we have found in practice that it also outperforms standard regression on most data sets, and it is now used widely in our consultancy work.
It has a number of unique features:
- Creates an optimal number of correlated components from the raw predictors. These components are composite predictors, each defined as a weighted sum of the raw predictor scores for a case. Finding the optimal number of components gives much more stable prediction across different samples than conventional regression.
- Uses a unique cross-validation algorithm to choose the model with the optimal numbers of predictors and components, based on performance on "hold-out" cases.
- Uses a "step-down" algorithm to select the optimal number of predictors.
- The underlying algorithm can be adapted to any family of models, from linear models through logistic regression, linear discriminant analysis (LDA), survival analysis, and ordinal logistic and multinomial modelling. Many of these models are already implemented, and we are extending the method to the rest.
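To make the first feature concrete, the sketch below builds a single composite predictor in the CCR spirit for a linear model: each predictor is weighted by its one-predictor (univariate) regression slope, and the component is the weighted sum of the predictor scores for each case. This is an illustrative simplification, not the full CCR algorithm; all names and data are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Small sample with many correlated predictors (they share a common factor).
n, p = 30, 10
base = rng.normal(size=(n, 1))
X = base + 0.5 * rng.normal(size=(n, p))
y = X @ (np.ones(p) / p) + 0.1 * rng.normal(size=n)

# One CCR-style component: weight each predictor by its univariate
# regression slope on y, then average the single-predictor effects
# into one composite score per case.
Xc = X - X.mean(axis=0)
yc = y - y.mean()
weights = (Xc * yc[:, None]).sum(axis=0) / (Xc ** 2).sum(axis=0)
component = Xc @ weights / p        # composite predictor, one value per case

# Regress y on the single component to obtain the final model.
slope = (component @ yc) / (component @ component)
y_hat = y.mean() + slope * component

r2 = 1 - ((y - y_hat) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(f"R^2 of one-component model: {r2:.3f}")
```

Because the component pools information across all correlated predictors instead of estimating a separate coefficient for each, it remains stable even when the number of cases is small relative to the number of predictors.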
The net result is that the final models are selected for their ability to predict beyond the sample to new cases. The outputs produced by the procedure are similar to those of other regression programs, with a slope (the effect size) presented for each predictor.
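The idea of selecting models by out-of-sample performance can be sketched with ordinary M-fold cross-validation: every case is held out exactly once, and the model is scored on the cases it never saw during fitting. The code below applies this to the same one-component composite model; the fold count and data are illustrative assumptions, not CCR's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(1)

# Few cases, many correlated predictors, as in the text.
n, p = 30, 10
base = rng.normal(size=(n, 1))
X = base + 0.5 * rng.normal(size=(n, p))
y = X.mean(axis=1) + 0.1 * rng.normal(size=n)

def holdout_sse(train, test):
    """Fit a one-component composite model on `train`, score hold-out `test`."""
    Xtr, ytr = X[train], y[train]
    Xte, yte = X[test], y[test]
    Xm, ym = Xtr.mean(axis=0), ytr.mean()
    Xc, yc = Xtr - Xm, ytr - ym
    w = (Xc * yc[:, None]).sum(axis=0) / (Xc ** 2).sum(axis=0)  # univariate slopes
    comp_tr = Xc @ w
    slope = (comp_tr @ yc) / (comp_tr @ comp_tr)
    pred = ym + slope * ((Xte - Xm) @ w)     # predictions for unseen cases
    return ((yte - pred) ** 2).sum()

# 5-fold cross-validation: accumulate hold-out error over all folds.
folds = np.array_split(rng.permutation(n), 5)
sse = 0.0
for k, test_idx in enumerate(folds):
    train_idx = np.concatenate([f for j, f in enumerate(folds) if j != k])
    sse += holdout_sse(train_idx, test_idx)

cv_r2 = 1 - sse / ((y - y.mean()) ** 2).sum()
print(f"Cross-validated R^2: {cv_r2:.3f}")
```

In a full CCR run this hold-out score would be computed for each candidate number of predictors and components, and the combination with the best cross-validated performance would be chosen.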
Please contact us for further information.