Different Ways Of Variable Reduction

Once upon a time, a teacher in a village asked one of his students to narrate the importance of Subhas Chandra Bose in India's fight for freedom. The student, however, started with the names of Subhas Bose's parents, described his early life in detail, covered his marital status and much else, and only then came to his importance as a freedom fighter. As a result, he took 40 minutes to answer the question the teacher had actually asked. So the teacher advised him that, after a short introduction, he should focus on the main topic and not waste time on irrelevant details.

Something similar happens in the Data Science field, where we study the relationship between independent variables and a dependent variable. We should give less importance to the less important independent variables and consider only those independent variables that genuinely affect the regression analysis with the dependent variable, using the methods described below.

Basically, the variable reduction process can be done in two ways:

  1. Feature selection
  2. Feature extraction

In Feature selection, we discuss

  • backward elimination
  • forward selection (also called forward elimination)
  • bidirectional (stepwise) elimination

and in Feature extraction, we discuss

  • Correlation Analysis
  • PCA
  • Exploratory factor analysis
  • Multicollinearity
  • Linear discriminant analysis
  • Wald chi-square method.

Variable reduction is a crucial step for accelerating model building without losing the potential predictive power of the data. With the advent of Big Data and sophisticated data mining techniques, the number of variables encountered is often tremendous, making variable selection or dimension reduction imperative to produce models with acceptable accuracy and generalization.

It may be noted that the following techniques need not be used in the given order. Moreover, before taking up variable reduction, we should put more emphasis on univariate analysis of the variables: check the frequency distribution of each variable, examine its summary statistics, and, most importantly, check every variable for missing values.

Feature Selection

In BACKWARD ELIMINATION, we start the regression with all independent variables and repeatedly remove the least important (least significant) variable, re-fitting the model each time, until every remaining variable is significant; the less important variables are thus eliminated from the backward direction.
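
As a rough sketch of this procedure, here is a minimal backward-elimination loop using statsmodels (the pandas DataFrame `X`, Series `y`, and the 0.05 p-value cutoff are illustrative assumptions, not from the original article):

```python
import pandas as pd
import statsmodels.api as sm

def backward_elimination(X: pd.DataFrame, y: pd.Series, threshold: float = 0.05) -> list:
    """Drop the least significant predictor, one at a time, until all p-values < threshold."""
    features = list(X.columns)
    while features:
        model = sm.OLS(y, sm.add_constant(X[features])).fit()
        pvalues = model.pvalues.drop("const")   # one p-value per remaining predictor
        worst = pvalues.idxmax()                # least significant predictor
        if pvalues[worst] < threshold:          # everything left is significant: stop
            break
        features.remove(worst)                  # eliminate it and re-fit
    return features
```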

In FORWARD SELECTION, we work from the opposite direction: we start with no independent variables and add the most important (most significant) variable one at a time, stopping when no remaining variable improves the model significantly; the less important variables are thus never brought into the model.
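
scikit-learn's SequentialFeatureSelector implements this greedy forward search; a hedged sketch follows (the LinearRegression estimator, the synthetic data, and `n_features_to_select=10` are assumptions for illustration):

```python
from sklearn.datasets import make_regression
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LinearRegression

# Synthetic data standing in for the "100 independent variables" example
X, y = make_regression(n_samples=500, n_features=100, n_informative=10, random_state=0)

# Greedily add one feature at a time, keeping the 10 that help the CV score most
sfs = SequentialFeatureSelector(LinearRegression(), n_features_to_select=10,
                                direction="forward", cv=5)
sfs.fit(X, y)
print("Selected feature indices:", sfs.get_support(indices=True))
```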

In BIDIRECTIONAL (STEPWISE) ELIMINATION, the two ideas are combined: at each step a significant variable may be added, and any variable that has become insignificant may be removed, so the less important variables are eliminated from both directions.
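
A minimal stepwise sketch combining the forward and backward moves (the 0.05 enter and 0.10 remove p-value thresholds are common conventions, assumed here):

```python
import pandas as pd
import statsmodels.api as sm

def stepwise_selection(X: pd.DataFrame, y: pd.Series,
                       p_enter: float = 0.05, p_remove: float = 0.10) -> list:
    """Alternate forward adds and backward drops until neither changes the model."""
    selected = []
    changed = True
    while changed:
        changed = False
        # Forward step: add the most significant variable not yet in the model
        remaining = [c for c in X.columns if c not in selected]
        if remaining:
            pvals = pd.Series({
                c: sm.OLS(y, sm.add_constant(X[selected + [c]])).fit().pvalues[c]
                for c in remaining})
            if pvals.min() < p_enter:
                selected.append(pvals.idxmin())
                changed = True
        # Backward step: drop any variable that has become insignificant
        if selected:
            pvals = sm.OLS(y, sm.add_constant(X[selected])).fit().pvalues.drop("const")
            if pvals.max() > p_remove:
                selected.remove(pvals.idxmax())
                changed = True
    return selected
```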

Feature Extraction

In this case, we describe several methods of variable reduction, taking care of dimension reduction as well. Among the first is CORRELATION ANALYSIS. Correlation measures the linear relationship between two variables. Suppose we want to find the relationship between a hundred independent variables and one dependent variable.

For this, we create a correlation matrix. On the basis of the correlations, we retain as explanatory variables those independent variables that are highly correlated with the dependent variable. The sign of the correlation coefficient indicates the direction of association, and the coefficient always lies between -1 (perfect negative linear association) and +1 (perfect positive linear association). A value of r near zero indicates no linear relationship.
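
A small pandas sketch of this filter (the synthetic data and the 0.3 cutoff on |r| are illustrative assumptions):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(500, 100)),
                 columns=[f"x{i}" for i in range(100)])
y = 2 * X["x0"] - 3 * X["x1"] + rng.normal(size=500)  # y depends on x0, x1 only

# Correlation of every independent variable with the dependent variable
r_with_y = X.corrwith(y)
keep = r_with_y[r_with_y.abs() > 0.3].index.tolist()
print("Variables kept by the correlation filter:", keep)
```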

Now we discuss the correlation between two independent variables. A high correlation coefficient (r) between two independent variables implies redundancy, indicating that they may be measuring the same construct. In such a scenario, it is prudent either to keep only one of the two variables under consideration or to adopt an alternative approach, of which the two most widely used techniques are Principal Component Analysis (PCA) and Exploratory Factor Analysis.
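
A hedged sketch of this redundancy check between independent variables (the |r| > 0.8 cutoff is an assumed convention):

```python
import numpy as np
import pandas as pd

def drop_redundant(X: pd.DataFrame, cutoff: float = 0.8) -> pd.DataFrame:
    """For every pair of predictors with |r| > cutoff, keep only the first one."""
    corr = X.corr().abs()
    # Look only at the upper triangle so each pair is considered once
    upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
    to_drop = [col for col in upper.columns if (upper[col] > cutoff).any()]
    return X.drop(columns=to_drop)
```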

Now we discuss the next approach to variable reduction: PRINCIPAL COMPONENT ANALYSIS (PCA). Principal Component Analysis is a variable reduction procedure that derives a smaller number of new variables, called Principal Components, which account for most of the variance observed in a large group of redundant (correlated) variables.

Suppose that, among 100 explanatory variables, 44 are highly correlated; for instance, the correlation between the 3rd and 5th variables is 0.87, and that between the 3rd and 8th variables is 0.85. With correlations this strong, PCA is appropriate. Principal Component Analysis can be performed on such a set of correlated variables to obtain new composite variables (Principal Components) that carry the information of all the variables in question.

Each Principal Component is a linear combination of the optimally weighted variables under consideration and can be used for subsequent analysis. One can compute as many principal components as there are independent variables; the components are then examined and retained on the basis of the variability they explain.
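
A minimal scikit-learn PCA sketch (standardizing first and retaining components that explain 90% of the variance are assumed, common choices):

```python
from sklearn.datasets import make_regression
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X, _ = make_regression(n_samples=500, n_features=44, effective_rank=10, random_state=0)

# PCA is scale-sensitive, so standardize the correlated variables first
X_std = StandardScaler().fit_transform(X)

# Keep the smallest number of components explaining 90% of total variance
pca = PCA(n_components=0.90)
scores = pca.fit_transform(X_std)
print(f"Reduced {X.shape[1]} variables to {pca.n_components_} principal components")
print("Variance explained per component:", pca.explained_variance_ratio_.round(3))
```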

Now we discuss another important variable reduction approach. Exploratory Factor Analysis is also a variable reduction procedure, similar to Principal Component Analysis in many respects: the underlying computation is much the same for the two techniques, but they are conceptually dissimilar, as explained below.

Factor analysis is a statistical technique concerned with the reduction of a set of observable dependent variables in terms of a small number of latent factors. The underlying assumption of factor analysis is that there exists a number of unobserved latent variables (or “factors”) that account for the correlations among observed variables, such that if the latent variables are partialled out or held constant, the partial correlations among observed variables all become zero.

In other words, the latent factors determine the values of the observed variables. The term “common” in common factor analysis describes the variance that is analyzed. It is assumed that the variance of a single variable can be decomposed into common variance that is shared by other variables included in the model, and unique variance that is unique to a particular variable and includes the error component. Common factor analysis (CFA) analyzes only the common variance of the observed variables; principal component analysis considers the total variance and makes no distinction between common and unique variance. The selection of one technique over the other is based upon several criteria.
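
A brief sketch of common factor analysis with scikit-learn's FactorAnalysis (the synthetic data and the choice of 5 factors are illustrative assumptions; in practice the number of factors is chosen with diagnostics such as a scree plot):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

X, _ = make_regression(n_samples=500, n_features=20, effective_rank=5, random_state=0)
X_std = StandardScaler().fit_transform(X)

# Model observed variables as linear combinations of 5 latent common factors
fa = FactorAnalysis(n_components=5, random_state=0)
factor_scores = fa.fit_transform(X_std)       # one score per factor per observation
loadings = fa.components_.T                   # how each variable loads on each factor
print("Factor loadings shape:", loadings.shape)            # (20 variables, 5 factors)
print("Unique (noise) variance per variable:", np.round(fa.noise_variance_, 3))
```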

Next, we look at MULTICOLLINEARITY, which occurs when the independent variables are highly correlated among themselves; this redundancy inflates the variance of the estimated regression coefficients.
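
The text does not name a diagnostic here; one standard choice is the Variance Inflation Factor (VIF) from statsmodels, sketched below (the rule of thumb that VIF > 10 signals serious multicollinearity is a common convention, not from the original):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(0)
x1 = rng.normal(size=300)
X = pd.DataFrame({"x1": x1,
                  "x2": x1 + 0.05 * rng.normal(size=300),   # nearly a copy of x1
                  "x3": rng.normal(size=300)})

# Include the constant, then compute VIF for each predictor (skipping the constant)
X_const = sm.add_constant(X)
vif = pd.Series(
    [variance_inflation_factor(X_const.values, i) for i in range(1, X_const.shape[1])],
    index=X.columns)
print(vif.round(1))   # x1 and x2 get large VIFs; x3 stays near 1
```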

Now we discuss another popular method of variable reduction, the Wald Chi-Square. The Wald Chi-Square test statistic is the squared ratio of the Estimate to the Standard Error of the respective predictor; predictors with small Wald statistics (large p-values) contribute little to the model and are candidates for removal.
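
A short statsmodels sketch computing the Wald Chi-Square for each predictor of a logistic regression (the synthetic data and the choice of logistic regression are assumptions for illustration):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(500, 4)), columns=["x1", "x2", "x3", "x4"])
p = 1 / (1 + np.exp(-(1.5 * X["x1"] - 2.0 * X["x2"])))  # only x1, x2 matter
y = rng.binomial(1, p)

result = sm.Logit(y, sm.add_constant(X)).fit(disp=0)

# Wald chi-square = (estimate / standard error)^2 for each predictor
wald_chi2 = (result.params / result.bse) ** 2
print(wald_chi2.drop("const").sort_values(ascending=False).round(2))
```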

Now we discuss another method of variable reduction, Linear Discriminant Analysis (LDA), which also works as a dimensionality reduction algorithm: it reduces the number of dimensions from the original count to at most C - 1, where C is the number of classes. In this example, we have 3 classes and 18 features, so LDA reduces the 18 features to only 2. After the reduction, a neural network model can be applied to the classification task.
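
A minimal scikit-learn sketch matching the 3-class, 18-feature example (make_classification stands in for the unnamed dataset, and MLPClassifier is one possible stand-in for the neural network mentioned):

```python
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neural_network import MLPClassifier

# 3 classes and 18 features, as in the example in the text
X, y = make_classification(n_samples=600, n_features=18, n_informative=6,
                           n_classes=3, random_state=0)

# LDA can keep at most C - 1 = 2 discriminant directions for C = 3 classes
lda = LinearDiscriminantAnalysis(n_components=2)
X_2d = lda.fit_transform(X, y)
print("Shape after LDA:", X_2d.shape)   # (600, 2)

# A small neural network trained on the 2 LDA features
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
clf.fit(X_2d, y)
print("Training accuracy:", round(clf.score(X_2d, y), 3))
```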

Executable Code:

Now we will work with the model:

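As a minimal illustration, the following sketch ties several of the steps above together on synthetic data (every dataset detail, threshold, and the final OLS model are illustrative assumptions):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(500, 30)), columns=[f"x{i}" for i in range(30)])
X["x1"] = X["x0"] + 0.05 * rng.normal(size=500)        # inject redundancy
y = 3 * X["x0"] - 2 * X["x2"] + rng.normal(size=500)

# Step 1: correlation filter against the dependent variable (|r| > 0.2 assumed)
keep = X.corrwith(y).abs().pipe(lambda r: r[r > 0.2]).index.tolist()

# Step 2: PCA on the surviving variables to remove remaining redundancy
scores = PCA(n_components=0.95).fit_transform(StandardScaler().fit_transform(X[keep]))

# Step 3: fit a plain OLS model on the reduced feature set
model = sm.OLS(y, sm.add_constant(scores)).fit()
print(f"{X.shape[1]} variables -> {len(keep)} filtered -> {scores.shape[1]} components")
print("Model R-squared:", round(model.rsquared, 3))
```
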
From the entire discussion and examples above, we can truly appreciate the importance of variable reduction and implement it properly on real-life problems. Situations arise in which, to predict a single variable, we take 80 to 90 independent variables; some of them mean the same thing, some are irrelevant for predicting the dependent variable, and some have no relationship with it at all. As a result, the prediction cannot be accurate and the expense is large. For all these reasons, reduction of variables is very much necessary, and the different methods of reducing variables should be adopted carefully and stepwise.
