Why do this? One reason is to make data more “normal”, or symmetric. If we’re performing a statistical analysis that assumes normality, a log transformation might help us meet this assumption. Another reason is to help meet the assumption of constant variance in the context of linear modeling. Yet another is to help make a non-linear relationship more linear. But while it’s easy to implement a log transformation, it can complicate interpretation. Let’s say we fit a linear model with a log-transformed dependent variable. How do we interpret the coefficients? What if we have log-transformed dependent and independent variables? That’s the topic of this article. First we’ll provide a recipe for interpretation for those who just want some quick help. Then we’ll dig a little deeper into what we’re saying about our model when we log-transform our data.

OK, you ran a regression/fit a linear model and some of your variables are log-transformed.

Only the dependent/response variable is log-transformed. Exponentiate the coefficient, subtract one from this number, and multiply by 100. This gives the percent increase (or decrease) in the response for every one-unit increase in the independent variable. Example: the coefficient is 0.198. (exp(0.198) - 1) * 100 = 21.9. For every one-unit increase in the independent variable, our dependent variable increases by about 22%.
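As a concrete illustration of this rule, here is a minimal Python sketch using numpy and statsmodels. The data are simulated, and the variable names and the true slope of 0.198 are invented for the example:

```python
import numpy as np
import statsmodels.api as sm

# Simulate data where log(y) is linear in x with slope 0.198 (assumed value)
rng = np.random.default_rng(42)
x = rng.uniform(0, 10, 500)
log_y = 1.0 + 0.198 * x + rng.normal(0, 0.5, 500)
y = np.exp(log_y)

# Fit a linear model with a log-transformed dependent variable
X = sm.add_constant(x)
fit = sm.OLS(np.log(y), X).fit()
b = fit.params[1]  # slope on x

# Recipe: exponentiate the coefficient, subtract one, multiply by 100
pct_change = (np.exp(b) - 1) * 100
print(f"slope = {b:.3f}")
print(f"Each one-unit increase in x changes y by about {pct_change:.1f}%")
```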
Only independent/predictor variable(s) is log-transformed. Divide the coefficient by 100. This tells us that a 1% increase in the independent variable increases (or decreases) the dependent variable by (coefficient/100) units. Example: the coefficient is 0.198. 0.198/100 = 0.00198. For every 1% increase in the independent variable, our dependent variable increases by about 0.002. For x percent increase, multiply the coefficient by log(1.x). Example: For every 10% increase in the independent variable, our dependent variable increases by about 0.198 * log(1.10) = 0.02.
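And a matching sketch for the log-transformed predictor case, again with simulated data and an assumed slope of 0.198:

```python
import numpy as np
import statsmodels.api as sm

# Simulate data where y is linear in log(x) with slope 0.198 (assumed value)
rng = np.random.default_rng(7)
x = rng.uniform(1, 100, 500)
y = 5.0 + 0.198 * np.log(x) + rng.normal(0, 0.1, 500)

# Fit a linear model with a log-transformed independent variable
X = sm.add_constant(np.log(x))
fit = sm.OLS(y, X).fit()
b = fit.params[1]  # slope on log(x)

# Recipe: a 1% increase in x changes y by about coefficient/100 units
print(f"1% increase in x -> y changes by about {b / 100:.5f} units")

# For an x percent increase, multiply the coefficient by log(1.x),
# e.g. a 10% increase:
print(f"10% increase in x -> y changes by about {b * np.log(1.10):.3f} units")
```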