
Localized Regression


hrothgar


And I thought from the title you were talking about another phenomenon where creatures evolve something, and then unevolve it because of local conditions.

 

e.g. the Santa Catalina Island rattlesnake, which has lost its rattle because there are no large mammals to warn off.

 

Interesting article.


MATLAB has a localized regression implementation that works with two independent variables.

 

In theory, you should be able to extend the technique to cover N independent variables.

In practice, LOWESS requires a k-nearest-neighbour (KNN) search, and KNN search slows down considerably as the number of dimensions grows.
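A rough sketch of how that extension looks, assuming nothing more than a KNN lookup plus tricube-weighted least squares on the neighbourhood (the function and parameter names here are just illustrative, not MATLAB's implementation):

```python
import numpy as np
from scipy.spatial import cKDTree

def loess_predict(X, y, x0, k=30):
    """Locally weighted linear prediction at x0 using its k nearest neighbours."""
    dist, idx = cKDTree(X).query(x0, k=k)    # the KNN step that gets slow in high dimensions
    Xk, yk = X[idx], y[idx]
    w = (1 - (dist / dist.max()) ** 3) ** 3  # tricube weights on distance
    A = np.hstack([np.ones((k, 1)), Xk])     # local design matrix with intercept
    W = np.diag(w)
    beta = np.linalg.solve(A.T @ W @ A, A.T @ W @ yk)  # weighted least squares
    return beta[0] + beta[1:] @ x0

# Toy example with two covariates, as in the MATLAB case
rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 2))
y = np.sin(4 * X[:, 0]) + X[:, 1] ** 2 + rng.normal(scale=0.1, size=500)
print(loess_predict(X, y, np.array([0.5, 0.5])))
```

The same code runs unchanged for N covariates; it is the KNN query and the conditioning of the local design matrix that deteriorate as N grows.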

 

Most people seem to switch over to boosted or bagged decision trees once N gets large.


Richard - As you know, econometrics is a huge field in economics. Does the localized regression technique fit into any of the known econometrics techniques that have been discussed in academic papers? I was just surprised to see the blog without any reference to academic literature on the subject. If it is truly a novel technique, I am surprised that no one has attempted to publish it.

Hi Matt

 

I wouldn't characterize anything in the blog post as novel. I was simply trying to provide a useful illustration of localized regression and the bootstrap.
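
For anyone who hasn't read the post, here is a minimal sketch of that kind of illustration, pairing statsmodels' one-dimensional lowess with a simple bootstrap to get pointwise uncertainty bands; the data and parameters are made up, and the original post's details may differ:

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 200)
y = np.sin(x) + rng.normal(scale=0.3, size=x.size)
grid = np.linspace(0, 10, 100)

# Bootstrap: refit the lowess curve on resampled data and collect the curves
curves = []
for _ in range(200):
    i = rng.integers(0, x.size, x.size)              # resample rows with replacement
    fit = lowess(y[i], x[i], frac=0.3)               # returns sorted (x, fitted) pairs
    curves.append(np.interp(grid, fit[:, 0], fit[:, 1]))

lower, upper = np.percentile(curves, [2.5, 97.5], axis=0)  # pointwise 95% band
```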

 

It's pretty easy to find references on the application of this technique within economics for low-dimensional problems.

(Once you hit higher numbers of dimensions, boosted and bagged regression trees are preferred.)

 

There's also a lot of debate about the relative merits of localized regression versus smoothing splines in low-dimensional spaces.

 

For example, the following piece preferred localized regression

http://www.cs.pitt.edu/~chang/265/sispaper/sisref/3.pdf

 

while

http://www.polisci.ohio-state.edu/faculty/lkeele/TOC.pdf

 

argued that smoothing splines are preferable.


No, it's not a new idea; it has been around for at least a couple of decades.

 

Lo(w)ess is linear (or quadratic) regression where a separate model is fitted at each value of the independent variable: the points used for the regression are weighted according to their distance from the point of interest, using a dynamic-bandwidth kernel. It is some weird kernel the motivation of which I don't quite understand, but I suppose the inventors had their reasons :)
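
For the record, the kernel in Cleveland's original lowess is the tricube, with the bandwidth set dynamically to the distance of the k-th nearest neighbour (k implied by the span). A tiny one-dimensional sketch of the weighting and the local fit; the function names are mine:

```python
import numpy as np

def tricube_weights(x, x0, frac=0.3):
    """Dynamic-bandwidth tricube weights: bandwidth = distance to the k-th nearest point."""
    d = np.abs(x - x0)
    k = int(np.ceil(frac * x.size))          # neighbourhood size implied by the span
    h = np.sort(d)[k - 1]                    # local bandwidth
    u = np.clip(d / h, 0.0, 1.0)
    return (1 - u ** 3) ** 3                 # zero weight outside the neighbourhood

def local_linear_fit(x, y, x0, frac=0.3):
    """Weighted least-squares line through the neighbourhood, evaluated at x0."""
    w = tricube_weights(x, x0, frac)
    slope, intercept = np.polyfit(x, y, deg=1, w=np.sqrt(w))  # polyfit weights multiply residuals
    return intercept + slope * x0
```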


By the way, with many covariates my first try would be some kind of penalized regression. What about penalized lowess? I think I will implement that and see if it works.
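
One way to prototype "penalized lowess" would be to swap the local least-squares step for a ridge fit while keeping the kernel weights as sample weights. A hedged sketch, assuming scikit-learn's Ridge and an arbitrary Gaussian kernel (neither choice comes from the thread):

```python
import numpy as np
from sklearn.linear_model import Ridge

def local_ridge_predict(X, y, x0, bandwidth=1.0, alpha=1.0):
    """Locally weighted ridge regression: kernel weights plus an L2 penalty on the local fit."""
    d2 = np.sum((X - x0) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))  # Gaussian distance weights
    model = Ridge(alpha=alpha)                # alpha controls how hard the local coefficients shrink
    model.fit(X - x0, y, sample_weight=w)     # centre at x0 so the intercept is the prediction there
    return model.intercept_
```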

 

Last week we had a guest speaker from New Zealand who had developed kernel smoothing specifically for estimation of local odds, i.e. you have two processes X and Y which you could each fit by lowess or the like. Rather than fitting the two independently and then computing odds, he fitted the local log-odds directly. I found that interesting, although it is of course a very specific thing.
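
I don't know the speaker's construction, but a generic way to estimate local log-odds directly is kernel-weighted ("local likelihood") logistic regression: pool the observations from the two processes, label which process each came from, and fit a logistic regression at each point with distance-based sample weights. A sketch of that generic idea only, not his method:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def local_log_odds(x, label, x0, bandwidth=1.0):
    """Estimate the log-odds of label == 1 at x0 via a kernel-weighted logistic fit."""
    w = np.exp(-((x - x0) ** 2) / (2.0 * bandwidth ** 2))  # Gaussian distance weights
    model = LogisticRegression(C=1e6)                      # large C: effectively unpenalized
    model.fit((x - x0).reshape(-1, 1), label, sample_weight=w)
    return model.intercept_[0]                             # centred, so the intercept is the log-odds at x0
```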


I don't know whether this is relevant but possibly it is of interest. Someone referred me to a letter by Brad Efron in the NYTimes. Apparently there was a research article on ESP that has provoked discussion about the use of statistics in the social sciences.

 

The link to Efron:

http://www.nytimes.com/2011/01/18/science/18letters-PROCRUSTEANC_LETTERS.html?_r=1&ref=science

 

His letter links to an article in the Times:

http://www.nytimes.com/2011/01/11/science/11esp.html

 

In the "small world" department, I wrote a term paper on esp when I was in high school (when Rhine at Duke was a big cheese), and I sort of knew Brad Efron then. He went to St. Paul Central, I went to St. Paul Monroe. My friend Joe Auslander pointed me to Efron's letter because he knew of this (weak) high school link. Richard's blog mentions resampling, and that relates back to Efron.

 

I haven't the time right now, but I am interested in Richard's blog and intend to get to it.

