A gentle journey into LOWESS and B-splines, with reasons why

Stock image from https://pixabay.com/photos/wires-computer-wire-power-wire-3702104/

We’ve all heard that information is beautiful. But then life happens, and we find ourselves trying to prettify the mess that often is reality. That’s par for the course in data wrangling, but it can also be true of visualization, depending on what role the visualization is playing. Is it just exploratory spade work for a predictive model? Or an after-the-fact way of presenting results? Or is it sometimes something more, depending on what we expect the data to show?

If we’re considering something like Vitamin D serum levels fluctuating over months and years (below), we…
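To make the idea concrete, here is a minimal numpy-only sketch of the LOWESS idea: at each point, fit a straight line to the nearest neighbors, weighted by the tricube kernel. The function name `lowess_smooth` and the `frac` parameter are my own illustrative choices, not from the article.

```python
import numpy as np

def lowess_smooth(x, y, frac=0.5):
    """Minimal LOWESS sketch: at each x[i], fit a line to the nearest
    frac * n neighbors, weighted by the tricube kernel."""
    n = len(x)
    k = int(np.ceil(frac * n))
    y_hat = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        idx = np.argsort(d)[:k]                       # k nearest neighbors
        w = (1 - (d[idx] / d[idx].max()) ** 3) ** 3   # tricube weights
        A = np.c_[np.ones(k), x[idx]]                 # design matrix [1, x]
        # weighted least squares: solve (A' W A) beta = A' W y
        beta = np.linalg.solve(A.T @ (w[:, None] * A), A.T @ (w * y[idx]))
        y_hat[i] = beta[0] + beta[1] * x[i]
    return y_hat
```

For something like monthly Vitamin D readings, `x` would be time and `y` the serum level; `frac` controls how much of the data each local fit sees, and hence how smooth the curve comes out.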

So far, I'm really liking this article on gradient descent. Bravo for giving both a clear overall picture _and_ getting into the math equations! In particular, I love your analogy about being caught in a dark fog with only the ability to physically feel the direction of steepest descent (not being able to see the bottom of the valley).

Nevertheless, the first code snippet (the one for Batch Gradient Descent) doesn't run as-is, because X_b isn't defined. Did you intend to include something like

X_b = np.c_[np.ones(m), X]

right before the for-loop? (As you know, this upgrades the 1-D array to a 2-D array by prepending a column of 1s.)

With this line included, the code runs and seems to produce appropriate results.
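In case it helps other readers, here is a sketch of how the Batch Gradient Descent snippet might look with the missing line in place. The toy data and the names `eta`, `n_iterations`, and `theta` are my assumptions about the article's setup, not a copy of it.

```python
import numpy as np

np.random.seed(0)
m = 100
X = 2 * np.random.rand(m, 1)              # one feature
y = 4 + 3 * X + np.random.randn(m, 1)     # true intercept 4, slope 3, plus noise

X_b = np.c_[np.ones(m), X]                # the missing line: prepend a column of 1s

eta = 0.1            # learning rate
n_iterations = 1000
theta = np.random.randn(2, 1)             # random initialization

for _ in range(n_iterations):
    # gradient of the MSE cost over the full batch
    gradients = 2 / m * X_b.T @ (X_b @ theta - y)
    theta -= eta * gradients
```

After the loop, `theta` should land close to the true parameters (about 4 and 3), which is what "appropriate results" looks like here.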

The second in a series explaining LOWESS and B-spline smoothing

Part 2 is now consolidated into a single 6-part blog post published in Towards Data Science


Joshua Banks Mailman, Ph.D.
