
I have 30 data sets. Each data set corresponds to a different value of eta. So data set 1, consisting of x values and y values, is for eta=0.2; data set 2, also consisting of x values and y values, is for eta=0.25. Does MATLAB offer a way to fit all of these simultaneously, where the solution is a polynomial in both eta and x with some fitted coefficients, so that each curve is fit properly?

Thanks for your input.

Matt J on 18 Mar 2019 (edited 18 Mar 2019)

You can use lsqcurvefit to do so, simply by posing the problem as a 30-parameter fit. If you do this, you will want to provide a sparse Jacobian calculation of your own.

However, since your eta values change gradually, it might be more efficient to just run a loop over the data sets, each time initializing the fit with the final parameter values from the previous fit.
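A minimal sketch of that warm-started loop, assuming a hypothetical model function `model(p,x)` and cell arrays `xdata`/`ydata` holding the 30 data sets (none of these names come from the original problem):

```matlab
% Warm-started sequential fits: the solution for one data set
% seeds the initial guess for the next, since the curves change gradually.
p = p0;                                % initial guess for the first data set
P = zeros(numel(xdata), numel(p0));    % one row of fitted parameters per data set
for k = 1:numel(xdata)
    p = lsqcurvefit(@model, p, xdata{k}, ydata{k});
    P(k,:) = p;
end
```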

Matt J on 19 Mar 2019

Can the surface plot be modified to show the surface only up to where the data stops?

There is also this example on the FEX explaining how to generate surface plots from a list of [x,y,z] coordinate triplets. The surface should cover only the x,y-region covered by your triplets.

John D'Errico on 18 Mar 2019

I'll suggest that while you CAN do at least something, it is perhaps not trivial. And, sadly, the result would likely be total, complete crapola. That will be more true IF these polynomials are of high order, or the data has a significant amount of noise in it.

Is there some capability built into MATLAB to do it? Of course not. Software providers tend not to write code that nobody has ever requested, and for good reason. But can you write it? Well, it is not difficult, at least in theory.

You can think of each polynomial fit as a list of polynomial coefficients. There is no real gain from trying to do all the fits at one time; polyfit would suffice, used in a loop. So create the entire set of coefficients, for ALL values of eta. Now just use interp1 to interpolate the coefficients for any given eta. The result will be another list of coefficients, at your chosen eta.

For example, let's try something simple.

eta = linspace(0,pi/2,10);
x = linspace(-0.25,0.25,21)';
yfun = @(x,eta) sin(x + eta);
y = yfun(x,eta);
surf(eta,x,y)
xlabel eta
ylabel x
zlabel y

So the surface is rather well-behaved. I would predict that for any value of eta, a low order polynomial will suffice.

Effectively, I have just created a 2 variable function. Each column of y corresponds to a different value for eta, so effectively, we have different functions.

In theory, I suppose a good way to solve this would be to create a two-variable Taylor series. But let's assume we don't have that function. Over a reasonably short range, a low order polynomial might suffice. Again, if you need to use a high order polynomial for this, expect complete crapola, since high order polynomial fits tend to hugely amplify any noise in the data. That amplification will be reflected as almost random (and huge) noise in the coefficients. However, with no noise at all, we can do quite well here.

For example, consider just one of those polynomial models. I'll use the curvefitting toolbox here for the fit, since it reports statistics around the estimated coefficients.

P = fit(x,y(:,7),'poly5')

P =
     Linear model Poly5:
     P(x) = p1*x^5 + p2*x^4 + p3*x^3 + p4*x^2 + p5*x + p6
     Coefficients (with 95% confidence bounds):
       p1 =    0.004156  (0.004073, 0.004239)
       p2 =     0.03597  (0.03596, 0.03598)
       p3 =    -0.08333  (-0.08334, -0.08333)
       p4 =      -0.433  (-0.433, -0.433)
       p5 =         0.5  (0.5, 0.5)
       p6 =       0.866  (0.866, 0.866)

Note that the bounds on the coefficients are actually pretty tight. Again, having NO noise in the data was crucial. As well, this is a pretty simple function, very well behaved.

So now, let's just form a big loop. I'll use polyfit.

pdegree = 5;
coeff = zeros(numel(eta),pdegree + 1);
for i = 1:numel(eta)
    coeff(i,:) = polyfit(x,y(:,i),pdegree);
end
coeff

coeff =
      0.008312   8.7969e-15     -0.16667  -6.5333e-16            1  -7.1363e-19
     0.0081857    0.0072132     -0.16413    -0.086824      0.98481      0.17365
     0.0078107     0.014207     -0.15661     -0.17101      0.93969      0.34202
     0.0071984      0.02077     -0.14434        -0.25      0.86603          0.5
     0.0063674     0.026701     -0.12767     -0.32139      0.76604      0.64279
     0.0053428     0.031821     -0.10713     -0.38302      0.64279      0.76604
      0.004156     0.035974    -0.083333     -0.43301          0.5      0.86603
     0.0028429     0.039034    -0.057003     -0.46984      0.34202      0.93969
     0.0014434     0.040908    -0.028941      -0.4924      0.17365      0.98481
   -9.8932e-13     0.041539   8.3207e-14         -0.5  -1.2291e-15            1

Those coefficients are actually pretty well behaved, but again, I was careful to have VERY good data. We can now use interp1 very simply, to predict the value of the function for any given eta.

etahat = 0.437;
coeffhat = interp1(eta,coeff,etahat,'pchip')

coeffhat =
     0.0075322     0.017581     -0.15103     -0.21162      0.90618      0.42324

yhat = polyval(coeffhat,x);
plot(x,yfun(x,etahat),'ob')
hold on
plot(x,yhat,'-r*')

As you can see, the predictive power of the model is great. But that was without any noise at all in my data, and it was a reasonably low order polynomial model. One other important fact: I used a vector x that was already centered and scaled, which is the best way to fit such a polynomial.
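As an aside, if your own x vector is not already centered and scaled, polyfit can handle that for you: its three-output form returns the centering and scaling it used, and polyval accepts them back. A minimal sketch, reusing the variables from above:

```matlab
% Fit in the centered-and-scaled variable xhat = (x - mu(1))/mu(2),
% which keeps the polynomial coefficients well conditioned.
[p, S, mu] = polyfit(x, y(:,7), 5);
yhat7 = polyval(p, x, S, mu);    % evaluate with the same centering/scaling
```

Note that coefficients fitted this way live in the scaled variable, so they are not directly comparable to coefficients from a plain polyfit call.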

yn = y + randn(size(y))/100;
surf(eta,x,yn)

So here we have some noise in the data.

pdegree = 5;
coeffn = zeros(numel(eta),pdegree + 1);
for i = 1:numel(eta)
    coeffn(i,:) = polyfit(x,yn(:,i),pdegree);
end

So a tiny amount of noise in y. What did it do to the coefficient of x^5 in those models?

plot(eta,coeff(:,1),'b-o',eta,coeffn(:,1),'r--')
legend('No noise in the data','Just a little noise in the data')

So a relatively tiny amount of noise caused those polynomial coefficients to go relatively insane.

etahat = 0.437;
coeffnhat = interp1(eta,coeffn,etahat,'pchip')

coeffnhat =
        30.655         11.3      -2.3386     -0.94558      0.93562      0.43009

coeffhat

coeffhat =
     0.0075322     0.017581     -0.15103     -0.21162      0.90618      0.42324

ynhat = polyval(coeffnhat,x);
plot(x,yfun(x,etahat),'ob')
hold on
plot(x,ynhat,'-r*')

Luckily, this model is so close to linear over a short range in x that the higher order interpolated polynomial coefficients were not so large and nasty as to cause a big problem.

Again, things could easily have been worse. But as I said, this was almost a perfect case, where things will not be that bad, because a low order polynomial fit does quite well, and those high order coefficients were not too costly.

So you CAN use an approach as I showed above. Will it work? Let me just say it is fraught with danger, and if you don't know what you are doing and what to watch out for, expect problems down the line, ESPECIALLY if you have any significant noise or you try to use high order polynomial models.
