algorithm - Finding "line of best fit" using linear programming


This is a homework question. I'm asked to find the coefficients of the line of best fit for a given set of n points (2D). The coefficients are a, b, c in: ax + by = c. Given the n points, use linear programming to find the coefficients that lead to the smallest "maximum absolute error", defined as: max(|a*xi + b*yi - c|), where i ranges from 1 to n.

Here's my thought process:

Let m denote the maximum absolute error. The objective of the linear program is to minimize m. Since m is the biggest of the |a*xi + b*yi - c|, it must be at least as big as every one of them, so for each i: (a*xi + b*yi - c) <= m, and (a*xi + b*yi - c) >= -m (the second inequality accounts for the absolute value).

I thought this was sufficient to define the problem, but when I put these conditions into a solver, it returned a, b and c all equal to 0, which in reality they shouldn't be. I think I'm missing some conditions here. Can anyone point them out to me?
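To reproduce the problem, here is a minimal sketch of the naive formulation using `scipy.optimize.linprog` (the library choice and the sample points are my own assumptions, not from the question). With non-collinear points, the only way to reach m = 0 is a = b = c = 0, so the solver happily returns the degenerate all-zero solution:

```python
import numpy as np
from scipy.optimize import linprog

# Naive minimax LP: minimize m subject to
#   a*x_i + b*y_i - c <= m   and   -(a*x_i + b*y_i - c) <= m
# Decision variables, in order: [a, b, c, m].

# made-up sample points for illustration
pts = np.array([(0.0, 1.1), (1.0, 1.9), (2.0, 3.05), (3.0, 4.0)])
x, y = pts[:, 0], pts[:, 1]
n = len(pts)

cobj = np.array([0.0, 0.0, 0.0, 1.0])  # objective: minimize m

# a*x_i + b*y_i - c - m <= 0
A1 = np.column_stack([x, y, -np.ones(n), -np.ones(n)])
# -a*x_i - b*y_i + c - m <= 0
A2 = np.column_stack([-x, -y, np.ones(n), -np.ones(n)])
A_ub = np.vstack([A1, A2])
b_ub = np.zeros(2 * n)

bounds = [(None, None)] * 3 + [(0, None)]  # a, b, c free; m >= 0
res = linprog(cobj, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.x)  # all (near) zero: the degenerate solution
```

Nothing in the constraints rules out the line "0x + 0y = 0", and it achieves the best possible objective m = 0, which is exactly the failure described above.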

You should add one more constraint: either a or b should not be 0. If both values are 0, you have a valid solution to the system, but there is no line with both a and b equal to 0.

Edit: improving rerito's suggestion. A line has either a or b not equal to 0. Also, the lines (k*a)*x + (k*b)*y = (k*c) and (a)*x + (b)*y = (c) are the same for any non-zero k, so one coefficient can be normalized to 1. You need to run the solver twice: once specifying a = 1 and once specifying b = 1, and afterwards select the better solution. You have to run the solver twice because it might be the case that the best solution has a = 0 or b = 0 (but not both).
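The two-solve approach above can be sketched as follows, again with `scipy.optimize.linprog` and made-up sample points (the helper name `fit_fixed` is mine, not from the answer). Pinning one coefficient to 1 leaves only two free coefficients plus m, and the better of the two runs is the minimax fit:

```python
import numpy as np
from scipy.optimize import linprog

def fit_fixed(x, y, fix):
    """Solve min m s.t. |a*x_i + b*y_i - c| <= m with one coefficient pinned.
    fix='a' pins a=1 (free: b, c); fix='b' pins b=1 (free: a, c)."""
    n = len(x)
    u = x if fix == "a" else y  # column multiplied by the pinned coefficient
    v = y if fix == "a" else x  # column multiplied by the free coefficient
    # decision variables, in order: [w, c, m] where w is the free coefficient
    cobj = [0.0, 0.0, 1.0]
    A_ub = np.vstack([
        np.column_stack([v, -np.ones(n), -np.ones(n)]),  #  u_i + w*v_i - c <= m
        np.column_stack([-v, np.ones(n), -np.ones(n)]),  # -u_i - w*v_i + c <= m
    ])
    b_ub = np.concatenate([-u, u])
    res = linprog(cobj, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None), (None, None), (0, None)])
    w, c, m = res.x
    coeffs = (1.0, w, c) if fix == "a" else (w, 1.0, c)
    return coeffs, m

# made-up sample points for illustration
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 1.9, 3.05, 4.0])

coef_a, m_a = fit_fixed(x, y, "a")  # run with a = 1
coef_b, m_b = fit_fixed(x, y, "b")  # run with b = 1
(a, b, c), m = min([(coef_a, m_a), (coef_b, m_b)], key=lambda t: t[1])
print(f"a={a:.4f}, b={b:.4f}, c={c:.4f}, max abs error={m:.4f}")
```

For these points the b = 1 run wins, giving a line close to y = 0.975x + 1.0125 with a maximum absolute error of about 0.0875; the a = 1 run gives a slightly worse error because the residuals are measured along x instead of y.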
