machine learning - Explanation for Coordinate Descent and Subgradient


What is an easy explanation of coordinate descent and the subgradient solution in the context of the lasso?

An intuitive explanation followed by a proof would be helpful.

Suppose you have a multivariate function f(w) with k variables/parameters w = (w_1, w_2, w_3, ..., w_k). The parameters are like knobs, and the goal is to turn these knobs in a way that minimizes the function f. Coordinate descent is a greedy method in the sense that on each iteration it changes the values of the parameters w_i one at a time so as to minimize f. It is easy to implement and, like gradient descent, it is guaranteed not to increase f on any iteration, so it makes its way toward a local minimum.
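In symbols, one full sweep of coordinate descent solves a sequence of one-dimensional problems, holding all other coordinates fixed (written here in the notation of the paragraph above):

    w_i <- argmin over w_i of f(w_1, ..., w_{i-1}, w_i, w_{i+1}, ..., w_k),   for i = 1, 2, ..., k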

[Image: coordinate descent minimizing a function of two parameters, x and y]

(Picture borrowed from the internet via Bing image search.)

As shown in the picture above, the function f has two parameters, x and y. On each iteration, one of the parameters is changed by a fixed value c and the function is evaluated at the new point. If the value is higher, and the goal is to minimize the function, the change is reversed for the selected parameter. The same procedure is then applied to the second parameter. Together these updates make up one iteration of the algorithm; a runnable sketch follows below.
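A minimal sketch of the fixed-step procedure just described, assuming Python with NumPy (the function name, step size c, and iteration count are illustrative choices, not from the original post):

```python
import numpy as np

def coordinate_descent_fixed_step(f, w0, c=0.1, n_iters=100):
    # Fixed-step coordinate descent: perturb one coordinate at a time
    # by +c or -c and keep the change only if it lowers f.
    w = np.asarray(w0, dtype=float).copy()
    for _ in range(n_iters):
        for i in range(len(w)):
            best = f(w)
            for step in (+c, -c):
                w[i] += step
                if f(w) < best:
                    best = f(w)   # improvement: keep the change
                else:
                    w[i] -= step  # no improvement: reverse the change
    return w

# Toy example: minimize f(x, y) = (x - 3)^2 + (y + 1)^2
f = lambda w: (w[0] - 3.0) ** 2 + (w[1] + 1.0) ** 2
print(coordinate_descent_fixed_step(f, [0.0, 0.0], c=0.1, n_iters=50))
# moves towards [3.0, -1.0] in steps of size c
```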

An advantage of coordinate descent shows up in problems where computing the gradient of the function is expensive.
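To connect this back to the lasso asked about in the question: the l1 penalty lam * ||w||_1 is not differentiable at zero, so the one-dimensional minimization in each coordinate is done by setting the subgradient of the objective to zero. This yields a closed-form update known as soft-thresholding. A hedged sketch, assuming the objective (1/(2n)) * ||y - Xw||^2 + lam * ||w||_1 (the variable names and the 1/n normalization are my assumptions, not from the original post):

```python
import numpy as np

def soft_threshold(rho, lam):
    # Solves min over w of 0.5*(w - rho)^2 + lam*|w| by setting the
    # subgradient to zero: shrink rho towards zero by lam, or return 0.
    if rho > lam:
        return rho - lam
    if rho < -lam:
        return rho + lam
    return 0.0  # |rho| <= lam: w = 0 satisfies the subgradient condition

def lasso_coordinate_descent(X, y, lam, n_iters=100):
    # Minimizes (1/(2n)) * ||y - X w||^2 + lam * ||w||_1 by cycling
    # through the coordinates and applying the closed-form update.
    n, k = X.shape
    w = np.zeros(k)
    for _ in range(n_iters):
        for i in range(k):
            # residual with coordinate i's contribution removed
            r_i = y - X @ w + X[:, i] * w[i]
            rho = X[:, i] @ r_i / n    # correlation of feature i with r_i
            z = X[:, i] @ X[:, i] / n  # per-coordinate curvature
            w[i] = soft_threshold(rho, lam) / z
    return w
```

When the columns of X are standardized so that X[:, i] @ X[:, i] / n equals 1, the division by z disappears and the update reduces to w[i] = soft_threshold(rho, lam), which is the form most lasso derivations quote.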
