assignments:assignment3 [CS545 fall 2016]
Express the closest centroid algorithm in terms of kernels, i.e. determine how the coefficients $\alpha_i$ will be computed using a given labeled dataset.
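As a starting point for this exercise, here is a minimal sketch of the plain (un-kernelized) closest-centroid classifier; the toy data and function names are illustrative assumptions, not part of the assignment. The exercise asks you to rewrite the squared-distance comparison so it uses only kernel evaluations $k(x_i, x)$.

```python
# A sketch of the plain closest-centroid rule: classify x by the nearer
# class centroid. The exercise asks you to re-express this with kernels.

def centroid(points):
    """Component-wise mean of a list of feature vectors."""
    n = len(points)
    return [sum(p[d] for p in points) / n for d in range(len(points[0]))]

def sq_dist(a, b):
    """Squared Euclidean distance between two vectors."""
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b))

def closest_centroid_predict(x, pos_points, neg_points):
    """Return +1 if x is closer to the positive centroid, else -1."""
    c_pos, c_neg = centroid(pos_points), centroid(neg_points)
    return 1 if sq_dist(x, c_pos) < sq_dist(x, c_neg) else -1

# Toy usage: two well-separated 2-D clusters (illustrative data).
pos = [[2.0, 2.0], [3.0, 2.0]]
neg = [[-2.0, -2.0], [-3.0, -2.0]]
print(closest_centroid_predict([2.5, 1.0], pos, neg))  # → 1
```

Expanding $\|\phi(x) - c_\pm\|^2$ with inner products is the route to the kernelized form the question asks for.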
===== Part 3: Soft-margin for separable data =====

Consider training a soft-margin SVM with $C$ set to some positive constant. Suppose the training data is linearly separable. Since increasing the $\xi_i$ can only increase the objective of the primal problem (which we are trying to minimize), at the optimal solution to the primal problem all the training examples will have $\xi_i$ equal to zero. True or false? Explain!

Given a linearly separable dataset, is it necessarily better to use a hard-margin SVM over a soft-margin SVM?
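To build intuition before answering, here is a toy numeric sketch (1-D data with no bias term; the dataset and the value of $C$ are illustrative assumptions) that brute-forces the primal objective $\frac{1}{2}w^2 + C\sum_i \xi_i$ over a grid of $w$ values.

```python
# Toy illustration: minimize the soft-margin primal objective
#   0.5*w^2 + C * sum_i max(0, 1 - y_i * w * x_i)
# on a linearly separable 1-D dataset and inspect the slacks at the optimum.

def primal_objective(w, C, data):
    """Soft-margin primal objective; xi_i = max(0, 1 - y_i * w * x_i)."""
    return 0.5 * w * w + C * sum(max(0.0, 1.0 - y * w * x) for x, y in data)

# Separable 1-D data: one positive point sits very close to the origin.
data = [(-1.0, -1), (0.01, 1), (1.0, 1)]
C = 1.0

# Brute-force grid search over w; a fine grid suffices for this sketch.
ws = [i / 100.0 for i in range(1, 20001)]
w_star = min(ws, key=lambda w: primal_objective(w, C, data))

# Slack of the point at x = 0.01 at the grid optimum.
slack = max(0.0, 1.0 - 1 * w_star * 0.01)
print(w_star, slack)  # the optimizer keeps w small and tolerates slack > 0
```

Note that separating this data with zero slack would require $w \ge 100$ (so the point at $x = 0.01$ satisfies $y\,w\,x \ge 1$), costing at least $\frac{1}{2}w^2 = 5000$ in the objective, while the grid optimum pays far less by accepting a nonzero $\xi_i$. Thinking about why this happens for finite $C$ should point you toward the answer.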
===== Part 4: Using SVMs =====
The data for this question comes from a database called SCOP (structural
assignments/assignment3.txt · Last modified: 2016/09/20 09:34 by asa