assignments:assignment3 [CS545 fall 2016]

Differences

This shows you the differences between two versions of the page.

assignments:assignment3 [2015/10/02 12:12] asa
assignments:assignment3 [2015/10/07 12:04] asa
Line 7: Line 7:
 Formulate a soft-margin SVM without the bias term, i.e. one where the discriminant function is equal to $\mathbf{w}^{T} \mathbf{x}$.
 Derive the saddle point conditions, KKT conditions and the dual.
-Compare it to the standard SVM formulation.
+Compare it to the standard SVM formulation that was derived in class.
-As we discussed in class, SMO-type algorithms for the dual optimize the smallest number of variables at a time, which is two variables.
+In class we discussed SMO-type algorithms for optimizing the dual SVM.  At each step SMO optimizes two variables at a time, which is the smallest number possible.
-Is this still the case for the formulation you have derived?
+Is this still the case for the formulation you have derived?  In other words, is two the smallest number of variables that can be optimized at a time?
 Hint:  consider the difference in the constraints.
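For reference, a sketch of the standard soft-margin formulation with a bias term $b$ (the usual textbook version; the exact form derived in class may differ slightly):

$$
\min_{\mathbf{w}, b, \boldsymbol{\xi}} \;\; \frac{1}{2} ||\mathbf{w}||^2 + C \sum_{i=1}^{n} \xi_i
\quad \text{subject to} \quad y_i (\mathbf{w}^{T} \mathbf{x}_i + b) \geq 1 - \xi_i, \;\; \xi_i \geq 0 .
$$

Its dual carries the equality constraint $\sum_{i} \alpha_i y_i = 0$, which comes from the saddle point condition for $b$; this is the constraint to keep in mind when reading the hint above.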
  
Line 64: Line 64:
 K_{gauss}(\mathbf{x}, \mathbf{x}') = \exp(-\gamma || \mathbf{x} - \mathbf{x}' ||^2)
 $$
+and
 $$
-K_{poly}(\mathbf{x}, \mathbf{x}') = (\mathbf{x}^T \mathbf{x}' + 1)^{p}
+K_{poly}(\mathbf{x}, \mathbf{x}') = (\mathbf{x}^T \mathbf{x}' + 1)^{p}.
 $$
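As an illustrative sketch (not part of the original assignment text), the two kernels above can be computed directly with NumPy; the function names gauss_kernel and poly_kernel below are hypothetical, while gamma and p correspond to $\gamma$ and $p$ in the formulas:

<code python>
import numpy as np

def gauss_kernel(x, xp, gamma):
    # Gaussian kernel: exp(-gamma * ||x - x'||^2)
    diff = np.asarray(x, dtype=float) - np.asarray(xp, dtype=float)
    return np.exp(-gamma * np.dot(diff, diff))

def poly_kernel(x, xp, p):
    # Polynomial kernel: (x^T x' + 1)^p
    return (np.dot(x, xp) + 1.0) ** p

# Example usage on two feature vectors
x1, x2 = np.array([1.0, 2.0]), np.array([0.5, -1.0])
print(gauss_kernel(x1, x2, gamma=0.1), poly_kernel(x1, x2, p=3))
</code>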
  
Line 110: Line 111:
 Part 1:  40 points.
 (10 points):  Primal SVM formulation is correct
-(10 points):  Lagrangian found correctly
+points):  Lagrangian found correctly
-(10 points):  Derivation of saddle point equations
+points):  Derivation of saddle point equations
 (10 points):  Derivation of the dual
 ( 5 points):  Discussion of the implication of the form of the dual for SMO-like algorithms
assignments/assignment3.txt · Last modified: 2016/09/20 09:34 by asa