ex-7: completed

This commit is contained in:
Giù Marcer 2020-04-10 21:55:07 +02:00 committed by rnhmjoj
parent 08bea1bbe8
commit e60d32591d


@@ -296,20 +296,18 @@ points were divided into the two classes according to the selected method.
At each iteration, false positives and negatives are recorded using the running
statistics routines of the `gsl_rstat` library, which are suitable for handling
large datasets that are inconvenient to store in memory all at once.
For each sample, the numbers $N_{fn}$ and $N_{fp}$ of false negatives and
false positives are computed with the following trick: every noise point $x_n$
is checked by computing the function $f(x_n)$ with the weight vector $w$ and
the $t_{\text{cut}}$ given by the employed method, then:
  - if $f(x_n) < 0 \thus$ $N_{fp} \to N_{fp}$
  - if $f(x_n) > 0 \thus$ $N_{fp} \to N_{fp} + 1$
The signal points are checked in the same way, counting the opposite
misclassifications.
Finally, the mean and the standard deviation were computed from $N_{fn}$ and
$N_{fp}$ obtained for every sample in order to get the mean purity $\alpha$
and efficiency $\beta$ for the employed statistics:
$$
@@ -317,7 +315,16 @@ $$
\beta = 1 - \frac{\text{mean}(N_{fp})}{N_n}
$$
Results for $N_t = 500$ are shown in @tbl:res_comp. As can be observed, the
Fisher method gives a nearly perfect assignment of the points to their
respective classes, with a symmetric distribution of false negatives and false
positives, whereas the points divided by the perceptron show slightly more
false positives than false negatives and vary more from dataset to dataset.
The reason lies in the fact that the Fisher linear discriminant is an exact
analytical result, whereas the perceptron relies on an iterative convergence
which, by definition, is never exactly reached.
-------------------------------------------------------------------------------------------
                 $\alpha$           $\sigma_{\alpha}$        $\beta$         $\sigma_{\beta}$
@@ -329,6 +336,4 @@
Perceptron       0.9999             0.28                     0.9995          0.64
Table: Results for the Fisher and perceptron methods. $\sigma_{\alpha}$ and
$\sigma_{\beta}$ stand for the standard deviations of the false
negatives and false positives, respectively. {#tbl:res_comp}