
# Comparisons of different methods for balanced data classification under the discrete non-local total variational framework

Corresponding author: Zhenkuan Pan

Because balanced constraints overcome the trivial solutions that plague data classification via the minimum cut method, many techniques with different balancing strategies have been proposed to improve classification accuracy. However, their performance has not yet been compared comprehensively. In this paper, we investigate seven balanced classification methods under the discrete non-local total variational framework and compare their accuracy on graphs. We study the two-class classification problem with equality constraints, inequality constraints, and the Ratio Cut, Normalized Cut, and Cheeger Cut models. For the equality-constraint case, we first compare the Penalty Function Method (PFM) and the Augmented Lagrangian Method (ALM), both of which transform constrained problems into unconstrained ones, to show the advantages of ALM. To make the comparison fair, the remaining models are also solved with ALM, using the same proportion of fidelity points and the same neighborhood size on the graph. Experimental results demonstrate that ALM with the equality balanced constraint achieves the best classification accuracy among the seven methods.
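For reference, the three balanced cut models named in the abstract have the following standard graph-partitioning definitions. For a partition of the vertex set $V$ into $A$ and its complement $A^c$, let $\mathrm{cut}(A,A^c)$ be the total weight of edges crossing the partition and $\mathrm{vol}(A)$ the sum of the degrees of the vertices in $A$; then

$$
\mathrm{RCut}(A)=\frac{\mathrm{cut}(A,A^c)}{|A|}+\frac{\mathrm{cut}(A,A^c)}{|A^c|},\qquad
\mathrm{NCut}(A)=\frac{\mathrm{cut}(A,A^c)}{\mathrm{vol}(A)}+\frac{\mathrm{cut}(A,A^c)}{\mathrm{vol}(A^c)},\qquad
\mathrm{CCut}(A)=\frac{\mathrm{cut}(A,A^c)}{\min\bigl(|A|,|A^c|\bigr)}.
$$

Each denominator penalizes unbalanced partitions, which is what prevents the trivial all-in-one-class minimum cut solution.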

Mathematics Subject Classification: Primary: 65K10; Secondary: 65F22, 65N20.
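The PFM-versus-ALM comparison made in the abstract can be illustrated on a toy equality-constrained problem (this is a generic sketch, not the paper's graph model; the objective `f`, constraint `h`, and all step sizes below are illustrative assumptions): with a fixed penalty weight, PFM converges to a biased point that violates the constraint, while ALM recovers the exact constrained minimizer at finite penalty thanks to its multiplier update.

```python
import numpy as np

# Toy problem (illustrative, not the paper's model):
#   minimize f(z) = ||z||^2   subject to   h(z) = z_1 + z_2 - 1 = 0.
# Exact solution: z* = (0.5, 0.5).

def grad_f(z):
    return 2.0 * z                      # gradient of ||z||^2

def h(z):
    return z.sum() - 1.0                # equality-constraint residual

def grad_h(z):
    return np.ones_like(z)              # gradient of the linear constraint

def penalty_solve(z, mu=1.0, inner=200, lr=0.05):
    """PFM: minimize f + (mu/2) h^2 at a FIXED penalty weight mu."""
    for _ in range(inner):
        z = z - lr * (grad_f(z) + mu * h(z) * grad_h(z))
    return z

def alm_solve(z, lam=0.0, mu=1.0, outer=20, inner=200, lr=0.05):
    """ALM: minimize f + lam*h + (mu/2) h^2, then update the multiplier lam."""
    for _ in range(outer):
        for _ in range(inner):
            z = z - lr * (grad_f(z) + (lam + mu * h(z)) * grad_h(z))
        lam += mu * h(z)                # dual (multiplier) update
    return z

zp = penalty_solve(np.zeros(2))
za = alm_solve(np.zeros(2))
print("PFM:", zp, "residual:", h(zp))  # biased: constraint violated at finite mu
print("ALM:", za, "residual:", h(za))  # near-exact at the same finite mu
```

At `mu = 1` the penalty solution lands near `(0.25, 0.25)` with residual about `-0.5`, whereas ALM drives the residual essentially to zero; this is the advantage of ALM that the paper's equality-constraint experiments exploit.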

Figure 1.  Reference basis of three data sets

Figure 2.  Reference basis of three data sets

Figure 3.  Different results for two-moon dataset classification

Figure 4.  Different results for handwritten digits classification of 3 & 8 dataset

Figure 5.  Different results for handwritten digit classification of 4 & 9 dataset

Table 0.  Parameters of the different algorithms

| Parameter | EC-PFM | EC-ALM | SDIC | DDIC | RC | CC | NC |
|---|---|---|---|---|---|---|---|
| $\mu_0$ | 0.01 | 0.005 | $\frac{3k}{n}$ | $\frac{3k}{n}\cdot n$ | 0.01 | | |
| $\mu_1$ | 15 | 25 | 20 | 15 | 15 | 12 | 15 |

Table 1.  Accuracy comparisons of different algorithms for two-moon dataset classification

| Method | $V_1$ class | $V_2$ class | Error | Ranking |
|---|---|---|---|---|
| Solution | 1000 | 1000 | | |
| EC-PFM | 1014 | 986 | 1.70% | 5 |
| EC-ALM | 1005 | 995 | 1.25% | 1 |
| SDIC | 1016 | 984 | 1.50% | 2 |
| DDIC | 1015 | 985 | 1.55% | 3 |
| RC | 1030 | 970 | 1.70% | 5 |
| CC | 1021 | 979 | 1.65% | 4 |
| NC | 986 | 1014 | 1.90% | 7 |

Table 2.  Accuracy comparisons of different algorithms for handwritten digit classification of 3 & 8 dataset

| Method | $V_1$ class | $V_2$ class | Error | Ranking |
|---|---|---|---|---|
| Solution | 7141 | 6825 | | |
| EC-PFM | 7165 | 6801 | 1.2745% | 6 |
| EC-ALM | 7139 | 6827 | 1.1385% | 1 |
| SDIC | 7128 | 6838 | 1.1671% | 3 |
| DDIC | 7161 | 6805 | 1.2244% | 4 |
| RC | 7173 | 6793 | 1.2387% | 5 |
| CC | 7134 | 6832 | 1.1600% | 2 |
| NC | 7167 | 6799 | 1.5538% | 7 |

Table 3.  Accuracy comparisons of different algorithms for handwritten digit classification of 4 & 9 dataset

| Method | $V_1$ class | $V_2$ class | Error | Ranking |
|---|---|---|---|---|
| Solution | 6824 | 6958 | | |
| EC-PFM | 6814 | 6968 | 1.2988% | 4 |
| EC-ALM | 6818 | 6964 | 1.2335% | 1 |
| SDIC | 6799 | 6983 | 1.3133% | 5 |
| DDIC | 6811 | 6971 | 1.3206% | 6 |
| RC | 6836 | 6946 | 1.2698% | 3 |
| CC | 6787 | 6995 | 1.2407% | 2 |
| NC | 6807 | 6975 | 1.4729% | 7 |

Table 4.  Dependence of the error rate on the fidelity set size

| Two-moons: fidelity set size (per class) | Two-moons: error rate | 3 & 8: fidelity set size (per class) | 3 & 8: error rate |
|---|---|---|---|
| 100 | 1.20% | 350 | 1.1287% |
| 50 | 1.25% | 300 | 1.1385% |
| 40 | 1.65% | 250 | 1.2888% |
| 30 | 1.80% | 200 | 1.3604% |
| 26 | 1.85% | 150 | 1.6397% |
| 20 | 2.45% | 100 | 3.8665% |
| 16 | 2.55% | 90 | 4.1386% |
| 10 | 3.25% | 70 | 9.8597% |
| 6 | 4.20% | 50 | 11.2201% |

Table 5.  Dependence of the error rate on the DDIC range parameter

| $\varepsilon$ value | $V_1$ class | $V_2$ class | Error rate |
|---|---|---|---|
| 5 | 7153 | 6813 | 1.2316% |
| 10 | 7160 | 6806 | 1.2387% |
| 15 | 7121 | 6845 | 1.2459% |
| 20 | 7159 | 6807 | 1.2602% |
| 30 | 7160 | 6806 | 1.2576% |
| 50 | 7161 | 6805 | 1.2724% |