
Mathematical Foundations of Computing

May 2019, Volume 2, Issue 2


"Reducing the number of dimensions of the possible solution space" as a method for finding the exact solution of a system with a large number of unknowns
Aleksa Srdanov, Radiša Stefanović, Aleksandra Janković and Dragan Milovanović
2019, 2(2): 83-93. doi: 10.3934/mfc.2019007

Solving linear systems with a relatively large number of equations and unknowns can be achieved, within numerical mathematics, by approximate methods that deliver a solution to a specified accuracy. Obtaining the exact solution on a computer is today possible only within the framework of symbolic mathematics. It is possible to define an algorithm that does not solve the system of equations in the usual mathematical way, but still finds its exact solution in a predetermined number of steps. The method consists of simple computations that are not cumulative, and the number of operations remains acceptable even for a relatively large number of equations and unknowns. In addition, the algorithm allows the process to start from an arbitrary initial n-tuple and always leads to the exact solution whenever one exists.
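The abstract does not spell out the algorithm itself, so the paper's method is not reproduced here. As a hedged point of reference only, the sketch below shows the symbolic-mathematics baseline the abstract contrasts with: Gaussian elimination carried out in exact rational arithmetic, which also yields the exact solution of a nonsingular system, though at a cost that grows quickly for large systems.

```python
from fractions import Fraction

def solve_exact(A, b):
    """Gauss-Jordan elimination in exact rational arithmetic.

    Returns the unique solution of A x = b as a list of Fractions,
    assuming A is square and nonsingular.
    """
    n = len(A)
    # Build the augmented matrix with Fraction entries.
    M = [[Fraction(A[i][j]) for j in range(n)] + [Fraction(b[i])]
         for i in range(n)]
    for col in range(n):
        # Pivoting: find a row at or below `col` with a nonzero pivot.
        pivot = next(r for r in range(col, n) if M[r][col] != 0)
        M[col], M[pivot] = M[pivot], M[col]
        # Eliminate the column entries above and below the pivot.
        for r in range(n):
            if r != col and M[r][col] != 0:
                factor = M[r][col] / M[col][col]
                M[r] = [a - factor * p for a, p in zip(M[r], M[col])]
    # After elimination the matrix is diagonal; read off the solution.
    return [M[i][n] / M[i][i] for i in range(n)]
```

Because every intermediate value is a `Fraction`, no rounding error accumulates; the trade-off is that the numerators and denominators themselves can grow large.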

Online learning for supervised dimension reduction
Ning Zhang and Qiang Wu
2019, 2(2): 95-106. doi: 10.3934/mfc.2019008

Online learning has attracted great attention due to the increasing demand for systems that can learn and evolve. When the data to be processed is also high dimensional and dimension reduction is necessary for visualization or to enhance prediction, online dimension reduction plays an essential role. The purpose of this paper is to propose a new online learning approach for supervised dimension reduction. Our algorithm is motivated by adapting sliced inverse regression (SIR), a pioneering and effective algorithm for supervised dimension reduction, and making it implementable in an incremental manner. The new algorithm, called incremental sliced inverse regression (ISIR), quickly and efficiently updates the subspace of significant factors with intrinsic lower dimensionality as new observations come in. We also refine the algorithm with an overlapping technique, yielding an incremental overlapping sliced inverse regression (IOSIR) algorithm. We verify the effectiveness and efficiency of both algorithms through simulations and real-data applications.
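ISIR itself is the paper's contribution and is not reproduced here. As background, batch SIR, the algorithm ISIR makes incremental, can be sketched as follows; the slice count and component count below are illustrative defaults, not the paper's settings.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_components=2):
    """Batch sliced inverse regression (SIR).

    Estimates the effective-dimension-reduction directions by a
    weighted PCA of the within-slice means of the whitened predictors.
    """
    n, p = X.shape
    # Center and whiten the predictors.
    mean = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    w, V = np.linalg.eigh(cov)
    inv_sqrt = V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    Z = (X - mean) @ inv_sqrt
    # Slice the sample by sorted response and average Z within slices.
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Top eigenvectors of the slice-mean covariance, mapped back to
    # the original coordinates.
    vals, vecs = np.linalg.eigh(M)
    return inv_sqrt @ vecs[:, ::-1][:, :n_components]
```

The incremental variants in the paper revisit exactly these quantities (means, covariance, slice statistics) so that they can be updated per observation instead of recomputed from scratch.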

Nonconvex mixed matrix minimization
Meijuan Shang, Yanan Liu, Lingchen Kong, Xianchao Xiu and Ying Yang
2019, 2(2): 107-126. doi: 10.3934/mfc.2019009

Given the ultrahigh dimensionality and the complex structure of data that contain both matrices and vectors, mixed matrix minimization becomes crucial for their analysis. Recently, nonconvex functions such as the smoothly clipped absolute deviation (SCAD), the minimax concave penalty (MCP), the capped $\ell_1$-norm penalty and the $\ell_p$ quasi-norm with $0<p<1$ have shown remarkable advantages in variable selection because they overcome over-penalization. In this paper, we propose and study a novel nonconvex mixed matrix minimization, which combines low-rank and sparse regularizations with nonconvex functions. An augmented Lagrangian method (ALM) is proposed to solve the nonconvex mixed matrix minimization problem. The resulting subproblems either have closed-form solutions or can be solved by fast solvers, which makes the ALM particularly efficient. In theory, we prove that the sequence generated by the ALM converges to a stationary point when the penalty parameter is above a computable threshold. Extensive numerical experiments illustrate that the proposed nonconvex mixed matrix minimization model outperforms existing ones.
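The four nonconvex penalties named in the abstract have standard closed forms in the scalar case; a minimal sketch follows, with parameter values that are conventional illustrative defaults rather than the paper's choices.

```python
import numpy as np

def capped_l1(t, lam=1.0, theta=2.0):
    """Capped l1 penalty: lam * min(|t|, theta)."""
    return lam * np.minimum(np.abs(t), theta)

def scad(t, lam=1.0, a=3.7):
    """Smoothly clipped absolute deviation (SCAD) penalty of Fan and Li."""
    at = np.abs(t)
    small = lam * at
    mid = (2 * a * lam * at - at**2 - lam**2) / (2 * (a - 1))
    large = lam**2 * (a + 1) / 2
    # Piecewise definition: linear near zero, quadratic blend, then flat.
    return np.where(at <= lam, small, np.where(at <= a * lam, mid, large))

def mcp(t, lam=1.0, gamma=3.0):
    """Minimax concave penalty (MCP) of Zhang."""
    at = np.abs(t)
    return np.where(at <= gamma * lam,
                    lam * at - at**2 / (2 * gamma),
                    gamma * lam**2 / 2)

def lp_quasi(t, lam=1.0, p=0.5):
    """l_p quasi-norm penalty with 0 < p < 1 (applied entrywise)."""
    return lam * np.abs(t) ** p
```

All four flatten out (or grow sublinearly) for large |t|, which is precisely how they avoid the over-penalization of large coefficients that the convex l1 penalty incurs.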

Unsupervised robust nonparametric learning of hidden community properties
Mikhail Langovoy, Akhilesh Gotmare and Martin Jaggi
2019, 2(2): 127-147. doi: 10.3934/mfc.2019010

We consider learning of fundamental properties of communities in large noisy networks, in the prototypical situation where the nodes or users are split into two classes according to a binary property, e.g., according to their opinions or preferences on a topic. For learning these properties, we propose a nonparametric, unsupervised, and scalable graph scan procedure that is, in addition, robust against a class of powerful adversaries. In our setup, one of the communities can fall under the influence of a knowledgeable adversarial leader, who knows the full network structure, has unlimited computational resources and can completely foresee our planned actions on the network. We prove strong consistency of our results in this setup with minimal assumptions. In particular, the learning procedure estimates the baseline activity of normal users asymptotically correctly with probability 1; the only assumption being the existence of a single implicit community of asymptotically negligible logarithmic size. We provide experiments on real and synthetic data to illustrate the performance of our method, including examples with adversaries.

Nonlinear diffusion based image segmentation using two fast algorithms
Lu Tan, Ling Li, Senjian An and Zhenkuan Pan
2019, 2(2): 149-168. doi: 10.3934/mfc.2019011

In this paper, a new variational model is proposed for image segmentation based on active contours, nonlinear diffusion and level sets. It includes a data-fitting term based on the Chan-Vese model and a regularization term that uses the potential functions (PF) of nonlinear diffusion. The former segments the image by region partition rather than relying on edge information; the latter automatically preserves image edges while smoothing noisy regions. To improve computational efficiency, the implementation of the proposed model does not directly solve the high-order nonlinear partial differential equations but instead exploits the efficient alternating direction method of multipliers (ADMM), which allows the use of the fast Fourier transform (FFT), an analytical generalized soft-thresholding equation, and a projection formula. In particular, we propose a new fast algorithm, the normal vector projection method (NVPM), based on alternating optimization and normal vector projection. Its stability matches that of ADMM, and it converges faster. Extensive numerical experiments on grey and colour images validate the effectiveness of the proposed model and the efficiency of the algorithms.
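The generalized soft-thresholding step that the abstract credits for ADMM's speed has, in its simplest isotropic form, the closed-form shrink operator below. This is a generic sketch of that kind of subproblem, not the paper's exact update.

```python
import numpy as np

def shrink(v, lam):
    """Isotropic soft-thresholding (shrink) operator.

    Solves min_u lam*||u|| + 0.5*||u - v||^2 in closed form, applied
    pointwise; closed-form subproblems like this are what make ADMM
    splittings fast. The last axis of v holds the vector components
    (e.g. the two components of an image gradient field).
    """
    norm = np.linalg.norm(v, axis=-1, keepdims=True)
    # Scale each vector toward zero by lam, clamping at zero; the small
    # floor on the norm avoids division by zero.
    scale = np.maximum(1.0 - lam / np.maximum(norm, 1e-12), 0.0)
    return scale * v
```

In a split scheme for segmentation, an operator of this shape typically updates the auxiliary gradient variable, while the FFT handles the linear (Laplacian-type) subproblem and the projection formula enforces the constraint set.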

An RKHS approach to estimate individualized treatment rules based on functional predictors
Jun Fan, Fusheng Lv and Lei Shi
2019, 2(2): 169-181. doi: 10.3934/mfc.2019012

In recent years there has been massive interest in precision medicine, which aims to tailor treatment plans to the individual characteristics of each patient. This paper studies the estimation of individualized treatment rules (ITR) based on functional predictors such as images or spectra. We consider a reproducing kernel Hilbert space (RKHS) approach to learn the optimal ITR, which maximizes the expected clinical outcome. The algorithm can be conveniently implemented even though it involves infinite-dimensional functional data. We provide a convergence rate for prediction under mild conditions, jointly determined by the covariance kernel and the reproducing kernel.
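The paper's RKHS estimator is not specified in the abstract, so the sketch below is an illustrative stand-in rather than the authors' method: the Gaussian kernel, the treatment encoding, and the regress-then-compare rule are all assumptions. It shows, in the simplest terms, how a kernel machine over discretized functional predictors can induce a treatment rule.

```python
import numpy as np

def gaussian_kernel(X1, X2, sigma=1.0):
    """Gaussian kernel between curves discretized on a common grid."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def fit_outcome_model(X, a, y, lam=1e-3, sigma=1.0):
    """Kernel ridge regression of outcome y on (curve, treatment).

    Returns a predictor of the expected outcome under a given treatment.
    """
    # Encode the binary treatment as an extra feature coordinate.
    Z = np.hstack([X, a[:, None]])
    K = gaussian_kernel(Z, Z, sigma)
    alpha = np.linalg.solve(K + lam * len(y) * np.eye(len(y)), y)

    def predict(Xnew, anew):
        Znew = np.hstack([Xnew, np.full((len(Xnew), 1), anew)])
        return gaussian_kernel(Znew, Z, sigma) @ alpha

    return predict

def itr(predict, Xnew):
    """Estimated rule: assign the treatment (coded 0/1) with the
    larger predicted clinical outcome."""
    return (predict(Xnew, 1.0) > predict(Xnew, 0.0)).astype(int)
```

In practice the discretization grid plays the role of the functional predictor's domain, and the choice of kernel interacts with the covariance structure of the curves, which is exactly the interplay the paper's convergence rate captures.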



