doi: 10.3934/fods.2022011
Online First

Online First articles are published articles within a journal that have not yet been assigned to a formal issue. This means they do not yet have a volume number, issue number, or page numbers; however, they can still be found and cited using their DOI (Digital Object Identifier). Online First publication benefits the research community by making new scientific discoveries available as quickly as possible.


Multimodal correlations-based data clustering

Jia Chen 1 and Ioannis D. Schizas 2,*

1. Department of Electrical and Computer Engineering, University of California at Riverside, Riverside, CA 92521, USA

2. Department of Electrical Engineering, University of Texas at Arlington, Arlington, TX 76010, USA

*Corresponding author: Ioannis D. Schizas

Received: January 2022. Revised: April 2022. Early access: May 2022.

Funding: Work in this paper is supported by Army Research Office grant W911NF-21-1-0231 and Department of Defense grant W911NF-21-1-0169.

This work proposes a novel technique for clustering multimodal data according to their information content. Statistical correlations present in data that contain similar information are exploited to perform the clustering task. Specifically, multiset canonical correlation analysis is equipped with norm-one regularization mechanisms to identify clusters within different types of data that share the same information content. A pertinent minimization formulation is put forth, and block coordinate descent is employed to derive a batch clustering algorithm that achieves better clustering performance than existing alternatives. Relying on subgradient descent, an online clustering approach is derived that substantially lowers computational complexity compared to the batch approach without significantly compromising clustering performance. It is established that, as the number of data grows, the novel regularized multiset framework correctly clusters the multimodal data entries. Further, it is proved that the online clustering scheme converges with probability one to a stationary point of the ensemble regularized multiset correlations cost, which has the potential to recover the correct clusters. Extensive numerical tests demonstrate that the novel clustering scheme outperforms existing alternatives, while the online scheme achieves substantial computational savings.
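To make the approach concrete, the following is a minimal, illustrative sketch of a sparsity-regularized multiset-CCA-style clustering loop in the spirit described above. It is not the authors' algorithm: the shared-latent surrogate for the multiset correlations, the proximal handling of the norm-one penalty, and all names and parameters (sparse_mcca_cluster, n_sources, lam) are assumptions introduced for this example. Rows whose learned loadings are entirely zero are treated as noise, and the remaining rows are clustered by their dominant source loading.

```python
# Hedged sketch only: a simplified sparsity-regularized multiset-CCA-style
# clustering routine, not the paper's exact batch algorithm.
import numpy as np

def soft_threshold(A, t):
    """Entrywise soft-thresholding: the proximal map of the norm-one (l1) penalty."""
    return np.sign(A) * np.maximum(np.abs(A) - t, 0.0)

def sparse_mcca_cluster(datasets, n_sources, lam=0.1, n_iter=200, seed=0):
    """datasets: list of arrays, each p_m x N (rows = measurements, columns = samples).
    Returns per-dataset row labels (source index, or -1 for rows judged to be noise)."""
    rng = np.random.default_rng(seed)
    N = datasets[0].shape[1]
    S = rng.standard_normal((n_sources, N))                               # shared latent sources (surrogate)
    W = [rng.standard_normal((X.shape[0], n_sources)) for X in datasets]  # sparse loading matrices
    for _ in range(n_iter):
        # Block 1: proximal-gradient step on 0.5*||X_m - W_m S||_F^2 + lam*||W_m||_1 for each dataset.
        step = 1.0 / (np.linalg.norm(S @ S.T, 2) + 1e-12)
        for m, X in enumerate(datasets):
            grad = (W[m] @ S - X) @ S.T
            W[m] = soft_threshold(W[m] - step * grad, step * lam)
        # Block 2: least-squares update of the shared latents given all loading matrices.
        WtW = sum(w.T @ w for w in W)
        WtX = sum(w.T @ X for w, X in zip(W, datasets))
        S = np.linalg.solve(WtW + 1e-8 * np.eye(n_sources), WtX)
    labels = []
    for w in W:
        mag = np.abs(w)
        lab = mag.argmax(axis=1)          # cluster each row by its dominant source loading
        lab[mag.max(axis=1) < 1e-6] = -1  # all-zero rows are flagged as noise
        labels.append(lab)
    return labels, W

# Example usage on synthetic multimodal data (three datasets sharing two hidden sources):
# rng = np.random.default_rng(1)
# src = rng.standard_normal((2, 500))
# data = [np.vstack([rng.standard_normal((4, 2)) @ src,      # informative rows
#                    0.1 * rng.standard_normal((3, 500))])   # noise rows
#         for _ in range(3)]
# labels, _ = sparse_mcca_cluster(data, n_sources=2, lam=0.05)
```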

Citation: Jia Chen, Ioannis D. Schizas. Multimodal correlations-based data clustering. Foundations of Data Science, doi: 10.3934/fods.2022011
Figure 1.  Introducing sparsity to the proposed model: multimodal data (Datasets 1, 2, and 3) are given, where rows in red, green, and orange correspond to measurements carrying information about three different hidden patterns (sources), and rows in gray correspond to noise measurements. Enforcing sparsity on properly designed projection matrices (whose red, green, and orange entries are associated with the three hidden sources, while white entries are zeros corresponding to noise) allows row clustering based on the "source" content of each row measurement.
Figure 2.  Connection between $\{\mathcal{S}^i\}$ and $\{\mathbf{x}_i(\tau)\}$ in the example of Fig. 1.
Figure 3.  Probability of correct data clustering versus number of data $ \tau $ for different values of $ M $.
Figure 4.  Running time (seconds) versus number of data $ \tau $ for different values of $ M $.
Figure 5.  Norm of the gradient for OM-CCA matrix iterates versus number of data $ \tau $ for $ M = 3 $ and $ M = 4 $.
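As a companion to Figure 5, which tracks the gradient norm of the online (OM-CCA) iterates, here is a brief, hedged sketch of how a subgradient-based online update of the sparse loading matrices could be organized, processing one multimodal sample per time instant. The per-sample latent estimate, the step size mu, and the name om_step are assumptions for illustration, not the paper's exact recursion.

```python
# Illustrative online (per-sample) subgradient update; not the paper's exact OM-CCA scheme.
import numpy as np

def om_step(W, x_list, mu=1e-2, lam=0.1):
    """One online step: W is a list of p_m x k loading matrices, x_list holds the
    new p_m-dimensional samples from each modality at the current time instant."""
    # Estimate the common latent variable for this sample via stacked least squares (assumption).
    s, *_ = np.linalg.lstsq(np.vstack(W), np.concatenate(x_list), rcond=None)
    for m, x in enumerate(x_list):
        resid = W[m] @ s - x                           # instantaneous fitting error
        g = np.outer(resid, s) + lam * np.sign(W[m])   # subgradient, including the norm-one term
        W[m] = W[m] - mu * g                           # a diminishing step size aids convergence
    return W
```

In this form each new multimodal sample triggers only a rank-one update per modality, which is where the computational savings of an online scheme over a batch scheme come from.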
