eISSN: 2639-8001

All Issues

Volume 3, 2021

Volume 2, 2020

Volume 1, 2019

FoDS Flyer: a summary of the journal's essential information.
Foundations of Data Science (FoDS) invites submissions focusing on advances in mathematical, statistical, and computational methods for data science. Results should significantly advance the current understanding of data science through algorithm development, analysis, and/or computational implementation that demonstrates the behavior and applicability of the algorithm. Fields covered by the journal include, but are not limited to, Bayesian Statistics, High Performance Computing, Inverse Problems, Data Assimilation, Machine Learning, Optimization, Topological Data Analysis, Spatial Statistics, Nonparametric Statistics, Uncertainty Quantification, and Data Centric Engineering. Expository and review articles are welcome. Papers that focus on applications in science and engineering are also encouraged; however, the method(s) used should be applicable outside of one specific application domain.

Call for Papers Special Issue "Data Science Education Research" of Foundations of Data Science (click to view details)

Call for Papers Special Issue "Topological methods in data analysis, machine learning and artificial intelligence" of Foundations of Data Science (click to view details)

Call for Papers Special Issue "Data Assimilation" of Foundations of Data Science (click to view details)

Call for Papers Special Issue "Sequential Monte Carlo Methods" of Foundations of Data Science (click to view details)

Wave-shape oscillatory model for nonstationary periodic time series analysis
Yu-Ting Lin, John Malik and Hau-Tieng Wu
2021, 3(2): 99-131. doi: 10.3934/fods.2021009
On the linear ordering problem and the rankability of data
Thomas R. Cameron, Sebastian Charmot and Jonad Pulaj
2021, 3(2): 133-149. doi: 10.3934/fods.2021010
Normalization effects on shallow neural networks and related asymptotic expansions
Jiahui Yu and Konstantinos Spiliopoulos
2021, 3(2): 151-200. doi: 10.3934/fods.2021013
Geometric adaptive Monte Carlo in random environment
Theodore Papamarkou, Alexey Lindo and Eric B. Ford
2021, 3(2): 201-224. doi: 10.3934/fods.2021014
Spectral clustering revisited: Information hidden in the Fiedler vector
Adela DePavia and Stefan Steinerberger
2021, 3(2): 225-249. doi: 10.3934/fods.2021015
A Bayesian multiscale deep learning framework for flows in random media
Govinda Anantha Padmanabha and Nicholas Zabaras
2021, 3(2): 251-303. doi: 10.3934/fods.2021016
Online learning of both state and dynamics using ensemble Kalman filters
Marc Bocquet, Alban Farchi and Quentin Malartic
2020. doi: 10.3934/fods.2020015
Ensemble Kalman Inversion for nonlinear problems: Weights, consistency, and variance bounds
Zhiyan Ding, Qin Li and Jianfeng Lu
2020. doi: 10.3934/fods.2020018
An international initiative of predicting the SARS-CoV-2 pandemic using ensemble data assimilation
Geir Evensen, Javier Amezcua, Marc Bocquet, Alberto Carrassi, Alban Farchi, Alison Fowler, Pieter L. Houtekamer, Christopher K. Jones, Rafael J. de Moraes, Manuel Pulido, Christian Sampson and Femke C. Vossepoel
2020. doi: 10.3934/fods.2021001
Mean field limit of Ensemble Square Root filters - discrete and continuous time
Theresa Lange and Wilhelm Stannat
2021. doi: 10.3934/fods.2021003
The (homological) persistence of gerrymandering
Moon Duchin, Tom Needham and Thomas Weighill
2021. doi: 10.3934/fods.2021007
Intrinsic disease maps using persistent cohomology
Daniel Amin and Mikael Vejdemo-Johansson
2021. doi: 10.3934/fods.2021008
Iterative ensemble Kalman methods: A unified perspective with some new variants
Neil K. Chada, Yuming Chen and Daniel Sanz-Alonso
2021. doi: 10.3934/fods.2021011
A density-based approach to feature detection in persistence diagrams for firn data
Austin Lawson, Tyler Hoffman, Yu-Min Chung, Kaitlin Keegan and Sarah Day
2021. doi: 10.3934/fods.2021012
Homotopy continuation for the spectra of persistent Laplacians
Xiaoqi Wei and Guo-Wei Wei
2021. doi: 10.3934/fods.2021017
Feedback particle filter for collective inference
Jin-Won Kim, Amirhossein Taghvaei, Yongxin Chen and Prashant G. Mehta
2021. doi: 10.3934/fods.2021018
A surrogate-based approach to nonlinear, non-Gaussian joint state-parameter data assimilation
John Maclean and Elaine T. Spiller
2021. doi: 10.3934/fods.2021019
Learning landmark geodesics using the ensemble Kalman filter
Andreas Bock and Colin J. Cotter
2021. doi: 10.3934/fods.2021020
ToFU: Topology functional units for deep learning
Christopher Oballe, David Boothe, Piotr J. Franaszczuk and Vasileios Maroulas
2021. doi: 10.3934/fods.2021021
A study of disproportionately affected populations by race/ethnicity during the SARS-CoV-2 pandemic using multi-population SEIR modeling and ensemble data assimilation
Emmanuel Fleurantin, Christian Sampson, Daniel Paul Maes, Justin Bennett, Tayler Fernandes-Nunez, Sophia Marx and Geir Evensen
2021. doi: 10.3934/fods.2021022
Analysis of the feedback particle filter with diffusion map based approximation of the gain
Sahani Pathiraja and Wilhelm Stannat
2021. doi: 10.3934/fods.2021023
Stability of non-linear filter for deterministic dynamics
Anugu Sumith Reddy and Amit Apte
2021. doi: 10.3934/fods.2021025
Bayesian inference of chaotic dynamics by merging data assimilation, machine learning and expectation-maximization
Marc Bocquet, Julien Brajard, Alberto Carrassi and Laurent Bertino
2020, 2(1): 55-80. doi: 10.3934/fods.2020004. Cited by 10.
Consistent manifold representation for topological data analysis
Tyrus Berry and Timothy Sauer
2019, 1(1): 1-38. doi: 10.3934/fods.2019001. Cited by 3.
Semi-supervised classification on graphs using explicit diffusion dynamics
Robert L. Peach, Alexis Arnaudon and Mauricio Barahona
2020, 2(1): 19-33. doi: 10.3934/fods.2020002. Cited by 3.
Accelerating Metropolis-Hastings algorithms by Delayed Acceptance
Marco Banterle, Clara Grazian, Anthony Lee and Christian P. Robert
2019, 1(2): 103-128. doi: 10.3934/fods.2019005. Cited by 2.
Power weighted shortest paths for clustering Euclidean data
Daniel Mckenzie and Steven Damelin
2019, 1(3): 307-327. doi: 10.3934/fods.2019014. Cited by 2.
Partitioned integrators for thermodynamic parameterization of neural networks
Benedict Leimkuhler, Charles Matthews and Tiffany Vlaar
2019, 1(4): 457-489. doi: 10.3934/fods.2019019. Cited by 2.
Learning by active nonlinear diffusion
Mauro Maggioni and James M. Murphy
2019, 1(3): 271-291. doi: 10.3934/fods.2019012. Cited by 2.
Levels and trends in the sex ratio at birth and missing female births for 29 states and union territories in India 1990–2016: A Bayesian modeling study
Fengqing Chao and Ajit Kumar Yadav
2019, 1(2): 177-196. doi: 10.3934/fods.2019008. Cited by 2.
Multi-fidelity generative deep learning turbulent flows
Nicholas Geneva and Nicholas Zabaras
2020, 2(4): 391-428. doi: 10.3934/fods.2020019. Cited by 1.
On the incorporation of box-constraints for ensemble Kalman inversion
Neil K. Chada, Claudia Schillings and Simon Weissmann
2019, 1(4): 433-456. doi: 10.3934/fods.2019018. Cited by 1.
Issues using logistic regression with class imbalance, with a case study from credit risk modelling
Yazhe Li, Tony Bellotti and Niall Adams
2019, 1(4): 389-417. doi: 10.3934/fods.2019016. PDF downloads: 2779.
Bayesian inference of chaotic dynamics by merging data assimilation, machine learning and expectation-maximization
Marc Bocquet, Julien Brajard, Alberto Carrassi and Laurent Bertino
2020, 2(1): 55-80. doi: 10.3934/fods.2020004. PDF downloads: 789.
Stochastic gradient descent algorithm for stochastic optimization in solving analytic continuation problems
Feng Bao and Thomas Maier
2020, 2(1): 1-17. doi: 10.3934/fods.2020001. PDF downloads: 540.
Probabilistic learning on manifolds
Christian Soize and Roger Ghanem
2020, 2(3): 279-307. doi: 10.3934/fods.2020013. PDF downloads: 449.
Semi-supervised classification on graphs using explicit diffusion dynamics
Robert L. Peach, Alexis Arnaudon and Mauricio Barahona
2020, 2(1): 19-33. doi: 10.3934/fods.2020002. PDF downloads: 404.
Accelerating Metropolis-Hastings algorithms by Delayed Acceptance
Marco Banterle, Clara Grazian, Anthony Lee and Christian P. Robert
2019, 1(2): 103-128. doi: 10.3934/fods.2019005. PDF downloads: 385.
Consistent manifold representation for topological data analysis
Tyrus Berry and Timothy Sauer
2019, 1(1): 1-38. doi: 10.3934/fods.2019001. PDF downloads: 377.
Modelling dynamic network evolution as a Pitman-Yor process
Francesco Sanna Passino and Nicholas A. Heard
2019, 1(3): 293-306. doi: 10.3934/fods.2019013. PDF downloads: 359.
Multi-fidelity generative deep learning turbulent flows
Nicholas Geneva and Nicholas Zabaras
2020, 2(4): 391-428. doi: 10.3934/fods.2020019. PDF downloads: 337.
Quantum topological data analysis with continuous variables
George Siopsis
2019, 1(4): 419-431. doi: 10.3934/fods.2019017. PDF downloads: 312.
