Suppose you have a standard Fourier series representing an arbitrary function, and suppose now that this function describes the periodic motion of some celestial body.
As the following video explains, the Fourier coefficients then correspond to the usual Ptolemaic epicycles.
A paper by Hanson, “The Mathematical Power of Epicyclical Astronomy” (Isis, 1960), shows that, provided a sufficient number of frequencies is employed, any summable function can be approximated as closely as needed.
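Hanson's point can be sketched numerically (this code is my own illustration, not from the paper): sample an arbitrary periodic path, keep only its n largest Fourier coefficients, and reconstruct. Each retained complex frequency is one Ptolemaic epicycle, a circle of fixed radius turning at a fixed angular speed.

```python
import numpy as np

def epicycle_approximation(z, n_epicycles):
    """Reconstruct the periodic path z from its n_epicycles largest
    Fourier coefficients -- i.e. from the n_epicycles biggest circles."""
    coeffs = np.fft.fft(z) / len(z)
    # indices of the strongest frequencies (largest circles first)
    keep = np.argsort(np.abs(coeffs))[::-1][:n_epicycles]
    mask = np.zeros_like(coeffs)
    mask[keep] = coeffs[keep]
    return np.fft.ifft(mask * len(z))

# A deliberately non-circular periodic "orbit": a square-ish path.
t = np.linspace(0, 2 * np.pi, 512, endpoint=False)
z = np.sign(np.cos(t)) + 1j * np.sign(np.sin(t))

def rms_error(n):
    """Root-mean-square distance between the true path and n epicycles."""
    return np.sqrt(np.mean(np.abs(z - epicycle_approximation(z, n)) ** 2))

print(rms_error(5), rms_error(50))  # error shrinks as epicycles are added
```

By Parseval's theorem, adding further coefficients can only reduce the mean-square error, which is precisely why the scheme can fit anything.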
What becomes, then, of Popper’s demarcation problem (“Conjectures and Refutations”, 2nd ed., Ch. 11, “The Demarcation between Science and Metaphysics”)? It is not evident to me that any falsificationist strategy can distinguish this Fourier-based approximation algorithm, call it the Ptolemaic system, from a Copernican algorithmic strategy. Since Copernicus, like Aristarchus before him, was of course right, how are we to proceed? The Ptolemaic system is surely a vacuous theory, but is there a metric for how vacuous a theory is? Popper himself (“Conjectures and Refutations”, 2nd ed., Ch. 10, xxi) says that “Ptolemy’s system was not refuted when Copernicus produced his”.
Another example is the Lagrange interpolation polynomial: given a set of $k+1$ distinct data points $(x_0, y_0), \ldots, (x_k, y_k)$ (Wikipedia), the interpolating polynomial in the Lagrange form is the linear combination

$$L(x) = \sum_{j=0}^{k} y_j \,\ell_j(x)$$

of Lagrange basis polynomials

$$\ell_j(x) = \prod_{\substack{0 \le m \le k \\ m \ne j}} \frac{x - x_m}{x_j - x_m}.$$
That does not capture the data-generating process any more than projecting the points onto a plane, as a linear regression would do.
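The Lagrange construction above fits in a few lines of Python (a sketch; the function name and sample points are mine). The degree-k interpolant passes exactly through all k+1 points, whatever they are, which is exactly its vacuity:

```python
def lagrange_interpolate(xs, ys, x):
    """Evaluate L(x) = sum_j y_j * l_j(x), the Lagrange form of the
    interpolating polynomial through the points (xs[j], ys[j])."""
    total = 0.0
    for j, xj in enumerate(xs):
        # basis polynomial l_j(x): 1 at xs[j], 0 at every other node
        lj = 1.0
        for m, xm in enumerate(xs):
            if m != j:
                lj *= (x - xm) / (xj - xm)
        total += ys[j] * lj
    return total

# five arbitrary points: the degree-4 interpolant hits every one of them
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, -2.0, 0.5, 3.0, -1.0]
for xj, yj in zip(xs, ys):
    assert abs(lagrange_interpolate(xs, ys, xj) - yj) < 1e-9
```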
How, then, are we to demarcate (in Popper’s sense) in this case? Leibniz had some ideas, later expounded by Weyl (“The Open World”, 1931) and subsequently picked up and generalized by Kolmogorov and Chaitin. One would like to base a demarcation strategy on the parsimony of the identified model, à la Ockham. But then we are simply begging the question: is there a good metric for parsimony? More on that later.
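One crude, computable stand-in for such a parsimony metric, in the Kolmogorov–Chaitin spirit, is compressed length: Kolmogorov complexity itself is uncomputable, but a general-purpose compressor bounds it from above. The sketch below (my own illustration, with made-up data) shows that a short generating law has a far shorter description than the table of observations it produces:

```python
import zlib

def description_length(s: str) -> int:
    """Compressed size in bytes: a rough upper bound on the
    Kolmogorov complexity of the string s."""
    return len(zlib.compress(s.encode()))

# a short "law" versus the raw data it generates
law = "y = 3*x + 1"
table = ",".join(str(3 * x + 1) for x in range(200))

print(description_length(law), description_length(table))
```

A model that merely memorizes its data, as the full Lagrange interpolant or an unbounded stack of epicycles does, gains nothing under such a metric, whereas a genuine law compresses its observations.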