> consider what the result of measuring anything to an infinite precision could possibly look like. It would require somehow recording an infinite amount of information
This is Zeno's dichotomy paradox [1]. Finitely defined, infinitely complex systems (e.g. fractals and anything from chaos theory) are the escape.
Division? Our productorial sect believes the universe does not have such a useless operation. Multiplication by fractions is the true implementation of the NaN one!
> a much simpler escape: That space is ultimately discrete (i.e. that there's an elementary length) rather than infinitely continuous
Sure. The point is the gedankenexperiment proves nothing. We don't need to "[record] an infinite amount of information" to encapsulate the infinity between any pair of real numbers.
More expensive, harder to maintain and keep current, and a tangent to the core matter: Starlink satellites leak radiation and could be shielded; Starlink satellites could be switched off or put into low power over Quiet Zones but are not.
Statements were made that shielding would improve after Ver 1.0; it got worse. Statements were made that sats would go to low power over quiet zones; they do not.
Returning to your erudite point, "and stuff":
The NASA Cosmic Background Explorer (COBE) satellite orbited Earth in 1989–1996 ...
Inspired by the COBE results, a series of ground and balloon-based experiments measured cosmic microwave background anisotropies on smaller angular scales ...
The sensitivity of the new experiments improved dramatically, with a reduction in internal noise by three orders of magnitude.
As of 2024, Xuntian is scheduled for launch no earlier than late 2026 on a Long March 5B rocket to co-orbit with the Tiangong space station in slightly different orbital phases, which will allow for periodic docking with the station.
Leaving aside the fact that an optical telescope isn't a microwave array, nor is it a Square Kilometre Array of radio telescopes with each component larger than your example ...
Putting an instrument in orbit has all the costs of development of a ground based instrument, additional costs to space harden and test, additional costs to lift, limited ability to tune, tweak or extend when in orbit, hard constraints on size and weight, and other issues.
Xuntian allows for periodic docking, sure. How will this not be more expensive and limited than (say) walking or driving out daily or weekly to much larger instruments on the ground?
This band of electromagnetic radiation lies within the transition region between microwave and far infrared, and can be regarded as either.
> Putting an instrument in orbit has all the costs of development of a ground based instrument, additional costs to space harden and test, additional costs to lift, limited ability to tune, tweak or extend when in orbit, hard constraints on size and weight, and other issues.
Who is to say they won't pull a SpaceX, maybe even overtake it, by going fully reusable?
Which allegedly lowers the costs, giving more economical access to space and lessening the constraints on payloads, while giving all the advantages of being in space?
So .. all the cost, time, and resources of building an instrument on the ground.
With the additional cost of lifting it to orbit, the additional difficulty of in-orbit maintenance, the additional weight and dimension constraints of going to orbit, the additional costs of over-designing to harden for space, and limited access.
Yes, there are advantages to being in space. They vary by application.
That aside it's still cheaper to build an instrument or instrument array that's deployed on the ground.
Unconvinced. Building the whole system, not just some isolated dishes somewhere, amounted to 1.3 billion EUR, and operating it up to 2030 adds another 0.7 billion EUR. Two billion. Chump change, for sure.
Now we can compare that with the JWST and its typical cost overruns in American boondoggle style, or look at the latest shining star, EUCLID: just 1.4 billion EUR for the latter.
Then there was GAIA at about 740 million EUR, with the orbiting spacecraft alone at 450 million EUR, plus another 250 million EUR for the data-processing organisation.
All of these with more or less conventional rocketry, and without co-orbiting anything for easier maintenance and upgrading.
My gut feeling tells me we will have cheaper and more reliable access to space, with larger payload capacity, necessitating less 'origamics' for the space parts, and that Chinese concept seems sound, too. Very much so, in fact.
How much that will cost I have no clue.
But again, if something like this becomes reality, no matter by whom, some former assumptions about cost and even basic feasibility (because payload weight and dimension constraints are relaxed, needing less 'origamics') will have to be rethought.
That was my point, in general. Not limited to any special application.
Sure, when we get there again, can make regular trips, and have consequence-free energy to burn, that'd be great.
In the interim, and as a general rule for all private entities, it'd be nice to not pollute the commons with unnecessary discharges and sparkles and to carry through on pinky promises to maybe do something about that.
If it were not so you could encode an arbitrary amount of information into the specific length of a one dimensional object. It would be like a physical Taylor series, but since you can go arbitrarily small you can encode arbitrary coefficients. In fact, if you had a physical disc you could encode everything at every point along its circumference. Which is, like, everything squared or something.
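To make that concrete, here's a toy sketch (my own illustration, purely hypothetical; exact rational arithmetic stands in for the physically impossible infinite precision): pack a bit string into the binary expansion of a single "length" in (0, 1), then read it back out.

    # Toy illustration only: Fraction stands in for an infinitely precise length.
    from fractions import Fraction

    def encode(bits: str) -> Fraction:
        """Pack a bit string into a 'length' in (0, 1) as a binary fraction."""
        length = Fraction(0)
        for i, b in enumerate(bits, start=1):
            if b == "1":
                length += Fraction(1, 2**i)
        return length

    def decode(length: Fraction, n: int) -> str:
        """Read the first n bits back out of the 'length'."""
        bits = []
        for _ in range(n):
            length *= 2
            bit = int(length)      # integer part is the next bit
            bits.append(str(bit))
            length -= bit
        return "".join(bits)

    msg = "101100111000"
    rod = encode(msg)              # a single exact number, 359/512 here
    assert decode(rod, len(msg)) == msg

With no lower bound on precision, nothing stops the bit string from being arbitrarily long, which is exactly the absurdity being pointed at.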
There are a large number of continuous physical quantities, not only length (though all continuous quantities depend in one way or another on space or time, which are the primitive continuous quantities). The reason why you cannot encode an arbitrary amount of information into a specific value of such a quantity is that it is impossible to make an object for which such a quantity would have a perfectly constant value. All the values of such quantities are affected by noise-like variations, so you could store information only in the average value of such a quantity, computed over some time, and any such average would still be affected by uncertainties that limit the amount of information that can be stored.
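A back-of-envelope version of that limit (numbers are mine, purely illustrative): if the encoded length can span about 1 m but noise and drift leave roughly 10 nm of residual uncertainty even after averaging, only about range/noise values are distinguishable, i.e. about log2(range/noise) bits.

    # Illustrative numbers only: a ~1 m usable range with ~10 nm of residual
    # uncertainty after averaging limits how many bits the length can hold.
    import math

    usable_range_m = 1.0      # assumed full range of the encoded length
    uncertainty_m = 10e-9     # assumed residual noise/drift after averaging

    distinguishable = usable_range_m / uncertainty_m
    bits = math.log2(distinguishable)
    print(f"~{distinguishable:.0e} distinguishable lengths -> ~{bits:.0f} bits")
    # ~1e+08 distinguishable lengths -> ~27 bits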
One of the most constant lengths ever to characterize an artificial object was that of the international prototype meter kept in France and used to define the meter until 1960. To minimize length variations, that meter bar was made of a platinum-iridium alloy and was measured at a temperature kept as constant as possible.
Despite the precautions, which included gentle removal of dust and handling with soft grippers, the length of that meter bar fluctuated continuously. Even though attempts were made to keep the temperature constant, very small temperature fluctuations still caused thermal expansions and contractions. Every time the bar was touched, a few metal atoms were removed from it, while other atoms from the environment stuck to its surface, changing the length.
All these continuous variations have nothing to do with the possibility of space being discrete, but they limit the amount of information that can be stored in any such value.
For now there exists absolutely no evidence that space or time is discrete rather than continuous. There have been attempts to build theories on the discreteness of space and/or time, but so far they have not provided any useful result.
No, this is a wrong argument. We can specify arbitrarily low temperatures by hypothesis, obviating the objection. If you want to get pedantic, you could note that measuring something that is, say, 10^-30 m is unphysical: not even laser interferometry gets that small, or anywhere close. However, given that the argument uses a counterfactual, you'd have to extrapolate all the ways that would affect the apparatus.
Instead, my way is simpler: it generates the absurd result that if you could build and measure a thing to arbitrary precision, you could encode infinite information into it. This is enough for me to reject the counterfactual without going through the messiness of thinking through hypothetical realistic experiments.
The one interesting place to consider is at the Schwarzschild radius of a black hole, where presumably information accumulates to an absurd degree, monotonically over time. I don't really know enough about it to comment intelligently, so I won't, except to note its existence.
Another simple escape: space is a mental category. It's not a feature of reality, it's a requirement for representing the world to a conscious subject.
[1] https://en.wikipedia.org/wiki/Zeno%27s_paradoxes#Dichotomy_p...