I think Adam Frank's approach is basically right, though I would go a lot further!
As I wrote in Brian's Big Bang thread, I just think science (and especially science theory) has exploded into realms where it should not operate.
Quote:This is exactly what happens with “eternal inflation” and the Multiverse. A theory we understand in one regime (much lower energy particle accelerators) gets stretched into a very different one (10^(-36) of a second after the Big Bang). That extrapolation solves some problems (but not others), but it all comes at a strange cost. That cost is what I call “ontological exuberance.”
As Adam points out, the Big Bang only seems to 'work' if you keep piling on additional outlandish assumptions - the concept of an inflation force (utterly contrary to experience) and then the multiverse.
Maybe part of the problem is that in 'ordinary' physics, everyone knows that equations can't be extrapolated too far before they break down. For example, the famous gas laws encapsulated in PV=nRT are only approximate. It is only in high-energy physics and cosmology that the equations are supposed to be exact - so that any singularities that appear must really be points in space-time where the laws of physics completely break down!
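To make that point concrete, here is a rough Python sketch comparing the ideal gas law with the van der Waals correction. The constants for CO2 are the usual textbook values, but treat the exact numbers as illustrative only:

```python
# Ideal gas law PV = nRT vs the van der Waals correction for CO2.
# Constants are standard textbook values; the point is only that the
# 'exact' law drifts badly once you compress far enough.
R = 8.314          # J/(mol*K)
a = 0.3640         # Pa*m^6/mol^2  (CO2)
b = 4.267e-5       # m^3/mol       (CO2)
n, T = 1.0, 300.0  # 1 mol at 300 K

for V in (1e-2, 1e-3, 1e-4):   # volumes in m^3, increasingly compressed
    p_ideal = n * R * T / V
    p_vdw   = n * R * T / (V - n * b) - a * n**2 / V**2
    print(f"V={V:.0e} m^3  ideal={p_ideal/1e5:8.1f} bar  "
          f"vdW={p_vdw/1e5:8.1f} bar  ratio={p_vdw/p_ideal:.2f}")
```

At everyday densities the two agree to a percent or so; at the most compressed volume in the loop they differ by a factor of three. Nobody is surprised - the equation was only ever an approximation.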
In addition, there are shaky pieces much further back. 'Hubble's Law', which states that galaxies recede from us at a speed proportional to their distance, is basically a huge assumption. It was seriously challenged by Halton Arp, a well-respected astronomer who fell from grace when he collected evidence that quasars - supposedly the most distant objects in the universe - were associated with much nearer galaxies:
https://www.youtube.com/watch?v=EckBfKPAGNM
If his conclusion were true, cosmology would basically implode because we would not know the absolute distance of anything outside our local group of galaxies. Needless to say, his work was derided, and I am not sure it is being pursued at all now that he is dead.
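To see how central this is: in practice a galaxy's distance is read almost straight off its redshift via v = H0 × d, so if redshift is not a reliable distance indicator (as Arp claimed for quasars), everything downstream inherits the error. A minimal sketch, assuming a round H0 of 70 km/s/Mpc and the low-redshift approximation v ≈ cz:

```python
# Minimal sketch of how Hubble's Law is used in practice: a measured
# redshift z becomes a recession velocity, which becomes a distance.
# H0 and the example redshifts are assumed round numbers.
c  = 299_792.458   # km/s
H0 = 70.0          # km/s/Mpc (assumed; published values range roughly 67-74)

def hubble_distance(z):
    v = c * z              # v = cz only holds for small z, itself an approximation
    return v / H0          # distance in megaparsecs

for z in (0.01, 0.1, 0.5):
    print(f"z = {z:<4} ->  ~{hubble_distance(z):7.0f} Mpc")
```

Every one of those distances stands or falls with the assumption that redshift tracks distance in the first place.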
Possibly another example is the fact that when galaxies are observed, they simply do not rotate in the way they should if Einsteinian relativity (or even plain Newtonian gravity acting on the visible mass) were the whole story. The difference is quite stark, and it led to the utterly ad hoc notion that roughly 95% of the universe is made of 'dark matter' and 'dark energy'! Wouldn't it be more honest to propose that GR may only operate up to a certain range before other effects dominate?
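The discrepancy is easy to show. If nearly all the mass sits in the visible bulge, the orbital speed outside it should fall off as v ~ sqrt(GM/r), yet measured rotation curves stay roughly flat out to large radii. The mass and radii below are round illustrative numbers, not fits to any real galaxy:

```python
# Sketch of the rotation-curve discrepancy: Keplerian fall-off predicted
# from the visible mass vs the roughly flat curves actually observed.
import math

G      = 6.674e-11    # m^3 kg^-1 s^-2
M_star = 1e41         # kg, assumed visible mass (~5e10 solar masses)
kpc    = 3.086e19     # metres per kiloparsec

for r_kpc in (5, 10, 20, 40):
    r = r_kpc * kpc
    v_kepler = math.sqrt(G * M_star / r) / 1000.0   # km/s
    print(f"r = {r_kpc:>2} kpc  predicted v ~ {v_kepler:4.0f} km/s "
          f"(observed curves stay near ~200 km/s)")
```

The prediction drops from about 200 km/s to about 70 km/s over that range, while real galaxies just keep rotating at roughly the same speed - hence the invisible halo that had to be invented.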
Then there has to be doubt about some of the experiments themselves. The EU paid a huge chunk of the cost of the LHC, and what do we have for all that money - supposedly the Higgs boson! Alexander Unzicker wrote a book about this discovery:
https://www.amazon.co.uk/s?k=the+higgs+f...nb_sb_noss
Now Unzicker comes across as fairly intellectually aggressive, and he doesn't seem to have formal qualifications in the subject, but he makes a lot of telling observations. One that I particularly like is the fact that the data flows from this experiment at such a ludicrously high rate that the only way to collect it is with specially designed hardware that filters the data on the fly (the complete data set is far too big to store) - so we have no record of what was actually detected except for the output of this filtration process. What could possibly go wrong? He estimates that 10^12 collisions have to be processed for every Higgs particle detected! I have seen objections to that figure, but to my knowledge no better estimate has been made.
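Just to show the scale of the filtering involved, here is a back-of-envelope sketch. The rates below are the order-of-magnitude figures commonly quoted for the LHC (collision rate, raw event size, post-trigger storage rate); treat them as assumptions, not official numbers:

```python
# Back-of-envelope version of the data-filtering point.
collision_rate = 40e6    # collisions per second (~40 MHz bunch crossings)
raw_event_size = 1e6     # ~1 MB per raw event
stored_rate    = 1e3     # ~1 kHz of events actually written to disk

raw_data_rate    = collision_rate * raw_event_size       # bytes per second
discard_fraction = 1 - stored_rate / collision_rate

print(f"raw data rate                    ~ {raw_data_rate/1e12:.0f} TB/s")
print(f"events discarded by the trigger  ~ {discard_fraction:.5%}")

# Unzicker's claim in rough form: ~10^12 collisions per identified Higgs event.
collisions_per_higgs = 1e12
print(f"collisions per detected Higgs (claimed) ~ {collisions_per_higgs:.0e}")
```

On those assumed figures the raw stream is tens of terabytes per second, and well over 99.99% of events are thrown away before anyone ever sees them.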
Once I understood this process, I mentally dismissed this supposed triumph of modern physics.
I mean, the problem is that one dodgy conclusion gets baked into someone else's theory, which then gets baked into yet more theory - the result is a total disaster.