Psience Quest

Full Version: Darwin Unhinged: The Bugs in Evolution
(2020-09-11, 08:08 PM)Sciborg_S_Patel Wrote: So if I understand this correctly, information structures are not in some Platonic Realm. Information is ultimately within the same world as the physical body: a reference to structures that can arise from, and be influenced by, lower levels, but that in turn also exert top-down causation.

So an animal with a good plan for survival then, by virtue of its mental ability, passes on its genes? So the mutation is still random but the plan is what determines how these mutations are taken advantage of and thus passed down through generations?
Reality consists of the same manifest environment for the physical and the informational -- with the informational level having congress with the future and past.  Manifest events have a probability of one, and the physical is only measured in the time frame of the "now".  Empirical measurements are framed in P=1 events.  Informational transforms that are active change along with associated information objects in the future and past.  Events, as specific groupings of probability, can be measured with a degree of uncertainty.  Plans made with structures from the past and probable structures in the future can be more integrated and capable when generated in the now.

Science can measure the addition of information (both as code and as functional instructions) and the amount of its integration.  The resultant gain in capability is also firmly modeled by math equations (Cp/CpK).
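For the curious: Cp/CpK are the standard process-capability indices from industrial statistics. A minimal sketch -- the sample data and spec limits here are invented purely for illustration:

```python
import statistics

def cp_cpk(samples, lsl, usl):
    """Process-capability indices.

    Cp measures spread against the spec window: (USL - LSL) / (6 * sigma).
    Cpk also penalizes an off-center process:
    min(USL - mu, mu - LSL) / (3 * sigma).
    """
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Hypothetical measurements of some capability, spec limits 9..11
cp, cpk = cp_cpk([9.8, 10.0, 10.2, 9.9, 10.1], lsl=9.0, usl=11.0)
```

For a perfectly centered process like the toy data above, Cp and Cpk coincide; the further the mean drifts from the middle of the spec window, the more Cpk drops below Cp.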

I do not imagine so direct a transform of intentions to genetic code.  The "substance" of informational transfer is changes in the probability of outcomes.  A planned response by a single bird or a flock of birds doesn't change genetics; however, the appearance of a successful habit is observable by others of the species, and its pursuit in community is an information object in the environment.

Was early man a fire-starter, or did he learn to deal with fire because it was in his environment?  Was the first bridge engineering, or was a log over a creek simply copied?  Is it the tree that is observed by the eye, or is it the mind detecting a probability wave present in informational reality?
(2020-09-11, 08:50 PM)nbtruthman Wrote: I may not be interpreting your words correctly, but it seems to me that creatures anywhere in the spectrum of complexity -- from bacteria, to primitive invertebrate metazoans like some sort of proto-trilobite, to hippopotamus-like proto-whales -- do not have the cognitive resources, the intelligence, to form any sort of imaginative plan, for instance for an irreducibly complex mechanism to better obtain food, or a bodily mechanism, a system, to defend against predators, or to adapt to a new environment.  Therefore there is no "information structure" that can be selected for -- such imaginative planning inherently requires a high level of cognitive resources, foresight, and imagination. An old saying: there's no free lunch.
Darwin and his "ism" are supposed to present a mindless evolution of life, with random physical events becoming fixed as genes.

As usual in history, victors write the outcome, and Darwin was a victor with his theory in name only.  He believed in mind as a major factor in evolution.  Mental Evolution was published by his protégé G. Romanes, with Darwin penning the introduction.  Darwin died and Romanes was excluded from the mainstream.  Darwin's careful work on mind seems lost, if you listen to the propagandists.

Bacteria and single cell organisms are still highly active information processing entities.  My poor prose may better be understood with the following analysis of C. Darwin.  
Quote:Although the first dawnings of intelligence, according to Mr. Herbert Spencer,[4] have been developed through the multiplication and co-ordination of reflex actions, and although many of the simpler instincts graduate into reflex actions, and can hardly be distinguished from them, as in the case of young animals sucking, yet the more complex instincts seem to have originated independently of intelligence. I am, however, very far from wishing to deny that instinctive actions may lose their fixed and untaught character, and be replaced by others performed by the aid of the free will.  On the other hand, some intelligent actions, after being performed during several generations, become converted into instincts and are inherited, as when birds on oceanic islands learn to avoid man. These actions may then be said to be degraded in character, for they are no longer performed through reason or from experience. But the greater number of the more complex instincts appear to have been gained in a wholly different manner, through the natural selection of variations of simpler instinctive actions. Such variations appear to arise from the same unknown causes acting on the cerebral organisation, which induce slight variations or individual differences in other parts of the body; and these variations, owing to our ignorance, are often said to arise spontaneously.
https://en.wikisource.org/wiki/The_Desce...hapter_III

Ok  - Here is what I read:
  • Darwin's theory has active features where mind alters purposeful behavior.
  • Darwin believes that free-will is one of these active features.
  • Darwin thinks that conscious intelligent actions can be reduced to subconscious structures called instincts; and persist without being a focus of conscious experience.
  • Darwin thinks that complex instincts evolve from simpler ones and that complexity can be creatively selected from the database of prior instinctual structures.
  • And the killer app for my theory of mind is Darwin's claim that "unknown causes" (mental processing of information objects, for me) induce an increase in the real-world probability of variation in physical structures.  That is NOT random evolution, but spells out a channel for mind to play a role.
For a complete review of this assertion {non-random evolution} see: Darwin in the Genome by Lynn Caporale. 

Who would have thunk that charlie d could have presented many of the best arguments for mind having an active role in, and maybe being the root of, evolution. This is a bio-evolutionary outlook that Dawkins and Dennett have spent lifetimes trying to deny (while falsely representing some aspects of Darwin's work, imho).

Quote: Caporale presents examples of both non-random and large-scale genomic changes. She describes, for example, how mutational hot spots in genes for vertebrate antibodies can enhance the capabilities of our immune system and how similar hot spots in cone snail toxin genes expand their arsenal of toxic weaponry. Caporale argues that some DNA sequences are more prone to mutational events because of their chemical nature and the biochemistry of DNA replication machinery. She points out that blocks of genetic information can be shuffled within a genome and even passed to the genome of another species. The strength of her book is in collecting and detailing relevant examples from the literature. She maintains throughout that not all mutations are random and that "focused, regulated variation is biochemically possible."

Caporale's idea of "variation-targeting mechanisms" has been criticized for implying foresight in the selection process. She argues, however, that naturalistic mechanisms can explain what appears to be directed purposeful mutation. Caporale offers an approach to working out the molecular and biochemical details, and challenges us to consider the idea that the mechanisms for generating genetic diversity can themselves evolve.
https://ncse.ngo/review-darwin-genome
Some stuff I read some time back from Gordon White on ID and the "neighbors" - like any theory it's insanely speculative blue sky thinking for sure.

Still, I found the articles interesting in that they provide a detailed alternative to the usually assumed idea that the Creator of Reality weighted the dice of mutation on a single planet.

When We Met The Neighbours

Quote:Symbolic thinking prior to the Neolithic Renaissance and symbolic thinking afterward is the difference between a toddler banging on a saucepan and the BBC Concert Orchestra playing the theme from Star Wars at Royal Albert Hall for some (awesome) reason. It’s as if we had been sporadically prank dialling the same number for tens of thousands of years.

And then one day somebody picked up...

...A bit further on, [Jeremy Narby] the author [of Cosmic Serpent]’s guide, Carlos, refers to the spirits as “like radio waves” and with the right medicines such as ayahuasca and tobacco, you can tune into them. The spirits in question are mankind’s first civilising gods, the maninkari; literally “those who are hidden”.

Carlos also referred to invisible beings, called maninkari, who are found in animals, plants, mountains, streams, lakes and certain crystals, and who are the source of all knowledge: “The maninkari taught us how to spin and weave cotton, and how to make clothes. Before, our ancestors lived naked in the forest. Who else could have taught us to weave? That is how our intelligence was born, and that is how we natives of the forest know how to weave.”

This particular rant boiled down to its essence: tens of thousands of years ago, certain naturally occurring substances put us in contact with beings that gave us the knowledge and tech to kickoff this whole project of human civilisation.

You are Made of Books

Quote:So there is a ten-atom-wide, universal coding language inside every living thing that suddenly appeared one day a few billion years ago, whose double-helix shape was probably discovered while high on LSD, that absolutely refuses to be replicated in lab settings.

Don’t listen if they say it has been. Balls of fatty acid built on PNA rather than DNA, splicing short genomes into empty bacterial cells… this is just moving the building blocks around a bit and saying you’ve discovered where building blocks come from. I call shenanigans on that! Clearly it isn’t just pseudoscience that speculates beyond the data.

To quote New York University chemistry professor, Robert Shapiro, on the most famous ‘recreating DNA’ experiment from the University of Chicago… it’s like accidentally producing the phrase to be by banging randomly on a keyboard. It doesn’t necessarily mean that the rest of Hamlet is going to follow. “Any sober calculation of the odds reveals that the chances of producing a play or even a sonnet in this way are hopeless, even if every atom of material on earth were a typewriter that had been turning out text without interruption for the last four and a half billion years.”
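Shapiro's keyboard analogy is just an order-of-magnitude calculation; a quick sketch under toy assumptions (a 27-key alphabet of letters plus space, and rough character counts):

```python
from math import log10

KEYS = 27  # 26 letters plus space -- a toy alphabet

def log10_prob(length: int) -> float:
    """log10 of the probability of randomly typing one specific string."""
    return -length * log10(KEYS)

to_be = log10_prob(5)      # "to be" (5 chars): roughly 1 in 10**7
sonnet = log10_prob(600)   # a ~600-char sonnet: roughly 1 in 10**859
```

The 5-character phrase is a 1-in-14-million fluke, which random banging will eventually produce; the sonnet's odds shrink to one in roughly 10^859, which is the gap Shapiro is pointing at.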

This is old, so I'm not sure whether anyone has since replicated DNA in the lab -- by which I mean something more than just what he complains about above. Continuing:

Quote:Based on the sheer vastness of DNA, based on the impossible conditions that surround its alleged terrestrial development, based on the discovery of giant, ancient, super-viruses that share genetic traits with all creation and can survive in space it is my contention that LUCA either is or quickly descended from an alien virus that was sent with all the code it needed to unpack and terraform this planet.

It’s my contention that Mimi is probably a spaceship.

Quote:Shamanism, Hermeticism, Neoplatonism, Gnosticism, Buddhism, and so on… you have a ‘real’ world that is somehow the progenitor of the physical world.

These meat suits and their surrounds are variously described as a prison, as imperfect copies of perfect ideas, as dangerous illusions, as a physical expression of an idea in the mind of god, as a way for the creator to come to know itself through physical manifestation. One way or another there is -if not artificiality- then intentionality.

If you were to cast your eyes about looking for how this construct may be manifest, what the blueprints for this prison/playground of matter might look like… then surely DNA is on your shortlist?

Quote:But The Rainbow Serpent, Hermes’s caduceus, Damballa, Tiamat, the dragons of creation, the Serpent of Eden, even the description of Hadit in the Book of The Law… something is up here. Beings that existed before or at the point of creation, in some cases (Tiamat, The Rainbow Serpent, Damballa sometimes) actually cause life to be built...

...Like a lot of the universal flood myths that point to a shared experience of the end of the Pleistocene, these stories are examples of the famous “blind men and the elephant”, each fumbling to mythically convey this one, beautiful idea: that the building blocks of physical creation display an intentionality and that sporadically we get a glance at the blueprint.

I will say it's unclear where the panspermia intersects with the spirits. OTOH this is more in-depth speculation about the designers than I've ever seen from most in the ID field.
Considering the hostility here by some parties toward the Discovery Institute due to its Christian background, I thought it would be instructive to look at a very recent breakthrough: the publication of an ID-friendly paper in a leading evolutionary biology journal. The authors follow a line of thinking very much based on concepts espoused by the DI that are firmly founded on good science.

The paper is: "Using statistical methods to model the fine-tuning of molecular machines and systems", https://www.sciencedirect.com/science/ar...9320302071 .

Highlights:

- Statistical methods are appropriate for modelling fine-tuning.
- Fine-tuning is detected in functional proteins, cellular networks etc.
- Constants and initial conditions of nature are deliberately tuned.
- Statistical analysis of fine-tuning models some of the categories of design.
- Fine-tuning and design deserve attention in the scientific community.

This is a major peer-reviewed article on fine-tuning in biology that favorably discusses intelligent design. The Journal of Theoretical Biology is a top peer-reviewed science journal.

The article explicitly cites work by Discovery Institute Fellows such as Stephen Meyer, Günter Bechly, Ann Gauger, Douglas Axe, and Robert J. Marks. The article is co-authored by Steinar Thorvaldsen and Ola Hössjer. Hössjer is a professor of mathematical statistics at Stockholm University who is favorable to intelligent design.

Sure enough, after Darwinists discovered the article, they succeeded in obtaining a “disclaimer” from the journal’s editors, who proclaimed their bias against ID.

But the disclaimer actually made publication of the article all the more significant. It meant that the article survived peer-review and was accepted for publication despite the open hostility of the journal’s top editors.
(2020-10-02, 04:04 PM)nbtruthman Wrote: Highlights:

- Statistical methods are appropriate for modelling fine-tuning.
- Fine-tuning is detected in functional proteins, cellular networks etc.
- Constants and initial conditions of nature are deliberately tuned.
- Statistical analysis of fine-tuning models some of the categories of design.
- Fine-tuning and design deserve attention in the scientific community.

Is there a difference between the usual idea of fine tuning and what the paper seems to be talking about?

I figured fine tuning was the idea that very small shifts in the values of universal constants would make life as we know it impossible, whereas the latter seems to be based more on probabilistic calculations?

Or am I missing something?
(2020-10-02, 07:29 PM)Sciborg_S_Patel Wrote: Is there a difference between the usual idea of fine tuning and what the paper seems to be talking about?

I figured fine tuning was the idea that very small shifts in the values of universal constants would make life as we know it impossible, whereas the latter seems to be based more on probabilistic calculations?

Or am I missing something?

The paper's approach is along the following lines:

Cosmological fine-tuning is described:

Quote:"Like a Bach fugue, the Universe has a beautiful elegance about it, governed by laws whose mathematical precision is meted out to the metronome of time. These equations of physics are finely balanced, with the constants of nature that underpin the equations tuned to values that allows our remarkable Universe to exist in a form where we, humanity, can study it. A slight change to these constants, and poof, in a puff of gedanken experimentation, we have a cosmos where atoms cease to be, or where planets are unable to form."


In addition to cosmological fine tuning in the laws of physics, the paper explains how biology has also been found to exhibit a high degree of fine-tuning.

Quote:"A major conclusion of our work is that fine-tuning is a clear feature of biological systems. Indeed, fine-tuning is even more extreme in biological systems than in inorganic systems. It is detectable within the realm of scientific methodology. Biology is inherently more complicated than the large-scale universe and so fine-tuning is even more a feature."


The biological fine-tuning covered in the article is in the areas of functional proteins (three-dimensional protein molecular folding), protein complexes, and cellular networks (metazoans).

Cosmological fine-tuning of the laws of physics can be explained generally either by design (which is a direct inference) or by some sort of probabilistic argument involving the concept of a "multiverse". The multiverse argument has severe problems with its validity as science, due among other things to its lack of observable evidence and its unfalsifiability.

The paper then examines how biological fine-tuning is very hard or impossible to explain by undirected Darwinian processes. One of the biggest problems:


Quote:"Achieving (biological) fine-tuning in a conventional Darwinian model: The waiting time problem: ....In the context of living systems, we need to ask the question whether conventional Darwinian mechanisms have the ability to achieve fine-tuning during a prescribed period of time. This is of interest in order to correctly interpret the fossil record, which is often interpreted as having long periods of stasis interrupted by very sudden abrupt changes (Bechly and Meyer, 2017). Examples of such sudden changes include the origin of photosynthesis, the Cambrian explosions, the evolution of complex eyes and the evolution of animal flight."

The authors conclude, based on several scientific arguments, that there simply has not been anywhere near the needed extreme length of time to evolve these and other examples by undirected Darwinian processes. The irreducible complexity problem is inherently included, since this is ultimately a probabilistic or statistical likelihood problem. The clear implication is design. Of course, this is not actually spelled out in the conclusions of the paper, since that would have definitely precluded it ever being published.
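The waiting-time intuition can be made concrete with a deliberately crude toy model -- this is not the paper's actual model, and the population size, mutation rate, and independence assumption below are all illustrative placeholders:

```python
def expected_wait(k: int, u: float = 1e-8, n: int = 10_000) -> float:
    """Expected generations until some individual first carries k specific
    new point mutations, assuming each site mutates independently at rate u
    per individual per generation in a population of size n. Deliberately
    crude: ignores fixation, recombination, and selection entirely."""
    per_generation = n * u**k  # chance the full combination appears this generation
    return 1.0 / per_generation

one_site = expected_wait(1)   # about 1e4 generations
two_sites = expected_wait(2)  # about 1e12 generations: the combinatorial wall
```

Even this caricature shows the shape of the argument: each additional required mutation multiplies the expected wait by 1/u, so coordinated changes blow past any plausible timescale very quickly. The debate is over whether real evolutionary mechanisms escape that multiplication.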
(2020-10-02, 08:56 PM)nbtruthman Wrote: Cosmological fine-tuning of the laws of physics can be explained generally either by design (which is a direct inference), or by some sort of probabilistic argument involved with the concept of there being a "multiverse". The multiverse argument has severe problems in its validity as science due among other things to its lack of observable evidence, and its unfalsifiability.

The paper then examines how biological fine-tuning is very hard or impossible to explain by undirected Darwinian processes. One of the biggest problems:

So Cosmological Fine Tuning is different, then, right? I mean, there are no probability measures, AFAIK, for a multiverse -- and I agree the Multiverse as supposed by the Many Worlds Interpretation is just ridiculous.

Whereas biological fine-tuning is the usual arguments from ID, regarding the probability of certain biological structures arising at a certain time. Or is the paper saying that there is also something else to biological fine-tuning?

I guess I am not clear on why biological fine-tuning was introduced as a term?
(2020-10-02, 11:10 PM)Sciborg_S_Patel Wrote: So Cosmological Fine Tuning is different, then, right? I mean, there are no probability measures, AFAIK, for a multiverse -- and I agree the Multiverse as supposed by the Many Worlds Interpretation is just ridiculous.

Whereas biological fine-tuning is the usual arguments from ID, regarding the probability of certain biological structures arising at a certain time. Or is the paper saying that there is also something else to biological fine-tuning?

I guess I am not clear on why biological fine-tuning was introduced as a term?

The authors state:


Quote:"Fine-tuning has received much attention in physics, and it states that the fundamental constants of physics are finely tuned to precise values for a rich chemistry and life permittance. It has not yet been applied in a broad manner to molecular biology. However, in this paper we argue that biological systems present fine-tuning at different levels, e.g. functional proteins, complex biochemical machines in living cells, and cellular networks. This paper describes molecular fine-tuning, how it can be used in biology, and how it challenges conventional Darwinian thinking. We also discuss the statistical methods underpinning fine-tuning and present a framework for such analysis."

...A major conclusion of our work is that fine-tuning is a clear feature of biological systems. Indeed, fine-tuning is even more extreme in biological systems than in inorganic systems. It is detectable within the realm of scientific methodology.”


I think the authors define fine-tuning in essentially the same way whether it is in the laws of physics or in biological systems.    

Fine-tuning implies a design that is optimized to a very high degree of precision in at least one area or aspect, such that we can detect it and recognize its fine-tunedness. It does not imply that something is totally physically perfect (we don't even have a definition of that against which to judge anything), so it does not have to be fine-tuned in every possible way.

You apparently consider cosmological fine-tuning to be of a fundamentally different nature than biological fine-tuning. Why? It seems to me that both domains of fine tuning involve statistical and probabilistic calculations.
(2020-10-03, 12:31 AM)nbtruthman Wrote: You apparently consider cosmological fine-tuning to be of a fundamentally different nature than biological fine-tuning. Why? It seems to me that both domains of fine tuning involve statistical and probabilistic calculations.

I guess I don't know if cosmological fine tuning is really about probability estimations in the same way that biological fine-tuning would be.

Here are some examples of the cosmological type that even make it onto Wikipedia:

Quote:
  • N, the ratio of the electromagnetic force to the gravitational force between a pair of protons, is approximately 10^36. According to Rees, if it were significantly smaller, only a small and short-lived universe could exist.[15]
  • Epsilon (ε), a measure of the nuclear efficiency of fusion from hydrogen to helium, is 0.007: when four nucleons fuse into helium, 0.007 (0.7%) of their mass is converted to energy. The value of ε is in part determined by the strength of the strong nuclear force.[16] If ε were 0.006, only hydrogen could exist, and complex chemistry would be impossible. According to Rees, if it were above 0.008, no hydrogen would exist, as all the hydrogen would have been fused shortly after the Big Bang. Other physicists disagree, calculating that substantial hydrogen remains as long as the strong force coupling constant increases by less than about 50%.[13][15]
  • Omega (Ω), commonly known as the density parameter, is the relative importance of gravity and expansion energy in the universe. It is the ratio of the mass density of the universe to the "critical density" and is approximately 1. If gravity were too strong compared with dark energy and the initial metric expansion, the universe would have collapsed before life could have evolved. On the other side, if gravity were too weak, no stars would have formed.[15][17]
  • Lambda (Λ), commonly known as the cosmological constant, describes the ratio of the density of dark energy to the critical energy density of the universe, given certain reasonable assumptions such as positing that dark energy density is a constant. In terms of Planck units, and as a natural dimensionless value, the cosmological constant, Λ, is on the order of 10^−122.[18] This is so small that it has no significant effect on cosmic structures that are smaller than a billion light-years across. If the cosmological constant were not extremely small, stars and other astronomical structures would not be able to form.[15]
  • Q, the ratio of the gravitational energy required to pull a large galaxy apart to the energy equivalent of its mass, is around 10^−5. If it is too small, no stars can form. If it is too large, no stars can survive because the universe is too violent, according to Rees.[15]
  • D, the number of spatial dimensions in spacetime, is 3. Rees claims that life could not exist if there were 2 or 4 dimensions of spacetime nor if any other than 1 time dimension existed in spacetime.[15] However, contends Rees, this does not preclude the existence of ten-dimensional strings.[2]

None of these read like probabilities in the sense of estimated likelihood. Rather, it's more akin to a house of cards, or a game of Jenga.
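That house-of-cards reading can be expressed as a membership check rather than a likelihood. A toy sketch -- the epsilon bounds come from the list above, while the omega band is an invented placeholder:

```python
# Life-permitting "windows" for constants: the question posed by the list
# above is membership in a window, not probability of landing in it.
WINDOWS = {
    "epsilon": (0.006, 0.008),  # fusion efficiency bounds quoted from Rees
    "omega": (0.98, 1.02),      # density parameter -- this band is hypothetical
}
OBSERVED = {"epsilon": 0.007, "omega": 1.0}

def life_permitting(observed, windows):
    """True only if every constant sits strictly inside its window."""
    return all(lo < observed[k] < hi for k, (lo, hi) in windows.items())
```

Nudge epsilon down to 0.006 and the check fails outright; there is nothing gradual about it, which is the Jenga point -- pull one block and the whole structure is gone.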

There does seem to be a question of "Improbability" with regard to carbon & oxygen levels:


Quote:Improbability and fine-tuning

Main article: Fine-tuned universe

Carbon is a necessary component of all known life. 12C, a stable isotope of carbon, is abundantly produced in stars due to three factors:

  1. The decay lifetime of a 8Be nucleus is four orders of magnitude larger than the time for two 4He nuclei (alpha particles) to scatter.[17]
  2. An excited state of the 12C nucleus exists a little (0.3193 MeV) above the energy level of 8Be + 4He. This is necessary because the ground state of 12C is 7.3367 MeV below the energy of 8Be + 4He. Therefore, a 8Be nucleus and a 4He nucleus cannot reasonably fuse directly into a ground-state 12C nucleus. The excited Hoyle state of 12C is 7.656 MeV above the ground state of 12C. This allows 8Be and 4He to use the kinetic energy of their collision to fuse into the excited 12C, which can then transition to its stable ground state. According to one calculation, the energy level of this excited state must be between about 7.3 and 7.9 MeV to produce sufficient carbon for life to exist, and must be further "fine-tuned" to between 7.596 MeV and 7.716 MeV in order to produce the abundant level of 12C observed in nature.[18]
  3. In the reaction 12C + 4He → 16O, there is an excited state of oxygen which, if it were slightly higher, would provide a resonance and speed up the reaction. In that case, insufficient carbon would exist in nature; almost all of it would have converted to oxygen.[17]
Some scholars argue the 7.656 MeV Hoyle resonance, in particular, is unlikely to be the product of mere chance. Fred Hoyle argued in 1982 that the Hoyle resonance was evidence of a "superintellect";[12] Leonard Susskind in The Cosmic Landscape rejects Hoyle's intelligent design argument.[19] Instead, some scientists believe that different universes, portions of a vast "multiverse", have different fundamental constants:[20] according to this controversial fine-tuning hypothesis, life can only evolve in the minority of universes where the fundamental constants happen to be fine-tuned to support the existence of life. Other scientists reject the hypothesis of the multiverse on account of the lack of independent evidence.[21]


But even here, we aren't talking about likelihood estimates? The improbability is that all these variables line up so well that we achieve life. As we agree, the very fact that the Multiverse has to be summoned to try to dispel the problem shows it's a sore point for at least some of the materialist-atheist evangelicals trying to patch up their faith.
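The Hoyle-state figures quoted above reduce to simple window arithmetic (all numbers are taken from the quote; "tolerance" here just means window width over level energy):

```python
hoyle_level = 7.656                 # MeV, excited state of 12C
abundance_window = (7.596, 7.716)   # MeV, for the observed carbon abundance
life_window = (7.3, 7.9)            # MeV, for sufficient carbon for life at all

width = abundance_window[1] - abundance_window[0]   # 0.12 MeV
tolerance = width / hoyle_level                     # about 1.6% of the level
in_window = abundance_window[0] < hoyle_level < abundance_window[1]
nested = (life_window[0] < abundance_window[0]
          and abundance_window[1] < life_window[1])
```

So the observed level sits inside a window only about 1.6% as wide as the level itself, which is itself nested inside the broader life-permitting range -- again a question of fitting inside bounds, not of an estimated likelihood.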

Maybe I'm missing something in the paper, but it seems that the biological fine-tuning mentioned there involves probability calculations where the likelihoods themselves are up for debate. The paper even says this about cosmological fine tuning:


Quote:A probabilistic argument presumes adequate knowledge of (the limits on) the space of possibility. It presupposes that current knowledge provides an accurate, unbiased statistical account of, or means of determining, what may or may not happen by chance. As Colyvan et al., 2005, Dembski, 2014 have argued, the fine-tuning argument for our universe is not a strict statistical argument, since it involves features that need to be in place before the universe can be said to exist and operate. And there is no way of assigning a probability distribution as reference associated with the universe in that early stage. Probabilities for the initial formation of the universe are by its nature independent of known processes operating in our present universe, i.e. “gedanken probabilities”.


Then it goes on to say:


Quote:In this section, we will present and discuss some relevant observations from experimental biology. This will be done in the light of the theory of stochastic models, outlined in Section 2. More specifically, we will identify events whose probability is very low under naturalistic stochastic models, and argue that these represent extreme examples of fine-tuning.


What then follows is a complex argument about probabilities. Which is fine -- I'm not against this type of argument -- but it seems there's a variety of factors that go into estimating the odds, which I imagine skeptics would argue against.

Just strikes me as something different than cosmological fine tuning. What I would think of as fine tuning at the biology level is some quantum-level property that is both necessary for life and leaves little room for deviation in the values it takes on.
(2020-10-03, 12:31 AM)nbtruthman Wrote: I think the authors define fine-tuning in essentially the same way whether it is in the laws of physics or in biological systems.

Fine-tuning implies a design that is optimized to a very high degree of precision in at least one area or aspect, such that we can detect it and recognize its fine-tunedness. 
First, I want to say the paper exhibits a solid dose of professionalism and deserved publication.  It presents facts in a way that I see as positive, addressing the model I support.  As Sci has pointed out, it's trying to conflate cosmological and biological fine-tuning.

In particular it calls for a design science.  The idea has been in the academic spotlight for more than 30 years; Don Norman and Bucky Fuller are its key proponents.  The first citation below addresses the exact means for "biological fine-tuning".  I assert this means is via mental causation, as an active result of organism/environment adaptation.

Quote: Psychologist James Gibson coined “affordance” in 1977, referring to all action possibilities with an object based on users’ physical capabilities. For instance, a chair affords sitting on, standing on, throwing, etc. Human-computer interaction (HCI) expert Don Norman later (1988) defined affordances as perceivable action possibilities – i.e., only actions which users consider possible. So, designers must create objects’ affordances to conform to users’ needs based on these users’ physical capabilities, goals and past experiences. Clear affordances are vital to usability. Users will map the possibilities of what an object does according to their conceptual model of what that object should do (e.g., inserting fingers into scissor holes to cut things). 
 https://www.interaction-design.org/liter...ffordances

And one of my favorite thinkers - 

Quote: 
DESIGN SCIENCE

Amy Edmondson
"I did not set out to design a geodesic dome," Fuller once said, "I set out to discover the principles operative in Universe. For all I knew, this could have led to a pair of flying slippers."
 https://www.bfi.org/design-science/primer

In cosmological fine-tuning, the mental is not acknowledged as an active agent.  In biological adaptation, the role of mind already is.