Psience Quest

Full Version: Darwin Unhinged: The Bugs in Evolution
(2017-12-19, 04:53 PM)stephenw Wrote: [ -> ]I am right there with you and with the quotations that you cited.  I strongly agree that observation of nature leads one to think that information is getting shared at deep levels.

The definition of the mental process, in my humble view, is very simple.  Mind is the overall process by which living things process and transform both kinds of information: formal bits and logic, as well as functional/meaningful configurations.  Minds are thus involved from the faint signals within a cell controlling metabolism -- all the way to the herd behavior of a species.  Information is objective - just like matter and energy.

I think what is crucial to my understanding is the teleology. That means to me that there is a vision with a purpose; a mind that holds that vision. Perhaps the purpose is simply for the mind to know itself by exploring its creativity through all its possibilities - which are probably infinite and so the exploration is never ending. Of course, we should consider that linear time is an illusion so teleology shouldn't be thought of as a progression in time. That's where my human imagination comes up short: I can say that we should consider that but I don't really know how to.

There is also the risk of reductionist thinking when it comes to mind: breaking it down into mind for worms or mind for cells and, ultimately into digital bits. Firstly I maintain that it is all one mind - perhaps applied or manifested in different and individualised ways - but never separated. Secondly there is the temptation to think of things being imbued with mind whereas I believe the reality to be that those things are themselves mental constructs: made of mind stuff. Again, no separation. It is the awesome capacity of mind to create and populate worlds of such diversity and also endow those creations with their own creativity, individually and collectively. 

That's probably an amateurish attempt at a creation story but it makes more sense to me than the anthropomorphic God with His Garden of Eden or the chance collision of molecules in an indifferent, accidental, mindless and purposeless universe.
(2017-12-16, 11:07 PM)Paul C. Anagnostopoulos Wrote: [ -> ]Here is an evolution simulation that I wrote that precisely does do calculations:

https://schneider.ncifcrf.gov/paper/ev/evj/

There is considerable controversy surrounding the claims made for the Ev program. It gets very technical. See articles at https://evolutionnews.org/2010/12/bio-co...co-author/ and https://evolutionnews.org/2016/03/ev_ever_again/. Montanez, Ewert, Marks and Dembski responded to these claims with their paper "A Vivisection of the ev Computer Organism: Identifying Sources of Active Information", at http://bio-complexity.org/ojs/index.php/...O-C.2010.3. From the paper:

Quote:"The success of ev is largely due to active information introduced by the Hamming oracle and from the perceptron structure. It is not due to the evolutionary algorithm used to perform the search.

Indeed, other algorithms are shown to mine active information more efficiently from the knowledge sources provided by ev.

Schneider claims that ev demonstrates that naturally occurring genetic systems gain information by evolutionary processes and that “information gain can occur by punctuated equilibrium”. Our results show that, contrary to these claims, ev does not demonstrate “that biological information…can rapidly appear in genetic control systems subjected to replication, mutation, and selection”. We show this by demonstrating that there are at least five sources of active information in ev.

1. The perceptron structure. The perceptron structure is predisposed to generating strings of ones sprinkled by zeros or strings of zeros sprinkled by ones. Since the binding site target is mostly zeros with a few ones, there is a greater predisposition to generate the target than if it were, for example, a set of ones and zeros produced by the flipping of a fair coin.

2. The Hamming Oracle. When some offspring are correctly announced as more fit than others, external knowledge is being applied to the search and active information is introduced. As with the child’s game, we are being told with respect to the solution whether we are getting “colder” or “warmer”.

3. Repeated Queries. Two queries contain more information than one. Repeated queries can contribute active information.

4. Optimization by Mutation. This process discards mutations with low fitness and propagates those with high fitness. When the mutation rate is small, this process resembles a simple Markov birth process that converges to the target.

5. Degree of Mutation. As seen in Figure 3, the degree of mutation for ev must be tuned to a band of workable values."
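
The "Hamming oracle" point (source 2 above) is easy to illustrate outside of ev. Below is a minimal sketch, not the actual ev code, with all names invented for illustration: a search that is told, for each candidate, how many positions differ from a hidden target (the "colder/warmer" signal) converges after a handful of queries per bit, whereas a search with no such feedback faces the full 2^n space.

```java
// Hypothetical sketch of a Hamming oracle guiding a search.
// Not from ev; class and method names are illustrative only.
import java.util.Random;

public class HammingOracleDemo {
    // The oracle: number of positions where candidate and target differ.
    static int hamming(int[] a, int[] b) {
        int d = 0;
        for (int i = 0; i < a.length; i++) if (a[i] != b[i]) d++;
        return d;
    }

    public static void main(String[] args) {
        Random rng = new Random(42);
        int n = 64;
        int[] target = new int[n];            // hidden target: mostly zeros
        target[3] = 1; target[17] = 1; target[40] = 1;

        int[] candidate = new int[n];
        for (int i = 0; i < n; i++) candidate[i] = rng.nextInt(2);

        int queries = 0;
        // Greedy search: keep a single-bit flip only when the oracle
        // reports a smaller distance ("warmer"); otherwise revert it.
        while (hamming(candidate, target) > 0) {
            int before = hamming(candidate, target);
            int i = rng.nextInt(n);
            candidate[i] ^= 1;
            queries++;
            if (hamming(candidate, target) > before) candidate[i] ^= 1;
        }
        System.out.println("Oracle-guided queries: " + queries);
        // A blind search with no oracle feedback would need on the
        // order of 2^64 queries to hit the same 64-bit target.
    }
}
```

The dispute is over how to account for this: the oracle's per-candidate feedback is what the paper's authors label "active information" supplied to the search, while Schneider treats it as a stand-in for selection pressure in the environment.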

Schneider attempted to rebut this paper in a short article, at https://schneider.ncifcrf.gov/paper/ev/d...s2010.html.

Robert Marks' response to Schneider's response is at http://evoinfo.org/papers/autopsy.pdf. His conclusion:


Quote:"While we appreciate Schneider’s correction on some typographical errors, the results of the paper still hold. We agree with Schneider that information is gained by the genome through extraction of it from the environment. However, we show that information sources must exist which allow access to that information. Schneider has assumed two powerful sources of information within ev, the perceptron structure and the hamming distance oracle. These information sources are the cause behind the gain in information, not evolution or the mere existence of information in the environment. The information in the environment takes the form of a point of high fitness in a fitness landscape. However, the mere existence of that point does not allow reaching it. Reaching that point requires a particular shape to the fitness landscape to guide evolution."
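
Marks' closing point about landscape shape can also be sketched in code. The following is a hypothetical comparison, not taken from any of the cited programs: the same single-bit-flip hill climber succeeds when fitness grades smoothly toward the peak, but cannot move at all on a "needle in a haystack" landscape, where every non-target string has identical fitness and so offers no direction to climb.

```java
// Hypothetical sketch contrasting two fitness landscapes with the
// same peak. All names are illustrative, not from ev.
import java.util.Random;

public class LandscapeDemo {
    // Smooth landscape: fitness rises as the candidate nears the target.
    static int smoothFitness(int[] c, int[] t) {
        int f = 0;
        for (int i = 0; i < c.length; i++) if (c[i] == t[i]) f++;
        return f;
    }

    // Needle landscape: only the exact target scores; all else is flat.
    static int needleFitness(int[] c, int[] t) {
        return smoothFitness(c, t) == c.length ? 1 : 0;
    }

    // Hill climb with a fixed query budget; keep only strict improvements.
    static boolean climb(int[] c, int[] t, boolean smooth, int budget, Random rng) {
        for (int q = 0; q < budget; q++) {
            int best = smooth ? smoothFitness(c, t) : needleFitness(c, t);
            if ((smooth && best == c.length) || (!smooth && best == 1)) return true;
            int i = rng.nextInt(c.length);
            c[i] ^= 1;
            int now = smooth ? smoothFitness(c, t) : needleFitness(c, t);
            if (now <= best) c[i] ^= 1; // flat or worse: revert, no guidance gained
        }
        return false;
    }
}
```

On the flat landscape the target point exists just as surely, but its mere existence gives the climber nothing to work with, which is the distinction Marks is drawing.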
(2017-12-13, 10:09 PM)Paul C. Anagnostopoulos Wrote: [ -> ]Ah, okay, so we aren't talking about the flagellum as it stands, but about a "conserved core" of proteins. I'm happy to stipulate that if you remove all the proteins that aren't absolutely necessary, you will indeed end up with a subset that is necessary for motility. This would be true of all existing mechanisms, natural or human-designed. Water is irreducibly complex, as long as we are talking about original function.

This explicitly admits to the problem of IC. It's not just any additional necessary (including accessory or interfacing) parts of the cellular apparatus we are talking about, it's the core components of the flagellar machine. Are you claiming that any of the core components (filament, hook, and basal body/stator/rotor assembly) can be removed without the flagellum losing its propulsive function?  

Quote:Actually, it does refute the IC of actual flagella.
 
Please explain how the paper you posted (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1943423/, title The Protein Network of Bacterial Motility) refutes the irreducible complexity of the bacterial flagellum. Here I refer to the original definition of IC in Darwin's Black Box, which concentrates on single, coherent systems. I don't see how the paper has much to do with IC. If anything, the extra proteins that affect motility show the problem is worse than just needing the core proteins (like needing equipment to manufacture gasoline so that an outboard motor can work). The large number of different gene deletions having reduced or no motility has nothing to do with the irreducible complexity of the basic flagellar mechanism. This IC was demonstrated by Scott Minnich's gene knockout experiments. Most of the 159 gene deletion strains described in the paper probably impaired motility by interfering with or deleting other interfacing components of the cell, rather than by directly disrupting or deleting the core components of the flagellum.

Quote:You'll notice that Behe's original definition is not the one used today.

I'm not aware of anything in print by Michael Behe in which he formally substitutes, in an actual book or paper, a new operational definition of IC for the one in Darwin's Black Box (other than some theorizing about the problem of the need for multiple unselected mutations, in The Edge of Evolution). The core of the different versions remains the observation that there are certain biological machines where the removal of any part causes the mechanism to stop functioning. It's mere quibbling to argue over whether there could be some other functionality (and therefore selectability) with one or more parts removed. Behe conceded in Darwin's Black Box that there is no strictly logical barrier; the issue is ultimately one of plausibility or probability.
Videos from Nour foundation:
Quote:Is Intelligence the Inevitable Outcome of Evolution?
Simon Conway Morris, Ian Tattersall & Melanie Chang discuss the evolution of human intelligence and whether it is the natural consequence of evolutionary forces.

Quote:Are the Forces Driving Evolution Truly Random?
Ian Tattersall and Simon Conway Morris talk about "random" events and cause and effect in evolution.

Above are excerpts from a longer discussion:
The Story of Life: Critical Insights from Evolutionary Biology
nbtruthman Wrote:This explicitly admits to the problem of IC. It's not just any additional necessary (including accessory or interfacing) parts of the cellular apparatus we are talking about, it's the core components of the flagellar machine. Are you claiming that any of the core components (filament, hook, and basal body/stator/rotor assembly) can be removed without the flagellum losing its propulsive function? 
No, but the "original function" requirement is no longer part of the definition of IC. And even if it were, it ignores the possibility of scaffolding. That requirement was eliminated because it renders everything IC, including things no one believes were designed.

Quote:Please explain how the paper you posted (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1943423/, title The Protein Network of Bacterial Motility) refutes the irreducible complexity of the bacterial flagellum. Here I refer to the original definition of IC in Darwin's Black Box, which concentrates on single, coherent systems. I don't see how the paper has much to do with IC. If anything, the extra proteins that affect motility show the problem is worse than just needing the core proteins (like needing equipment to manufacture gasoline so that an outboard motor can work). The large number of different gene deletions having reduced or no motility has nothing to do with the irreducible complexity of the basic flagellar mechanism. This IC was demonstrated by Scott Minnich's gene knockout experiments. Most of the 159 gene deletion strains described in the paper probably impaired motility by interfering with or deleting other interfacing components of the cell, rather than by directly disrupting or deleting the core components of the flagellum.
The original definition isn't interesting anymore. Here are the three definitions in order of appearance:

http://www.asa3.org/ASA/education/origins/ic-cr.htm

The one that Behe currently uses involves unselected "steps."

Quote:    I'm not aware of anything in print by Michael Behe in which he formally substitutes, in an actual book or paper, a new operational definition of IC for the one in Darwin's Black Box (other than some theorizing about the problem of the need for multiple unselected mutations, in The Edge of Evolution). The core of the different versions remains the observation that there are certain biological machines where the removal of any part causes the mechanism to stop functioning. It's mere quibbling to argue over whether there could be some other functionality (and therefore selectability) with one or more parts removed. Behe conceded in Darwin's Black Box that there is no strictly logical barrier; the issue is ultimately one of plausibility or probability.
Mere quibbling? In other words, you reject all the subtleties?

http://www.ideacenter.org/contentmgr/sho...php/id/840

~~ Paul
nbtruthman Wrote:In the Montanez et al. paper:
"1. The perceptron structure. The perceptron structure is predisposed to generating strings of ones sprinkled by zeros or strings of zeros sprinkled by ones. Since the binding site target is mostly zeros with a few ones, there is a greater predisposition to generate the target than if it were, for example, a set of ones and zeros produced by the flipping of a fair coin."

I have no idea what he's trying to say here. The entire chromosome is initialized randomly.

"In the search for the binding sites, the target sequences are fixed at the beginning of the search. "

No, they aren't.

~~ Paul
(2018-01-05, 12:30 AM)Paul C. Anagnostopoulos Wrote: [ -> ]"1. The perceptron structure. The perceptron structure is predisposed to generating strings of ones sprinkled by zeros or strings of zeros sprinkled by ones. Since the binding site target is mostly zeros with a few ones, there is a greater predisposition to generate the target than if it were, for example, a set of ones and zeros produced by the flipping of a fair coin."

I have no idea what he's trying to say here. The entire chromosome is initialized randomly.

This claim seems to be based on the idea that a "bias" is subtracted from a (potential) site's weight prior to comparing that site's weight with the threshold weight (thus predisposing it to not making the threshold, thus predisposing the process to evaluate fewer sites as binding than non-binding, which is one of the requirements for ultimate success). I see no evidence of such a bias parameter in either your code or the interface of the program though, but perhaps it was present in an earlier version of the program?
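
For what it's worth, the thresholded-recognizer scheme being described can be sketched as follows. This is a hypothetical illustration of the general idea only, not code from ev; the class, method, and parameter names are invented.

```java
// Hypothetical sketch of a thresholded site recognizer.
// Names are illustrative; this is not the ev implementation.
public class SiteRecognizer {
    // weights[pos][base]: one weight per sequence position and per base.
    // A site "binds" when its summed weight, minus the threshold
    // (the "bias"), comes out positive.
    static boolean isBindingSite(int[][] weights, int[] site, int threshold) {
        int sum = 0;
        for (int pos = 0; pos < site.length; pos++) {
            sum += weights[pos][site[pos]];
        }
        return sum - threshold > 0;
    }
}
```

On this reading, the paper's claim would be that the structure of such a recognizer biases which strings are easy for the search to match, independently of the evolutionary algorithm driving it.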

(2018-01-05, 12:30 AM)Paul C. Anagnostopoulos Wrote: [ -> ]"In the search for the binding sites, the target sequences are fixed at the beginning of the search. "

No, they aren't.

I think what they're referring to here is the effect of line 115 in Simulator.java:

Code:
   determineBindingSites();
(2018-01-05, 10:21 AM)Laird Wrote: [ -> ]This claim seems to be based on the idea that a "bias" is subtracted from a (potential) site's weight prior to comparing that site's weight with the threshold weight (thus predisposing it to not making the threshold, thus predisposing the process to evaluate fewer sites as binding than non-binding, which is one of the requirements for ultimate success). I see no evidence of such a bias parameter in either your code or the interface of the program though, but perhaps it was present in an earlier version of the program?

Actually I think I misinterpreted what they mean by bias: I think what they mean by "bias" is the threshold weight. They put it in terms of subtracting the "bias" (threshold weight) from the (potential) site's weight, and then checking whether the result is positive. I'm not sure I understand their argument after all, but if you want to examine it yourself, it's between pages 3 and 4 of their paper.
(2017-12-09, 10:03 PM)Paul C. Anagnostopoulos Wrote: [ -> ]If consciousness is an epiphenomenon, then how is it that we are talking about it?

I think people who believe it's an epiphenomenon haven't worked out the ramifications.

~~ Paul

Hi Paul,

I had a question about this, so I started a new thread under "Consciousness" for anyone who may be able to help me.

http://psiencequest.net/forums/thread-co...phenomenon

Linda
Laird Wrote:This claim seems to be based on the idea that a "bias" is subtracted from a (potential) site's weight prior to comparing that site's weight with the threshold weight (thus predisposing it to not making the threshold, thus predisposing the process to evaluate fewer sites as binding than non-binding, which is one of the requirements for ultimate success). I see no evidence of such a bias parameter in either your code or the interface of the program though, but perhaps it was present in an earlier version of the program?
I don't think so. I don't remember implementing and then removing any such parameter, though it has been a long time.

I'll check out their argument in the paper.

Quote:I think what they're referring to here is the effect of line 115 in Simulator.java:

        determineBindingSites();
Yes, the positions of the binding sites are predetermined. But their initial sequences are random.
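
That distinction (fixed positions, random contents) can be sketched like this. It is a hypothetical illustration only; the names and position values are invented, not taken from Simulator.java.

```java
// Hypothetical sketch: site positions fixed up front, base contents random.
// Names and values are illustrative, not from the ev source.
import java.util.Random;

public class ChromosomeInit {
    // The binding-site *positions* are chosen before the search begins
    // (cf. the determineBindingSites() call in Simulator.java)...
    static final int[] SITE_POSITIONS = {12, 47, 103}; // invented values

    // ...but the bases at those positions start out just as random as
    // every other position on the chromosome.
    static int[] randomChromosome(int length, long seed) {
        Random rng = new Random(seed);
        int[] bases = new int[length];
        for (int i = 0; i < length; i++) {
            bases[i] = rng.nextInt(4); // 0..3 standing in for A, C, G, T
        }
        return bases;
    }
}
```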

The simulation could be modified to make the pressures even more abstract, but the point was to show that certain predicted amounts of information evolved.

The arguments about this center around what it means for information in the environment to transmute into information in the genome. Clearly there have to be environmental pressures to make anything interesting happen. Any pressure is, in some sense, "rigging" the simulation. But that's the way it works in nature, too.

~~ Paul