Paper #6 

Paper: Bradley

Respondent: Van Till

Discussion

Bradley: Dr. Van Till took exception in his response to my saying, "While in principle this may not be impossible, necessary energetic pathways are apparently not readily available." He thinks the "apparently" is simply arguing from ignorance. I would say that's not arguing from ignorance. It is arguing from 30 years of unsuccessful experimentation in trying to make molecular building blocks without starting from energy-rich precursors. I was trying to be charitable in saying that apparently there aren't any energetic pathways, because in 30 years we haven't found any successful end product.

The second comment has to do with what I added to my presentation concerning the significance of energy flow through the system. I was going pretty fast, and possibly I was misunderstood, so I would like to put the overhead up again to help clarify. Wicken has said that energy flow through the system really helps to drive things forward. From this overhead I made a very specific point. If I have energy-rich compounds to begin with, so that my source of energy to drive the reaction forward is chemical energy rather than photons from the sun, then I can make building blocks without any problem. The energy released to the surroundings, the associated change in the dI energy term, ends up being more than adequate to drive the reaction forward to form more complicated molecular structures. On the other hand, if I start with energy-lean compounds and I don't have available chemical energy, so that I'm going to depend on energy from the sun, then I'm up the creek without a paddle. That was the point I was trying to make. And again, that is not a point about what is apparently the case; it is 30 years of experimental work that shows that to be the case.
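A minimal sketch of that free-energy bookkeeping, written in standard thermodynamic notation rather than taken from the overhead itself, with the split between thermal and configurational entropy assumed from the later discussion:

```latex
\Delta G \;=\; \Delta H \;-\; T\,\Delta S_{\mathrm{thermal}} \;-\; T\,\Delta S_{\mathrm{config}}
% Energy-rich precursors: the chemical energy released makes \Delta H large and
% negative, so \Delta G < 0 and the reaction runs forward on its own.
% Energy-lean precursors: \Delta H \gtrsim 0, and because the configurational
% term amounts to at most a few kT per bond, it cannot pull \Delta G negative;
% the energy would have to be coupled in from outside (photons or other flow),
% which is the coupling Bradley says has not been demonstrated experimentally.
```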

Wicken goes on to claim what I believe is false, that more complicated molecular structures can be made using configurational entropy as the driving force. It is like trying to put out a raging fire with a water pistol. There simply isn't enough driving force to make more than trivial yields of any more complicated molecular structures. I believe that Wicken's claim that this is a "highly creative," very powerful driving force is simply nonsense. Now you've got two choices: either you have chemical energy or you have some way of coupling energy flow through the system. At this point we have 30 years of unsuccessful experiments to show that coupling energy flow, whether photons or other energy, to the system simply doesn't appear to work. I say "appear" because we haven't seen it done naturally in 5 billion years and we have no good experimental results in 30 years. The only way we get good results is to use energy-rich starting compounds. That's fine when you are beginning, because you can begin with methane, ammonia, and hydrogen, and you can make simple building blocks. But as soon as you have made the simple building blocks, you are in serious trouble even in trying to get polymerization to occur.

You are trying to drive energetically very unfavorable reactions to form more complicated molecular structures, and that ends up being very difficult. The root problem, however, is sequencing. It's in the area of sequencing that I believe Wicken does a lot of hand-waving. He basically says we are not going to build component by component until we get a molecule, and then hook it up with another one; we are going to have some organismic way of growing these things that allows them to be selected based on how they are able to use the available energy. I believe that's all eyewash, because you've got to have some minimal level of complexity to have any kind of biological function, and to be able to benefit from the sort of selection that we are talking about. It's at that point that I believe Wicken makes the same mistake that other people make in simply assuming that we can jump up to some functional level of biological behavior that can energetically interact with the environment. Until you get to that level, this whole business of selection based on function doesn't have any meaning. That's a very big first step, whether you're talking about a single protein or a group of them acting in concert. To me this doesn't make the problem simpler; it makes it worse. That's essentially the paradigm that he offers as a substitute for the one we now have. I don't find that to be progress in any real sense of the word. It's an interesting alternative, but I don't believe it's necessarily a more likely one.

Mills: To me that was a very fine presentation. There are a couple of points I want to make. I know he and Dr. Van Till know this, but it might not be known to everyone else. When we are talking about these thermodynamic properties, we are not talking about factors that relate to rates; that is where enzymes come in, and the discussion will go on from there, at least minimally, in regard to rates. So when something is thermodynamically possible, that doesn't mean it necessarily proceeds at an appreciable rate.

One question I would like to ask Walter (Bradley): he used the illustration of hydrogen and nitrogen combining to form ammonia. What about, for example, starting with glycine and making a polymer of glycine, where you have the formation of a peptide bond? Do the calculations come out the same, as I presume you inferred they would? Under those circumstances, is the entropy factor that Wicken is talking about just as trivial compared to the thermodynamic factor, which in that case would be endothermic unless you activate the glycine?

Bradley: Where you are trying to overcome what amounts to an energy barrier rather than an energy driving force, with the configurational entropy as your primary means, it turns out you get extremely small yields, and that's basically the problem. It's analogous to getting water to run downhill: if the wind is blowing in that direction, does it help? Yes, a little. Now, if I have the situation I have in most condensation polymerization reactions, where energetically I'm having to go uphill, then I've got a much bigger problem, because what I'm saying now is that the configurational entropy driving force is like wind trying to blow the water uphill. It turns out it is not very effective in doing that. To put it another way, we get extremely small yields from that driving force. It is just very, very impotent, and it sort of bombs out at an extremely low yield level.
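A rough numerical illustration of that point, using an assumed free-energy cost of about +3 kcal/mol per peptide bond in water (a commonly cited ballpark, not a figure from the discussion):

```python
# Sketch of why an energetically uphill condensation gives only trivial
# equilibrium yields, and why the yield collapses further with chain length.
import math

R = 1.987e-3         # gas constant, kcal/(mol*K)
T = 298.0            # room temperature, K
dG_per_bond = 3.0    # assumed free-energy cost per peptide bond in water, kcal/mol

K_eq = math.exp(-dG_per_bond / (R * T))        # equilibrium constant for forming one bond
print(f"K_eq for a single bond ~ {K_eq:.4f}")  # ~0.006: equilibrium favors hydrolysis

# Each additional bond multiplies the equilibrium yield down again,
# so an n-residue chain is suppressed roughly as K_eq**(n-1).
for n in (2, 10, 100):
    print(f"equilibrium fraction of a {n}-mer ~ {K_eq ** (n - 1):.1e}")
```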

Mills: Which is why, as you will hear in the talk about the translation system that I am going to give later, you have to have some sort of activation of the carboxyl group of the amino acid before you can get polymerization. In that case you use ATP to form an amino acid adenylate first, so you have an anhydride bond, an activated form, and then your reaction is thermodynamically possible.

Bradley: In living systems it's probably fair to point out, and I believe that's what you're referring to, Gordon (Mills), that none of the successful reactions is driven by configurational entropy. They're all basically driven by coupling them to energy-rich reactions, which makes the energetic driving force the one that controls the show. And the question is, can you take some very minuscule driving force, which Wicken erroneously calls "highly creative," and make highly complex molecular structures? Well, it is true that if you start with a very large number of simple monomers you can probably get some degree of polymerization based on statistics; that's what we are talking about, a statistical driving force working against an energy gradient that goes uphill. What you end up with are extremely trivial yields. That's what Wicken refers to as a creative driving force for the development of complicated molecules. I guess beauty is in the eye of the beholder, but to me that is like spitting into the wind. I don't consider that any sort of a successful driving force at all. At this point we simply don't have experimental confirmation that you can get anywhere with that. But the calculations are straightforward enough that you don't even need the experiments: you can calculate the kind of yields you get, and the yields are so small that you can't possibly hope to make anything with that kind of end product, mixed in with a lot of other junk that's going to have no biological use, and hope to assemble things into a useful functioning system, whether you are talking about an individual molecule or a small group of molecules. He only complicates the problem by arguing that you've got to have a small group of complicated molecules that function together in some coordinated way before you can begin to benefit from the selection based on function which he claims essentially provides the information. The problem is that you can't get that to work until you first jump up a pretty good step function in information. That's the fundamental problem. Once you have a certain functioning system, then selection becomes a factor. He oversimplifies the difficulty of getting there.

Wilcox: One thing you've got to have in a system that selects is the ability to duplicate the molecule, with error or without error. You can't select if you can't duplicate. So that's got to be the minimum. You've got to get to the point where molecules can be replicated before you can select anything.

Bradley: I agree with that. But I add that if all they can do is replicate and there is no differentiation in terms of function then there's still no basis for selection, so you need both.

Wilcox: That's true. Yes.

Bradley: If you don't have one that, for example, manages to harness energy and use it more efficiently than another, then even if you can replicate all of them, they're all equally crummy, and that also ends up being a kind of meaningless start.

Wilcox: Would you comment on Eigen's idea of RNA enzymes as having the potential for both?

Bradley: I had lunch with Eigen at the Gordon Conference a year ago (1987), last summer in August. I might add that Wicken thinks Eigen's ideas aren't very workable either. But the fundamental problem with Eigen's ideas is that you're starting with a very complicated system to begin with. If you look at his hypercycles, they are very complicated, and Eigen agreed, as we visited at lunch, that that's not a good origin of life model at all. It starts with too much complexity to begin with. It is a model of how a hypothetical first system might progress, once in place. But he was more than willing to agree that it's a very complicated first step and probably can't realistically be considered an origin of life model anyhow. Wicken has a lot of interesting and, I believe, very worthwhile comments in his book on Eigen's model, to the effect that we don't see anything that would suggest it was an intermediate pathway at some point in time. So there are other basic biological reasons for doubting it, but the fundamental reason to say it is not a good origin of life model is that it is simply very complicated to begin with. We can't even make RNA under prebiotic conditions right now. We can't make all the different associated molecular machinery that his hypercycle requires, and by his own admission it is not a good origin of life model. There doesn't appear to be anything that functions that way in living systems today.

Thaxton: Bradley mentioned that he was basing his conclusions not on ignorance but on 30 years of experiments. I am sometimes asked to comment on all the progress in the origin of life field. There is an illusion here, an artifact of the greater precision of analytical techniques for measuring smaller and smaller quantities in the reactions that have been studied over the past 30 years. As the analytical techniques get better at measuring smaller and smaller quantities, we get new papers telling us what was in reactions performed 30 years ago. Look at Miller's early reports of his famous experiment: he reported what was found with the techniques available back in the 1950s, and he is still analyzing the same system with greater precision using better techniques. So that must be considered as well.

Some have talked about the possibility of converting from one type of entropy to another. That is related, in a way, to the same kind of question about converting information from one type to another. Prof. Orgel, in his book The Origins of Life, talks about information transformation. I know about energy transformation, heat to electricity, and so on, so I wrote to Prof. Orgel to ask what he meant. I said that the notion of information transformation sounds as if I could listen to the weather report and get the stock averages for tomorrow. He wrote back and said, well, really he was trying to develop an acronym; he simply needed something beginning with the letter T to go with information, and transformation is what he used. (Laughter)

So if you read Orgel's papers and run across this idea of information transformation, he doesn't really believe you can get tomorrow's stock averages.

As for the idea of information as we have discussed it so far, there is an ambiguity in our terminology; we are still unclear about it. So I hope that as we progress through this conference we might gain some clarity and be able to cross-fertilize more effectively among the various disciplines represented here.

Rust: In your condensation reaction A + B --> D, you took 50 molecules of each and got a result of k ln 25. How about taking just one of each? Did you do the calculation?

Bradley: No, I didn't, but what you'll find is that the configurational entropy driving force, on a statistical basis, goes up as the total number of molecules goes up, because it is basically a statistical factor. If I have a larger system, then the chance of getting at least one C to form from all these A's and B's goes up, even if it is energetically unfavorable.

Rust: So in fact it's just a question of the tail of a probability distribution?

Bradley: Yes, it would be like a tail on a distribution. Probably the best way to look at it, though, is this: if you describe the number of ways you can arrange the system, and then differentiate that with respect to the extent of reaction, you find the slope at the ends is infinite, and as you get into more reasonable amounts of product it flattens out very quickly. Where it flattens out, and what becomes your optimal or equilibrium end product, depends very much on whether the energetic driving force is for you or against you. If it's even slightly against you, which is what you have in condensation polymerization reactions, then the equilibrium end product ends up being extremely small. That's why I'm saying it is an impotent driving force. If you have 20 million molecules of A and 20 million molecules of B and you ask what the chance is of two of those reacting to form C, then no matter what the energetics, the chance of getting one is pretty good. The problem is that if you've got one, or ten, or some small number in a big sea of this other stuff, what are the chances of them combining with yet something else, and something else again? At each point the yield becomes its own kind of problem in trying to make the next step in the reaction if you've got a whole series of these. What you end up with is such a trivial yield that to call this a highly creative way of making more complex molecules is, I believe, out of touch with reality. I don't know how else to put it. If you do the calculation, it's ridiculous; that's the only way I can say it.
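A small numerical sketch of that picture, again with an assumed uphill cost of about +3 kcal/mol per bond and the extent of reaction treated as a simple mixing fraction (my illustration, not the calculation from the overhead):

```python
# The configurational (mixing) entropy per pair behaves roughly like
#   S(x) = -k * [x*ln(x) + (1-x)*ln(1-x)],
# whose slope diverges at x = 0 (so a little product always forms) but flattens
# quickly, so an opposing energy term pins the equilibrium extent at a tiny value.
import math

k = 1.380649e-23                        # Boltzmann constant, J/K
T = 298.0                               # K
dH_per_bond = 3.0 * 4184 / 6.022e23     # assumed +3 kcal/mol cost, converted to J per bond

def free_energy(x):
    """Rough free energy per reacting pair at extent of reaction x (0 < x < 1)."""
    s_mix = -k * (x * math.log(x) + (1 - x) * math.log(1 - x))
    return x * dH_per_bond - T * s_mix

# Scan for the minimum, i.e. the equilibrium extent of reaction.
xs = [i / 1e5 for i in range(1, 100_000)]
x_eq = min(xs, key=free_energy)
print(f"equilibrium extent of reaction ~ {x_eq:.1e}")   # well under one percent
```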

Rust: Just a detail in your equation, which you corrected. You still need water to have an oxygen source.

Bradley: Yes, okay. I was just putting out some energy-rich compounds. I was negligent in the details. Thanks.

Wright: I just want to comment on this concept of the trivial yield that people are mentioning. As I think about it, a 0.0001% yield, at least in current biological systems, is a very significant yield. In an aquatic system, if you started with one mole of your substance, that would be a yield of 100 micromoles, which would be very significant. Living organisms, in particular bacteria, have a tremendous capacity to exist on very low concentrations of substrate. At least in terms of present-day organisms, a yield like that would be quite significant. If there were, in the prebiotic stage of things, some way to concentrate that sort of low yield, take it out of the system, adsorb it onto a surface, or something like that, then the reaction would continue to go on and continue to produce even at a very low yield. If you continued to produce these more complex compounds, then you do have a system which is creative.

Bradley: My only comment on that is that living systems are different in that they have catalysts which allow you to live with very low concentrations. In the kind of system we are talking about, the problem is the real amino acid system. The Miller-type experiments produce lots and lots of compounds with a very, very low yield of biologically significant amino acids, and it happens that these amino acids are much more reactive with many of the other compounds than they are with each other. So when you say you are going to get a yield like one in a million, and that one wants to react with the other 999,999 more than it wants to react with, or associate with, another of its own kind, then you have got a very serious problem, particularly when there is no catalyst to guarantee the right kind of connection. So I believe any comparison of living systems with prebiotic systems, where we don't have catalysts, is sort of irrelevant. It's the catalyst we are trying to make. That is particularly so when we are talking about an environment with many other chemicals it would far rather react with than the ones it ought to be reacting with. When Sidney Fox runs his experiments, you notice he never begins with the end product of a Miller-type experiment. He begins only with amino acids, and he does that for a very good reason: amino acids are much more reactive with other things than they are with each other. To say that's a very significant yield, given the environment it's in and given that you don't have any catalyst to force it to react with the very low yield of some other needed compound, well, I don't believe it is a significant point by comparison.
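A toy calculation of that one-in-a-million point, with assumed numbers rather than anything measured from a Miller-type product mixture:

```python
# If the needed monomer is rare in the mix and the surrounding "junk" is at least
# as reactive as the right partner, the chance of a run of correct couplings
# collapses very quickly.
import math

target_fraction = 1e-6       # assumed: the desired monomer is one part per million of the mix
relative_reactivity = 1.0    # assumed: junk reacts at least as readily as the right partner

# Probability that a given coupling event grabs the right partner rather than junk.
p_correct = target_fraction / (target_fraction + (1 - target_fraction) * relative_reactivity)
log10_p = math.log10(p_correct)

for n_bonds in (1, 10, 100):
    print(f"{n_bonds} correct coupling(s) in a row: ~1e{n_bonds * log10_p:.0f}")
```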

Thaxton: This problem of separation, of course, is where experimental chemists earn their salary. Probably the most difficult thing for any experimental chemist is the fact that the side reactions and by-products are generally produced in far greater abundance than what you are trying to make, so you have to get rid of those by-products to be able to find anything meaningful. It is a very difficult problem, and it is the reason you don't find these kinds of more realistic experiments on how to produce life reported in the origin of life literature. A couple of people have proposed them; I know David Usher at Cornell has written to the effect that we need to simulate a real prebiotic pond, where we allow all the different chemicals to exist together. Of course, the reason it has not been done is not lack of funding; it is the difficulty of doing anything meaningful.

Bradley: It's the ability to predict the result and know it's not going to be what you want.

Buell: I'd like to ask a question to relate this discussion back to the larger theme of the conference. What is there in the empirical world that exhibits both high complexity, or to use the term, configurational entropy, and functional integrity, and that we ascribe to natural cause, or non-intelligent cause, DNA aside?

Thaxton: Proteins.

Buell: Life aside.

Bradley: Computers are a half example, because they have a high degree of configurational entropy, but we certainly don't ascribe them to natural processes. There are lots of examples you can think of that function only because of their high degree of configurational entropy, but they are all machines of man. There is a high degree of correlation between how efficiently machines can process energy and how complex they are.

Thaxton: I would point out that the thing that is always neglected in the origin of life literature is that here we are trying to explain how something happened without the role of an intelligence being involved at all. And then we have the simulation experiments that are supposedly mimicking what nature did without intelligence. Yet the difficulty is, how do you really get clever enough to eliminate the role of intelligence from your experimental system? That's really the trick. The more realistic the system, the more general your results. But as you introduce constraints or restrictions on the system, you lose that generality, and the only way that has been done is to take a few of the reactants out of the system, or to change the temperature at the right time, and so on. These are all changes that the investigator makes. The problem is how to realistically mimic a primitive earth situation. That's the practical problem, yet experimentalists continue to do this. If you read this literature with care you will notice that we don't have realistic criteria for how to evaluate and restrict this role of the investigator. At origin of life conferences the subject occasionally comes up and participants will talk about it. I know of no other discipline of experimental science where the criteria for a successful experiment are not nailed down. This is a very disturbing thing in the origin of life literature. I'm hoping that one of these days we'll get more realistic about that and nail down the criteria for successful experiments. Until we do, I believe we are going to continue to have the same kind of confusion about these experimental results as we have been having here at this conference concerning what we mean by information. It is just very difficult until you nail things down.

Maybe we need a whole team of philosophers to come in here to help us nail down some of this.

Nelson: Charlie, that would only make matters much worse, believe me.

(Laughter)

Thaxton: Which is the reason you were left until Sunday.

(Laughter)

Van Till: I'm still puzzled, Walter (Bradley), about a statement made toward the bottom of page 7. As I read it, it sounds as if you are saying that any reshuffling of the components of a biopolymer ends up in a net loss of information. You can't be intending to say that, yet that is what I am getting out of that paragraph. Can you clarify this business of coding and reshuffling?

Bradley: In Wicken's book he basically says, and I agree with this, that there is very little, if any, energy differentiation between sequences. The way you distribute the thermal energy is also unaffected by the sequence. So if you take his original equation for what has to happen, all you are left with is the term that has to do with configurational entropy. Now the point is, the sense in which you have to do configurational entropy work is that in a natural chemical reaction you could get any one of many different molecules, let's say 10^61 if we are making something like cytochrome c, and only one of those would actually work in practice. That number is hypothetical; there may be more that work in practice, but the point is that I've got a real one once I've got a sequence. And I'm doing this artificially in two steps. I'm saying the first problem is to get the monomers to connect, and then the second problem is to get them to have the right sequence. In practice you wouldn't have that in two steps; you would really have it in one step. They would polymerize right the first time; there's no real reshuffling after they polymerize. To make the calculation it is more convenient to divide it into two steps, the first of which asks what the energetics are of getting it to polymerize at all, and the second of which asks what the additional energetics are, if any, having to do with getting the right sequence. But you're right, in the sense that if I really did this in practice it would all be part of one single step. You can break it into two steps and then simply take the dIe, dIth, and dIc terms as sums from those two equations, because in practice it would happen in one step. So in reality you don't have the problem you've raised; it is a computational convenience. Wicken separates those, and I did also in our book, The Mystery of Life's Origin.
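A minimal sketch of that two-step bookkeeping, reading the transcript's dIe, dIth, and dIc terms as energetic, thermal-entropy, and configurational-entropy contributions (my interpretation of the notation), and using the ~10^61 figure cited above for cytochrome c:

```latex
% Two-step decomposition (a computational convenience, not two physical steps):
\Delta I_{\text{total}}
  = \bigl(\Delta I_e + \Delta I_{th} + \Delta I_c\bigr)_{\text{polymerization}}
  + \bigl(\Delta I_e + \Delta I_{th} + \Delta I_c\bigr)_{\text{sequencing}}

% With essentially no energy or thermal-entropy difference between sequences,
% the sequencing step reduces to a configurational-entropy term of order
T\,\Delta S_c \;\approx\; kT \ln \Omega \;\approx\; kT \ln 10^{61} \;\approx\; 140\,kT ,
% i.e. the work of specifying one working sequence out of roughly 10^{61} alternatives.
```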