Deep inside your cells, the chemistry of life is hard at work to make the raw materials and channel the energy required for growth, maintenance, and reproduction. Few systems are as intricate or as mysterious. For this reason, how a cell does what it does remains a frontier for research — and, consequently, theory often grows unchecked by solid data. Most of what we know about the enzymatic processes of plant and animal metabolisms comes from test tube experiments, not studies in the context of a living organism. How much has this necessarily reductionist approach misled us, and what changes when we zoom out and think about life’s manufacturing and distribution in situ?
Welcome to COMPLEXITY, the official podcast of the Santa Fe Institute. I’m your host, Michael Garfield, and every other week we’ll bring you with us for far-ranging conversations with our worldwide network of rigorous researchers developing new frameworks to explain the deepest mysteries of the universe.
This week we open a two-part discussion with ecologist Mark Ritchie of Syracuse University on how he and his SFI collaborators are starting to rethink the intersections of thermodynamics and biology to better fit our scientific models to the patterns we observe in nature. Beginning with his history of research into biodiversity, environmental science, and plant-herbivore dynamics, this conversation leads us to his latest work on photosynthesis and scaling laws in cells — an inquiry with potent implications that reach far beyond the microscopic realm, to economics and the future of sustainability.
Subscribe to stay tuned for Part Two, in which we travel even deeper into how Mark’s work relates to other SFI research — and what his new perspectives may reveal about the nature of the complex crises faced by both human beings and the biosphere at large...
If you value our research and communication efforts, please rate and review us at Apple Podcasts, and/or consider making a donation at santafe.edu/podcastgive. You can find numerous other ways to engage with us at santafe.edu/engage. Thank you for listening!
Join our Facebook discussion group to meet like minds and talk about each episode.
Podcast theme music by Mitch Mignano.
Follow us on social media:
Twitter • YouTube • Facebook • Instagram • LinkedIn
Related Reading & Listening:
Ritchie Lab at Syracuse University | Mark’s Google Scholar Page | Mark’s soil ecology startup
Reaction and diffusion thermodynamics explain optimal temperatures of biochemical reactions by Mark Ritchie in Scientific Reports
Thermodynamics Of Far From Equilibrium Systems, Biochemistry, And Life In A Warming World [Mark Ritchie’s 2021 SFI Seminar + @SFIscience Twitter thread on Mark’s talk]
Complexity Podcast 17: Chris Kempes on The Physical Constraints on Life & Evolution
Complexity Podcast 35: Scaling Laws & Social Networks in The Time of COVID-19 with Geoffrey West
Mentioned in this episode:
Sidney Redner
Geoffrey West
John Harte
Pablo Marquet
Jennifer Dunne
Brian Arthur
Chris Kempes
Mark Ritchie (0s): Kind of my m.o. is, when somebody throws a question at me that doesn't have a very clear answer but is clearly important, my curiosity gets piqued. And so I don't really pay that much attention to the fact that I don't know much about it. I just start reading. So as I dug into the physiology, into the biochemistry involved in what causes plants to be good to eat or not, I discovered that our understanding of how temperature affects biochemistry is incomplete and not really well thought out. And not really, as we know now, in very much agreement with the data.
So that's what led me to literally start from the ground up, looking at enzyme kinetics and how that all might scale up to determine something like the nitrogen content of a plant leaf. And so in the process, I linked in with some of these ideas in complexity theory, which actually turned out to be tools that enabled me to set up the problem. And then I could find solutions that you might expect evolution to favor by solving for the state at which you get maximum entropy production.
Michael Garfield (1m 24s): Deep inside your cells, the chemistry of life is hard at work to make the raw materials and channel the energy required for growth, maintenance, and reproduction. Few systems are as intricate or as mysterious. For this reason, how a cell does what it does remains a frontier for research. And consequently, theory often grows unchecked by solid data. Most of what we know about the enzymatic processes of plant and animal metabolisms comes from test tube experiments, not studies in the context of a living organism.
How much has this necessarily reductionist approach misled us, and what changes when we zoom out and think about life's manufacturing and distribution in situ? Welcome to Complexity, the official podcast of the Santa Fe Institute. I'm your host, Michael Garfield, and every other week we'll bring you with us for far-ranging conversations with our worldwide network of rigorous researchers developing new frameworks to explain the deepest mysteries of the universe. This week we open a two-part discussion with ecologist Mark Ritchie of Syracuse University on how he and his SFI collaborators are starting to rethink the intersections of thermodynamics and biology to better fit our scientific models to the patterns we observe in nature. Beginning with his history of research into biodiversity, environmental science, and plant-herbivore dynamics, this conversation leads us to his latest work on photosynthesis and scaling laws in cells, an inquiry with potent implications that reach far beyond the microscopic realm to economics and the future of sustainability. Subscribe to Complexity to stay tuned for Part Two, in which we travel even deeper into how Mark's work relates to other SFI research, and what his new perspectives may reveal about the nature of the complex crises faced by both human beings and the biosphere at large. If you value our research and communication efforts, please rate and review us at Apple Podcasts, and/or consider making a donation at santafe.edu/podcastgive. You can find numerous other ways to engage with us at santafe.edu/engage. Thank you for listening.
Michael Garfield (3m 43s): All right, well, Mark Ritchie, it's a pleasure to have you on Complexity Podcast.
Mark Ritchie (3m 48s): Delighted to be here at SFI. It's always been a place I've loved. I've come here several times over the years, even did a short sabbatical here a really long time ago. But it'll definitely be fun to sit and talk about science and the world and all that good stuff.
Michael Garfield (4m 3s): I'd like to start where we typically start, which is on the ground, in the human: a little bit of biography. It would be good to know your story, like how you got into your life as a scientist, and specifically what animated your passion for the specific topics that you've decided to spend your life researching.
Mark Ritchie (4m 24s): That's pretty interesting. So I grew up in San Antonio, Texas in the seventies, and like many other reasonably good students who were interested in biology, the only career that anyone ever thought of was pre-med, health science type stuff. Most people had never even heard of ecology or environmental science at that time. So I started on that as an undergrad, and I actually started a year early. I skipped my senior year of high school. So I was taking some classes at the University of Texas at San Antonio, and one of the required classes was an ecology class.
So we had to go out and do labs. Like, we put paint on ants and used something called a mark-recapture method for estimating how many ants were in an ant nest. So after we'd done about four or five of these labs, I went into the professor and I said, "I think I found what I want to do. I want to be an ecologist." And he looked at me and he goes, "Nah, you don't want to be an ecologist. The best you can hope for is that, you know, you just kind of turn out to be a teacher. And it's nothing very exciting." And I'm like, well, I don't care, because I know I was born to do this.
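The mark-recapture idea Mark mentions can be sketched in a few lines. This is a minimal illustration using the Lincoln-Petersen estimator, a standard two-session version of the method; the ant counts below are invented for the example, not from the episode:

```python
def lincoln_petersen(marked_first, caught_second, marked_recaptured):
    """Estimate total population size from a two-session mark-recapture survey.

    marked_first: animals marked (painted) and released in session one
    caught_second: total animals caught in session two
    marked_recaptured: marked animals among the second catch
    """
    # Assumes a closed population, no lost marks, and equal catchability.
    return marked_first * caught_second / marked_recaptured

# Hypothetical numbers: paint 50 ants, later catch 60, of which 12 carry paint.
print(lincoln_petersen(50, 60, 12))  # 250.0
```

The logic: if 12 of 60 recaptured ants are marked, marked ants are about a fifth of the nest, so 50 marks imply roughly 250 ants.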
Because, I said, I've been spending my whole life thinking about all these kinds of questions that an ecologist asks. Then he looked at me, smiled, and said, "Okay, great, welcome to the team." And I started doing projects that were related to plants. So then I had the opportunity to finish my undergrad at Indiana University, where I took another class called population biology. And that's where I started falling in love with the idea of using mathematics in ecology. So I actually wrote two papers that semester on using math to look at succession in plants and stuff like that.
So then when I went to grad school, which was at the University of Michigan, I was working with professor Gary Belovsky, who was really into sort of connecting the idea of foraging behavior of animals to these larger phenomena, like population growth and community organization, which are all of these really tough questions that ecologists still haven't really figured out how to solve. So I did work on animal foraging, and then later, as a postdoc with Dave Tilman, I took that basis and scaled it up to looking at herbivore effects on plant communities, which was sort of a big hole in the Tilman research effort at that time.
And so after my postdoc, when I started at Utah State University, there were all these Western grazing issues of wildlife competing with cattle, and/or cattle destroying the landscape. And I was combining that with some of the work that I did at Cedar Creek, where we discovered that herbivores were having these big effects on nitrogen cycling, which in retrospect seems sort of obvious, but at the time people were just kind of discovering that. And so that's led me to always have a thread of herbivore-plant-soil interactions, looking at everything from nutrient cycling up to diet decision-making kinds of things.
And so then one of the questions that came up when I was a junior faculty member was, I was looking at coexistence of grasshopper species. So I was creating these cages with pairwise species combinations. And I noticed that if I did the experiment early in the season, when the vegetation was pretty uniformly good to eat, one of the species would wipe out the other one. But if I did the same experiment later in the season, when the plants had grown up and were much more sort of spatially complex, with leaves and stems and that kind of thing, the grasshopper species always coexisted.
Trying to understand why I would get these different experimental outcomes, to make a long story short, led me to start looking at the role of fractal geometry as a way of quantifying complexity in nature. That eventually came together after about eight years of various stops and starts, including the time that I spent at SFI for about six weeks in late spring, where it was just me and I could get up at 6:00 AM and I said, "Hey, I'm at SFI. I can think about anything."
Nothing is too crazy. So that eventually led to me writing a book, a Princeton monograph, where I sort of laid out the idea that you can quantify a lot of nature, as a first approximation, as a fractal distribution over a limited range of scales. And that turns out to be a really useful way of quantifying spatial heterogeneity, like how things are different in space, and then how organisms of different sizes that sample that heterogeneity encounter very different kinds of environments. So some are very much focused on the little details and others are only after big clusters of things.
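As a rough illustration of quantifying spatial pattern as "a fractal distribution over a limited range of scales," here is a hedged box-counting sketch. Nothing here is from Mark's monograph; the point pattern and scales are invented, and a real analysis would treat noise and scale limits much more carefully:

```python
import math

def box_count(points, box_size):
    """Count boxes of side box_size that contain at least one point."""
    occupied = {(math.floor(x / box_size), math.floor(y / box_size))
                for x, y in points}
    return len(occupied)

def fractal_dimension(points, sizes):
    """Least-squares slope of log(count) versus log(1/size)."""
    data = [(math.log(1.0 / s), math.log(box_count(points, s))) for s in sizes]
    n = len(data)
    mean_x = sum(x for x, _ in data) / n
    mean_y = sum(y for _, y in data) / n
    return (sum((x - mean_x) * (y - mean_y) for x, y in data)
            / sum((x - mean_x) ** 2 for x, _ in data))

# A dense grid of points fills the plane, so its estimated dimension is 2.
grid = [(i / 100.0, j / 100.0) for i in range(100) for j in range(100)]
print(round(fractal_dimension(grid, [0.5, 0.25, 0.125]), 2))  # 2.0
```

A patchy landscape would come out with a dimension between 1 and 2, which is the kind of number an organism of a given body size effectively "samples" differently.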
So that led to looking at biodiversity from this kind of bottom-up functional perspective of an individual moving through a spatially complex environment. Then later I started thinking about temperature, because people started asking me questions about, well, how does changing temperature affect these plant-herbivore interactions? And we know that one of the key things that mediates those interactions is how good the plants are to eat. So some plants have high nutrient content, high sugar content. Herbivores really like to eat them and literally eat almost all of whatever leaves and stuff are available.
And other plants are not very good to eat: they have chemicals, or they're really tough, or they have thorns or something like that. And so the idea was, how would temperature affect whether a lot of the plants that were available would be eaten or just avoided? So I started digging around. I guess this is kind of my m.o.: when somebody throws a question at me that doesn't have a very clear answer but is clearly important, then my curiosity gets piqued. And so I don't really pay that much attention to the fact that I don't know much about it from the start.
I just start reading. So as I dug into the physiology, into the biochemistry involved in what causes plants to be good to eat or not, I discovered that our understanding of how temperature affects biochemistry is, well, just to put a fine point on it, as I said in my talk, sort of incomplete and not really well thought out, and not really, as we know now, in very much agreement with the data. So that's what led me to literally start from the ground up, looking at enzyme kinetics and how that all might scale up to determine something like the nitrogen content of a plant leaf.
And so in the process, then again, I linked in with some of these ideas in complexity theory, such as thermodynamics and entropy production and those kinds of things, which actually turned out to be tools that enabled me to set up the problem. And then I could find solutions that you might expect evolution to favor by solving for the state at which you get maximum entropy production. And so from that, we generated a bunch of interesting predictions about how different things should respond to temperature that seemed to agree with data initially.
And so that's kind of where I've ended up on that question. But I think, hopefully, with all of this storytelling, you can see that I'm always intrigued by questions and problems and not really stuck on any particular system. That's just what gets me out of bed in the morning. This is the kind of problem that I'll wake up at three in the morning thinking about, wheels turning. And that's what excites me about science. It's not about how many grants you get or how many grad students you have. It's about thinking about really hard and deep problems and trying to figure out some way to solve them.
Michael Garfield (11m 57s): There's an enormous span there, and I hope we can wind back over it, at least most of it. You gave us a really clear, well, not a linear path per se, but at least you mapped the random walk that you took from topic to topic. And I gotta say, it rhymes a lot with the conversation that I had recently with Sid Redner on the show, where he talked about studying all of these kind of mundane problems, like how do you beat your kid at a game of war, or where do you find the best parking spot. And that's why I like asking these questions at the opening of the show, because something I would love to hear more of from science communication broadly is how the hypothesis is even selected.
And then how the domain of interest within which you generate a hypothesis is selected. That's something that's oddly ignored in first-pass education on scientific methodology, I feel. So at any rate, you gave a talk at SFI recently on this last subject that you just mentioned, on the relationship between temperature and enzyme activity. And I would like to dig in with that one a little bit, because I feel like through there we can cast out some tendrils and connect it to some other ideas in complex systems science.
And you can tell me what you think sticks and what doesn't, as far as how we research the things that are eminently important, like cattle grazing or global food production under the conditions of global warming. This is something I think a lot of people will find deeply relevant to their continued existence. Let's start with the way that biochemistry has traditionally described this system and this process, and then why that doesn't make sense.
Mark Ritchie (13m 44s): Okay, well, if you just Google biochemistry and thermodynamics, and you look at almost any basic biochemistry textbook, and you look at the several hundreds of papers that measure what are called thermodynamic properties of enzymes, almost all of it is directed at whether the reaction that the enzyme catalyzes goes forward readily, or whether it's more difficult for it to happen. And so you can essentially measure that as a change in heat that happens.
So we talk about something called an exothermic reaction, one that generates heat as it proceeds. Any process that does that counts: you start with two molecules, or you start with an enzyme and a molecule, and the enzyme breaks the molecule into two pieces and generates heat. Well, from the very basic definitions of entropy, that's an increase in entropy in that local area. So most of biochemistry is designed to say, if I do this reaction, is it generating a lot of heat or is it not?
And if it is, then it's usually a favorable reaction that tends to happen without a whole lot of trouble. And even the most basic respiration reaction, the so-called burning of glucose, is really just a series of steps, chopping pieces off of a larger molecule, that releases heat or allows the formation of ATP, which then later has phosphates chopped off of it to release heat and do work where the cell wants it. So this whole issue about whether reactions are favorable or unfavorable, and just how favorable or unfavorable they are, is kind of really where thermodynamics in biochemistry has been for five decades or so.
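The favorability bookkeeping Mark is describing is conventionally summarized by the Gibbs free energy, ΔG = ΔH − TΔS, where a negative ΔG marks a reaction that tends to go forward. A toy sketch, with enthalpy and entropy numbers invented purely for illustration, not measured values:

```python
def gibbs_free_energy(delta_h, delta_s, temp_k):
    """Delta G (kJ/mol) from enthalpy (kJ/mol), entropy (kJ/mol/K), temperature (K)."""
    return delta_h - temp_k * delta_s

def is_favorable(delta_h, delta_s, temp_k):
    """A negative delta G means the forward reaction is thermodynamically favored."""
    return gibbs_free_energy(delta_h, delta_s, temp_k) < 0

# An exothermic, entropy-increasing reaction is favorable: -20 - 310*0.05 = -35.5.
print(is_favorable(-20.0, 0.05, 310.0))  # True
# An endothermic reaction can still become favorable once T*delta_S wins out.
print(is_favorable(10.0, 0.05, 150.0))  # False
print(is_favorable(10.0, 0.05, 300.0))  # True
```

This is exactly the test-tube-level question Mark says the field has focused on; his point below is that it says nothing about what the whole cell does with the entropy being generated.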
And that goes even back to the thirties, when people started doing all of this. And then there was kind of this cottage industry of taking a different enzyme, or taking an enzyme from a different organism that does the same function as the one in a bacterium like E. coli, a standard lab organism, and comparing: is this one more or less likely to go forward, and at what point does the enzyme stop working? So I think where I was making progress on the problem was that I wasn't really asking the question of whether the reaction goes forward or not.
What I was asking is, I'm sitting here looking at it from the whole cell's perspective, and a cell doesn't really want to increase entropy. The cell, from an anthropomorphic point of view, wants to keep its organization; it doesn't want to fall apart. And we all know that cells don't just burn energy and then stop. They're continually taking in materials, doing work, getting rid of materials they can't use, and dissipating heat to the environment. And all of that is going on while the cell continues to exist.
So this is where I started saying that what we really want to know is not simply whether or not a reaction goes forward, because it's easy for it to generate entropy. The issue is, what is the cell doing with all those things that are increasing its internal entropy? And we know that for it to continue to exist, to be organized, it must increase the entropy of its surroundings rather than inside the cell. So that led me to start formulating things with what are called reaction-diffusion dynamics: you have stuff coming in, work that gets done, and stuff that goes out.
So when I set the problem up that way, I looked at it and I said, this is going to be really boring if the rate at which stuff moves has the same temperature sensitivity as the rate at which work is getting done. So the very first thing I did, before I even started playing with the math, was ask, well, is that really a correct assumption or not? It was a question I asked because I was looking at the math, but it was a question that didn't have a ready answer. So I literally had to do a literature review of basically how temperature-sensitive diffusion or transport-type processes are, and then compare those to what I call product formation.
So an enzyme takes a molecule and does something with it, produces a product, usually with heat. And the question is, how temperature-sensitive is product formation relative to moving things? When I did the literature review, I found that in fact product formation is somewhere around two to three times as sensitive as movement. So that means if you increase temperature, diffusion might double, whereas product formation increases by a factor of four to six. So if I put those differential temperature sensitivities into my reaction-diffusion framework, so that as I increase temperature I'm accounting for how I'm affecting the process of moving things into and out of the cell as opposed to the amount of work that gets done, then all of a sudden all kinds of interesting things start happening.
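One way to see why those differential sensitivities matter is a toy steady-state model. To be clear, this is my own caricature, not Mark's actual model: forward catalysis, the reverse reaction, and diffusive export each get an Arrhenius-style temperature dependence with invented activation energies (chosen so reaction rates climb much faster with temperature than transport does), and the reverse reaction is fed by the product pool that diffusion fails to clear. A temperature optimum then emerges with no enzyme denaturation at all:

```python
import math

R = 8.314e-3   # molar gas constant, kJ/mol/K
T_REF = 293.0  # reference temperature, K (20 C)

def arrhenius(k_ref, e_act, temp_k):
    """Rescale a rate constant from T_REF using activation energy e_act (kJ/mol)."""
    return k_ref * math.exp(-e_act / R * (1.0 / temp_k - 1.0 / T_REF))

def net_rate(temp_k):
    """Steady-state net forward flux when product export is diffusion-limited.

    Export balances production (k_d * P = J), and the reverse reaction grows
    with the standing product pool P, which gives J = k_f / (1 + k_r / k_d).
    """
    k_f = arrhenius(1.0, 60.0, temp_k)   # forward catalysis (hypothetical Ea)
    k_r = arrhenius(0.5, 120.0, temp_k)  # reverse reaction (hypothetical Ea)
    k_d = arrhenius(1.0, 50.0, temp_k)   # diffusive export (hypothetical Ea)
    return k_f / (1.0 + k_r / k_d)

# The flux peaks at an intermediate temperature and then declines, with no
# assumption that the enzyme itself unfolds or degrades.
best = max(range(280, 340), key=net_rate)
print(best)  # an optimum around 320 K (roughly 47 C)
```

The shape, a rise to an optimum followed by decline as the reverse reaction overtakes the forward one, is the qualitative behavior Mark describes below; the particular numbers are placeholders.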
And in classical biochemistry, they don't really couple the movement of things to the characteristics of doing work. The other thing is that most biochemists, when they study proteins, study them in a test tube. They don't actually study what they do inside the cell, because it's really, really hard to do. In fact, a colleague of mine at Syracuse specializes in finding ways to see the really ridiculously hard things that cells do; that is the cutting edge: how do I see this stuff going on?
But when they study the movements of enzymes in the cell, they find that there are all kinds of strange things going on. People have discovered, for example, that cells are really not liquid inside; they're more like a gel. And so all of this means that with everything we know about biochemistry in a test tube, it's not that what we've learned is useless, but there are just more things affecting how the biochemistry proceeds. So even in the simple reaction-diffusion framework, which is way, way, way too simple as an actual description of what's going on in the cell, you get this problem that as you heat up the system, it can't dissipate products and heat fast enough to keep up with the rate at which heat is being generated inside the cell.
And so you can recast that problem as one of entropy: what's happening to entropy inside the cell, and what's happening to entropy outside? When I did the calculations, it turns out that under a wide range of conditions that apply to real cells and real biochemical reactions, it's certainly possible that the biochemical reaction itself could be limited by the ability to increase entropy outside the cell. It's the rate at which that happens that's called entropy production, which is kind of a misnomer; people will call it things like flows of entropy.
And there's all kinds of weird language that people use to describe it, which is not really accurate. What you're talking about is moving heat and materials. That's what flows; the processes increase entropy outside and inside. So that's what led to the whole formulation of a temperature-dependent theory that says, when you get hot, you hit this limit, and it's possible that you might hit this limit before you get to the point at which enzymes fall apart. That's the other half of temperature biochemistry studies, which has always assumed that when a reaction starts slowing down as it gets hot, it must be because the enzyme is somehow coming apart or not being as effective at what it's supposed to do. But you can actually recapitulate what you get in an experiment from having limits to entropy production, without assuming anything about the inside changing.
And a lot of experiments have actually shown that as you increase the temperature, enzymes will maintain their structure and still maintain their affinity, but the reactions still slow down. And again, this is theory, and there are lots of experiments that need to be done to really test this out. I'm arguing that the reaction is slowing down because you can't move the products away from the reaction site. Another part of thermodynamics says that all reactions run forward and reverse. So if a product sits around, it starts to drive the reverse reaction.
The net forward reaction slows down because there's too much reversal of the reaction. So basically we're arguing that as you increase temperature, you get this increased reversal of the reaction. One of the interesting things that came out of this is that there's a group in Europe, and I'm not going to be able to remember their names right now, that's studying sulfate-reducing bacteria in the sediments somewhere off the European coast. And they found, almost inexplicably, that as they increased the temperature, the reverse of one of the key steps in the sulfur reduction pathway started overcoming the forward reaction.
And they couldn't understand why that would be. So they reached out to me and said, well, do you think that your model might explain that? I gave them about a one-paragraph description of how it would, and they've since done some subsequent work that suggests the same thing is happening. So that's one tiny experimental system for this mysterious result that can't really be easily explained in standard biochemistry. There are others. I talked in my talk about the fact that there are these hot mitochondria. If you use indirect measurements of mitochondrial temperature, it suggests they're at 52 degrees C while the surrounding cytoplasm is at 35, and people are like, that can't be. But interestingly, it's something that lines up pretty closely with what would be predicted by these reaction-diffusion thermodynamics and limitation by entropy production.
So where does that take us? Well, mostly it just presents an alternative. It's like, okay, we've got this standard model that's been around 50 years, and everyone just assumes that's the way it works, because nobody could really conceive of a mechanism by which that wouldn't be the way. So now I'm presenting an alternative that says, well, maybe in some cases it works in a different way. And if it does, then certain things will be true that you can't really predict from the standard model. And so let's go out and see whether those things are really true.
So that can take on the form of sort of the way astronomers work. You have a theory. The theory predicts that you should see or detect certain things, or a certain magnitude of something. It's called the classic hypothetico-deductive method. You have a mechanism that says these things should happen, but you can't really measure those things directly for whatever reason; in astronomy, it's because the stuff is too far away and you can't do experiments. So then you come up with a logical prediction that seems to be unique to that mechanism and say, okay, that I can observe.
So do I observe it or not? We can apply that same method in this biochemistry context: literally put those ideas out there and stack them up against data. I mean, I don't want to digress too far into things like dark matter, but I would like to mention that ecologists and biologists in general don't think that way. They don't think: I have a theory, it should make a prediction, and I should put it out there and say, okay, do the data support it or not? And this was a big issue in ecology, in terms of trying to figure out what's organizing communities, when we had the advent of what was called neutral theory.
So the neutral theory basically said, it doesn't really matter what the species are really like; it doesn't matter what traits they have. Everything is just kind of random, playing out, with individuals moving in and out at random. So whatever you see is just that process. And so they would say, okay, well, we can make predictions about what a community should look like: how many individuals, or what's the frequency of species that have different sizes of populations. That's the place they started with.
But it turns out that they didn't have independent measurements of all of the parameters in their model. So they allowed those to be fit, figuring, well, if the parameters can be fit, then that's probably good evidence that the model works well. Yes and no. It might be true, but a lot of times totally different mathematical constructs can produce basically the same curve if you just put different numbers in, and since we didn't know what those numbers were, the fit by itself proves little. So for me, the challenge comes from physics, which has a long history of doing this.
Let's pretend that we actually can come up with decent hypotheses, and these can be quantitative and described mathematically, and let's put out what that quantitative prediction should be if we're correct and put it up against the data. And if we're wrong, it could mean that the theory is bad, or it could mean that we didn't have good data, or it could mean that we're missing some really key components that we haven't thought about. This is really, really true in ecology, because we all know that there are lots and lots of factors that could be affecting things.
And so some people just assume that that's always the case and that we can never know why anything happens; we should just study what's out there. So, coming back full circle to temperature biochemistry, it just seems to me that we kind of need to open the field up. And this reaction-diffusion thermodynamics isn't the only alternative that's been proposed. There are other people pointing out that there are changes in the heat capacity of enzymes that occur, which is also an entropy consequence.
So basically, as you heat up, the number of different states the enzyme can take on, especially once it actually binds with the molecule to form what's called the enzyme-substrate complex, is a lot less than what the enzyme could do before it binds up. And so that causes a drop in the heat capacity of the enzyme, which then causes the reaction to slow down as you get hotter. So that's a totally alternative thing, but it's still based on thermodynamics.
And that also makes predictions that are different from the reaction-diffusion models, but they're also different from the standard enzyme-degrading theory. So to me, the richer science is where we really take these new ideas and get out there and make predictions and see what nature's actually doing, or do experiments to change certain things and see if the system responds one way or the other, depending on what the predictions are. So I'm a long way from saying, oh, this is the theory that now explains all of the mistakes the biologists made for all these decades. Far from it.
It's more that we need to be more creative in the way that we take what we know and think about how systems work. And maybe it's a little more complex than enzymes falling apart because they're getting worn out. But that doesn't mean that it's so complicated that we can't understand it or predict it.
Michael Garfield (28m 30s): Well, yeah, okay. So once again, there's so much I would like to address there, but for the sake of expedient narrative tour-guiding through this, I feel like maybe the right link to cast in here is on the importance of first principles in ecological theory development. You were an author on this paper with a bunch of SFI people: Jen Dunne, Pablo Marquet, Geoff West, John Harte, and others. And you also study fractals and spatial scale.
And so I'm curious: it doesn't seem like a huge leap from the implications of this particular work to the rhymes it makes with some of the troubling consequences of scaling laws as they apply to cities. When people use these mechanical metaphors to talk about what's going on inside a cell, little factories and that kind of thing, it's honestly hard not to analogize to macroeconomic flows and, for instance, the way that a city speeds up as it gets bigger.
This is all stuff that we covered in episodes 35 and 36 with Geoffrey West, and I highly recommend going back to those, because his point has to do with the fractal structure of the social and infrastructural networks in a city leading to an acceleration of urban life as it scales, and an acceleration of the products created by the activity of people in these societies. But then you get into supply chain issues. You get into these issues of, like, how do you actually export all of this surplus productivity? And looking at the way that my millennial friends are quite fond of sharing charts that show productivity in the workplace quadrupling over the last 40 years alongside wage stagnation.
I wonder if, just to cast one more piece into this, when we had Brian Arthur on the show, he was talking about the way that economic networks scale, and that at some point it looked to him like we were going to start needing active circulatory layers in the system to redistribute wealth. In his 2017 McKinsey essay, he was kind of making a complex-systems argument for universal basic income: like, you need a heart pumping blood once the organism gets above a certain size. And what you see in those cases, compared to, like, an anthill that gets above a certain size and then suddenly it's too productive, is that more and more of the ants need to take more paid time off, basically.
I wonder if we're not living through a kind of regime change like you show in this work, where there was such a thing as just more and more productivity, and it could all be exported. And then maybe we're in that middle regime now, where the rate of diffusion is the limiting factor. And then after that, there's just too much going on, and you're actually doing the world a favor if you stop. Like, if you don't launch another podcast, for instance, for the love of all.
Mark Ritchie (31m 53s): Another information stream.
Michael Garfield (31m 55s): Right, right. But at some point our entropy production is just too great. And we switch over into a regime where like seemingly paradoxical laws kick into dominance.
Mark Ritchie (32m 6s): Well, it's really interesting you brought that up, because it really does kind of bring home what I've been working on with Chris Kempes as part of my SFI sabbatical. He's probably frustrated with me, because I've spent the last three months dealing with an explosion in the space of grazing and soil carbon and greenhouse gas reductions, which I'll talk about later if we have time. But before that happened in January, he and I had been working together for some time.
And mostly it's been taking a long time to do it. We're having to dig up obscure physics that people were working on in various little separate contexts, but it deals with the exact problem that we're looking at: what happens inside the cell, basically taking your city and reconstructing it. So we have this problem in the cell: it's doing work, and if it can't get its products out, that causes reactions to reverse, and so the actual rate of work slows down. So what it means is that the cell can't just be full of enzymes, because the enzymes in the middle would make products that have nowhere to go, since there's no empty space for them to move through to get in and out of the cell.
And so we have this fundamental problem of the cell: the more work it does, presumably the more divisions it makes, or whatever. So you can imagine that cells that do lots of work would be favored by natural selection, but at the same time, a cell can't be full, because it needs to have space for stuff to move through. So the problem we were looking at was how much space is really required. And so we then get down into issues like, if something is moving through a fractal-like distribution of obstructions, what actually determines the likelihood that it will leave, or the rate at which something will leave from some mean position in the middle of the cell?
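The rate-of-escape question Mark raises can be sketched with a toy simulation. To be clear, this is my own illustration, not the model he and Chris Kempes built: an unbiased lattice random walk started at a cell's center, counting steps until the walker first crosses the boundary. Diffusive escape time grows roughly as the square of the radius, which is exactly why the product-export problem gets worse as a cell grows.

```python
import random

def mean_escape_steps(radius, trials=300, seed=1):
    """Average number of random-walk steps for a 'product molecule'
    released at the center of a circular cell to reach the boundary.
    Toy model: no obstructions, unit steps on a 2D lattice."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        x = y = 0
        steps = 0
        while x * x + y * y < radius * radius:
            dx, dy = rng.choice(((1, 0), (-1, 0), (0, 1), (0, -1)))
            x, y = x + dx, y + dy
            steps += 1
        total += steps
    return total / trials

small = mean_escape_steps(5)
big = mean_escape_steps(10)
print(small, big)  # doubling the radius roughly quadruples escape time
```

For plain diffusion the expected exit time scales like radius squared, so the interior of a large cell really is a "dead zone" on the relevant timescale; adding fractal-like obstructions, as in the work described here, only slows escape further.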
So in the end, we ended up combining some of the work that I'd done thinking about fractal distributions and fractal objects, and ways of describing those, with work that Chris had been doing on how many ribosomes are in a cell, so, like, how much, shall we say, machinery is in the cell that's doing work. So we have succeeded in solving the problem, but only with numerical simulations at this point. So we're still looking for a more analytical solution, which I think is out there.
I just haven't had time to get to it, and Chris has also worked on it. But fundamentally, what we then discovered was that these kinds of first principles would generate something that had been sort of out there but unexplained. So when people started measuring metabolic rates of single-celled organisms, all the way down to the smallest bacteria, what they found differed from what Geoffrey West found, mostly for vertebrate data, which is that metabolic rate tends to increase with mass to the 0.75, or three-quarters, power.
So the original Science paper in 1997 and all the subsequent work have been about trying to understand why that exponent is three-quarters as opposed to two-thirds or something else. But when people looked at bacteria, they found that that relationship curved. Nobody really understood why it curved, and nobody understood why it was so steep, steep enough that even at the smallest sizes, you were actually scaling with an exponent greater than one. This is called superlinear scaling. And so it turns out to be exactly what you were describing: when you start off small enough, you don't have a problem of diffusing your products.
Like, the very smallest cells are barely big enough to hold some of the more standard enzymes that are used in metabolism. So once they produce a product, it takes almost no time at all for it to be able to leave the cell. But as the cell gets bigger and bigger, if the enzymes are uniformly or randomly distributed throughout the cell, then you end up with this dead zone in the middle where stuff can't get out. So that means you're going to favor cells that aren't completely full of enzymes. They're going to have a lot of empty space in there, because then each one of the enzymes that remains can act at its full capacity, and you can do more total work.
And so it turns out that the scaling of those things then gives you this superlinear scaling at the smaller sizes, because as you add size, you sort of, like, double or triple the number of enzymes that you can have as you expand in scale, because the enzymes are discrete things. So it's kind of like if I have a trash can and I have cubes of a certain size: they don't automatically fill a circular trash can perfectly. You're going to have some empty spaces, but the bigger the trash can, the more tightly you're likely to be able to pack those in.
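The trash-can analogy can be made concrete with a quick geometric sketch. Again, this is my own toy illustration, not a calculation from the paper under discussion: count how many unit squares fit entirely inside a circle of a given radius, standing in for discrete enzymes packed into a cell. The usable fraction of the area climbs toward one as the container grows, which is the geometric seed of the superlinear regime.

```python
import math

def usable_fraction(radius):
    """Fraction of a circle's area coverable by whole unit squares,
    a stand-in for packing discrete enzymes into a cell."""
    count = 0
    r = math.ceil(radius)
    for i in range(-r, r):
        for j in range(-r, r):
            # the square spans (i, j) to (i + 1, j + 1); it fits only if
            # its farthest corner is still inside the circle
            far = max(i * i, (i + 1) ** 2) + max(j * j, (j + 1) ** 2)
            if far <= radius * radius:
                count += 1
    return count / (math.pi * radius ** 2)

for radius in (2, 5, 20):
    # the usable fraction rises with size (roughly 0.32, 0.76, 0.94 here)
    print(radius, round(usable_fraction(radius), 3))
```

Because whole units fit disproportionately better as the container grows, effective capacity rises faster than raw volume at small sizes, then the advantage washes out once boundary effects become negligible.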
So that's what generates the superlinear scaling. Then it just starts to bend around as you get to this point. And it turns out, interestingly, that the point at which it starts to bend around and the exponent becomes less than one is about the point at which the largest bacteria reach their general upper size limit. There are a few exceptions, which are kind of interesting, but you switch over to eukaryotes at that point. And for eukaryotes, there are, like, four or five different major theories about how and why they evolved to be the way they are, but they compartmentalize.
So they basically create these specialized organelles that do certain jobs. And they also have a lot of their enzymes on structures that are themselves fractal-like. People have actually measured the fractal dimension of these things, and they're definitely not just uniformly distributed throughout the cell. So why compartmentalize? What's the advantage of doing that? The ultimate thing that happens is, as you get bigger and bigger and bigger, the exponent just settles down right to three-quarters.
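For listeners who want the exponent talk pinned down: a scaling exponent is just the slope on a log-log plot of metabolic rate against mass. A minimal two-point version (with invented numbers, purely for illustration) shows how a Kleiber-style three-quarters relation differs from the superlinear case described for small bacteria.

```python
import math

def scaling_exponent(m1, b1, m2, b2):
    """Slope of the log-log line through two (mass, metabolic rate)
    points; for a pure power law B = c * M**k this recovers k."""
    return math.log(b2 / b1) / math.log(m2 / m1)

# a pure three-quarters power law, like the vertebrate scaling discussed
print(scaling_exponent(1.0, 1.0, 16.0, 16.0 ** 0.75))  # recovers 0.75

# rate rising faster than mass: a superlinear exponent, greater than 1
print(scaling_exponent(1.0, 1.0, 16.0, 16.0 ** 1.2))   # recovers 1.2
```

In practice one fits the slope through many (mass, rate) measurements; the curvature Mark describes shows up as that local slope changing with mass rather than staying fixed.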
But the interesting thing is that we haven't made any assumptions about, like, a vascular network or any of those kinds of scaling arguments. It strictly comes out of the fact that you're optimizing the amount of work that the cell can do, given that it has to move stuff in and out. And so it's just about how much empty space there is. It doesn't really say anything about the particular configuration of the space, other than that we're assuming the particles move according to a random walk as they bounce around inside the cell. So we're really close to having a paper finished that pulls all these concepts together.
That includes thermodynamics, which goes back to the issue that the reactions reverse if you can't dissipate products, and the idea of organization and the trade-off between needing to move stuff and needing to do work. And it extends almost beautifully, because one of the interesting things is that if you just put in all the standard thermodynamic parameters and diffusion coefficients and things like that which people have measured for cells, take an average, and plug those right into the math, then you get a curve that sits right through the data points.
You don't have to fit it. So basically, it provides an alternative explanation for what's happening in single cells that don't have organized or pressurized vascular systems, but it seamlessly merges into the theory of West et al., in which you do have pressurized vascular systems. And so then it raises these really cool questions, like: why are you forced to use pressurized vascular systems when you get up to a mass of about a gram? There are organisms that use them at even smaller sizes, but that sort of seems to be the limit at which you hit that.
Why do eukaryotes compartmentalize things? Or, the other way of saying it is, how is it that bacteria are still around, given that they can only get up to a certain size? And it also begs the question: is that what a virus's problem is? Is it that, because they're so small, they can't fit enough enzymes, so they have to borrow somebody else's space in order to do the work, so that they really are the ultimate parasite, which I guess is how everyone sort of thinks of a virus anyway? But basically, what we're arguing is that there's a physical explanation for why that happens at a certain size.
So that's some cutting-edge stuff that we're doing, and it's really, really cool. And it kind of does synthesize a lot of the work that both of us have been doing over the past decade or so.
Michael Garfield (40m 19s): Thank you for listening. Complexity is produced by the Santa Fe Institute, a nonprofit hub for complex systems science located in the high desert of New Mexico. For more information, including transcripts, research links, and educational resources, or to support our science and communication efforts, visit santafe.edu/podcast.