Abstract
Finding the global probabilistic nature of a nonequilibrium circadian clock is essential for addressing important issues of robustness and function. We have uncovered the underlying potential energy landscape of a simple circadian biochemical network, and the corresponding flux that is the driving force for the oscillation. The underlying potential landscape for the oscillation in the presence of small statistical fluctuations has an explicit closed-ring valley or doughnut shape in the three-dimensional protein concentration space. The height of the barrier separating the oscillation ring from the rest of the space provides a quantitative measure of the robustness of the oscillation, and it decreases when the fluctuations increase. We also found that the entropy production rate, characterizing the dissipation or heat loss, decreases as the fluctuations decrease. In addition, as the fluctuations increase, the period and the amplitude of the oscillations become more dispersed, and the phase coherence decreases. We further explored the effects of the inherent chemical rate parameters on these robustness measures. Our approach is quite general and can be applied to other oscillatory cellular networks.
PACS Codes: 87.18.h, 87.18.Vf, 87.18.Yt
1. Introduction
Circadian rhythms are an intracellular timing mechanism, widespread in living organisms, with a period of about 24 h, which fits the day/night alternation of the Earth and allows adaptation to a fluctuating environment. In Neurospora, Arabidopsis, Drosophila, and mammals, transcription-translation-derived oscillations originating from negative feedback regulation of clock genes have been modeled at the molecular level. Studying this oscillatory behavior in an integrated and coherent way is crucial in modern systems biology for understanding how these rhythms function biologically. The nature of the rhythmic behavior has been explored by many experimental and theoretical methods [1,2]. However, there are so far few theoretical studies that explain biological oscillation behavior from a global, physical perspective.
We decided to explore an established basic model based on known biological and biochemical features of a circadian clock with negative regulation [1,2]. In this system, the PER protein represses the transcription of its own gene, per. The core model for this circadian rhythm is shown in Fig. 1. The per gene is expressed in the nucleus and transcribed into per messenger RNA (mRNA). Next, the mRNA is transported into the cytosol and translated into PER protein (Pc). The protein is then transported into the nucleus, becoming the nuclear form P_{N} in a reversible manner. Finally, P_{N} represses the transcription of the gene. Effectively, therefore, the network acts as a self-repression with time delay, in which oscillatory behavior is expected.
Figure 1. Model. Model for the molecular mechanism of circadian rhythms in Drosophila.
It is important to address the robustness and stability of the circadian system and its associated oscillation patterns. There are many possible states in such systems, and it is difficult to explore all of them and the associated global nature of the network [3-12]. Fortunately, not all states have the same weights or probabilities of occurring, due to the intrinsic statistical fluctuations from the finite number of molecules in the cell and the external fluctuations from the highly dynamic and inhomogeneous environment of the cell [13-20].
Therefore, instead of the averaged deterministic network of chemical rate equations, we developed a probabilistic description to model the corresponding cellular process, taking into account both the intrinsic and the external fluctuations. This can be realized by constructing a master equation for the intrinsic fluctuations, or a diffusion equation for the external fluctuations, for the time-dependent evolution of the probability rather than of the average concentrations in the corresponding deterministic chemical reaction network equations [16,21-25]. Even for the intrinsic fluctuations, we can simplify the master equation into a Fokker-Planck diffusion-like equation in the weak noise limit, representing typical kinetic Markovian behavior with concentration-dependent diffusion coefficients [26]. So here we use the diffusion equation to approximate the system probabilistically under the influence of either internal or external fluctuations.
By solving the Fokker-Planck diffusion equation, we can obtain the probability distribution of the protein concentrations evolving in time. We can also uncover the long-time steady-state probability of this chemical reaction network, in analogy to an equilibrium system, where the global thermodynamic properties can be explored using the inherent interaction potentials. We will study the global stability by exploring the underlying potential landscape of the above-mentioned nonequilibrium protein network. The generalized potential energy can be shown to be closely associated with the steady-state probability of a nonequilibrium network in general, and this has been applied to a few systems [3-12,21,22,27-29]. Once the network problem is formulated in terms of the generalized potential function or potential landscape, the issue of global stability or robustness is much easier to address [8,30]. We notice that although the individual averaged deterministic trajectories of a nonlinear chemical reaction system might be very chaotic and complex, the corresponding statistical probabilistic distributions, or the underlying landscapes, which are dictated by linear evolution equations (master equations or diffusion equations), are usually quite ordered and can often be characterized globally.
The adaptive landscape idea was first introduced into biology by S. Wright, Delbruck, and Waddington [31-34]. However, the link between the dynamics and the probabilistic landscape is not clear in that work. Energy landscape ideas were pushed forward by Hans Frauenfelder [35] on protein dynamics and then by P. G. Wolynes and J. N. Onuchic [36] on protein folding and interactions [37]. These ideas on proteins were based on an equilibrium approach and on knowing the potentials a priori. For a nonequilibrium system, the potential landscape is not known a priori and needs to be uncovered. In fact, it is the purpose of this paper to study the global robustness of the oscillation with respect to the fluctuations in the cell, directly using the properties of the nonequilibrium potential landscape, which is linked to the steady-state probability of the network. This provides a basis for exploring the global and physical mechanism of biochemical oscillation.
A deterministic mathematical model of this protein clock, constrained by experimental data, has been proposed recently [2]. For the protein network, based on Michaelis-Menten enzyme kinetics, one can derive a set of differential equations describing the rate of change of each component's concentration in the network. This leads to three simplified independent equations [2]:

dM/dt = v_{s}kI^{n}/(kI^{n} + P_{N}^{n}) - v_{m}M/(k_{m} + M),   (1)

dPc/dt = k_{s}M - v_{d}Pc/(k_{d} + Pc) - k_{1}Pc + k_{2}P_{N},   (2)

dP_{N}/dt = k_{1}Pc - k_{2}P_{N},   (3)
where M is the concentration of the clock gene mRNA, Pc is the concentration of the cytosolic protein, and P_{N} is the concentration of the nuclear form of the clock protein. The parameter v_{s} represents the maximum rate of transcription, and v_{m} is the maximum rate of transfer into the cytosol, with Michaelis constant k_{m}. kI is the threshold beyond which the nuclear protein represses the transcription of the per gene, and the Hill coefficient n characterizes this repression. k_{s} is the rate of protein synthesis, and v_{d} is the maximum rate of protein degradation, with Michaelis constant k_{d}. k_{1} and k_{2} are the first-order rate constants characterizing the transport of the protein into and out of the nucleus. The negative autoregulatory feedback is the origin of the oscillations.
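The deterministic part of the model can be integrated directly. Below is a minimal sketch assuming the standard three-variable negative-feedback form with the parameter values listed in the Results section; the names `PAR`, `force`, and `rk4_trajectory` are our own, not from the paper.

```python
# Parameter values as given in the Results section; dict and function names
# here are hypothetical, introduced only for this sketch.
PAR = dict(vs=0.5, vm=0.3, vd=1.5, ks=2.0, k1=0.2, k2=0.2,
           km=0.2, KI=2.0, kd=0.1, n=4)

def force(state, p=PAR):
    """Right-hand side F(M, Pc, PN) of the three rate equations."""
    M, Pc, PN = state
    dM = (p['vs'] * p['KI'] ** p['n'] / (p['KI'] ** p['n'] + PN ** p['n'])
          - p['vm'] * M / (p['km'] + M))
    dPc = (p['ks'] * M - p['vd'] * Pc / (p['kd'] + Pc)
           - p['k1'] * Pc + p['k2'] * PN)
    dPN = p['k1'] * Pc - p['k2'] * PN
    return (dM, dPc, dPN)

def rk4_trajectory(state=(0.1, 0.1, 0.1), dt=0.01, steps=40000):
    """Classical 4th-order Runge-Kutta integration of the deterministic model."""
    traj = [state]
    for _ in range(steps):
        a = force(state)
        b = force(tuple(s + 0.5 * dt * k for s, k in zip(state, a)))
        c = force(tuple(s + 0.5 * dt * k for s, k in zip(state, b)))
        d = force(tuple(s + dt * k for s, k in zip(state, c)))
        state = tuple(s + dt / 6.0 * (w + 2 * x + 2 * y + z)
                      for s, w, x, y, z in zip(state, a, b, c, d))
        traj.append(state)
    return traj
```

With this parameter set the trajectory should settle onto the attractive limit cycle described in the Results section; plotting M(t) from the late part of the trajectory gives the kind of sustained oscillation shown in Fig. 4(A).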
As mentioned, the statistical fluctuations from both internal and external sources may be significant [13-20] and in general cannot be ignored. We can now formulate the Fokker-Planck diffusion equation for the time evolution of the probability distribution P of the protein concentrations M, Pc, and P_{N}:

∂P/∂t = -∇·(FP) + ∇·(D∇P),
where D is the diffusion coefficient tensor (or matrix); here we use a uniform isotropic diagonal matrix for simplicity. We set the vector x = (M, Pc, P_{N}). We can rewrite the diffusion equation as a local conservation law for the probability, ∂P/∂t = -∇·J, with the probability flux J = FP - D∇P, or, in component notation, J_{1}(M, Pc, P_{N}, t) = F_{1}(M, Pc, P_{N})P - D∂P/∂M, and similarly for J_{2} and J_{3}.
Once we solve for the steady-state probability from the probabilistic diffusion equation, we can study the underlying properties of the potential (or potential landscape) through the relation U(x) = -ln P(x) [4,6-12,22,28]. We use this relationship for our nonequilibrium system (with no detailed balance, or equivalently F ≠ -∇U) in analogy with equilibrium systems. However, unlike in equilibrium systems, where the steady-state probability alone characterizes the global properties of the whole system, in nonequilibrium systems both the underlying landscape and the associated flux are essential for characterizing the global steady-state properties as well as the dynamics of the protein network.
2. Results and discussion
The parameter values are v_{s} = 0.5 nM h^{-1}, v_{m} = 0.3 nM h^{-1}, v_{d} = 1.5 nM h^{-1}, k_{s} = 2.0 h^{-1}, k_{1} = 0.2 h^{-1}, k_{2} = 0.2 h^{-1}, k_{m} = 0.2 nM, kI = 2.0 nM, k_{d} = 0.1 nM, n = 4. The limit cycle for these parameter values is attractive.
We solve the Fokker-Planck equation using both the reflecting boundary condition J = 0 and the absorbing boundary condition. The results are similar; we choose the reflecting boundary condition in this paper. With various initial conditions (both homogeneous and inhomogeneous), we obtain the steady-state probability distribution P_{ss} in the long-time limit using the finite difference method. Then we can use U(x) = -ln P_{ss}(x) to obtain the generalized nonequilibrium potential function (landscape) of the circadian clock.
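The paper obtains P_{ss} by finite differences; a much cruder way to see the U = -ln P_{ss} construction at work is to histogram a long Langevin trajectory. The sketch below does this for a hypothetical two-dimensional toy force with a ring-shaped attractor (an assumption of this sketch, standing in for the clock model, so the example stays small and fast); all names are our own.

```python
import math, random

def toy_force(state):
    # Hypothetical stand-in force: relax toward a ring of radius 0.5 around
    # (1, 1) while rotating, which produces a ring-shaped stationary density.
    x, y = state
    r2 = (x - 1.0) ** 2 + (y - 1.0) ** 2
    return (-(r2 - 0.25) * (x - 1.0) - (y - 1.0),
            -(r2 - 0.25) * (y - 1.0) + (x - 1.0))

def landscape(D=0.005, dt=0.01, steps=400_000, bins=40, lo=0.0, hi=2.0):
    """Estimate U = -ln P_ss by histogramming one long Langevin trajectory."""
    random.seed(0)
    state = (1.5, 1.0)                      # start on the ring
    counts = [[0] * bins for _ in range(bins)]
    width = (hi - lo) / bins
    for _ in range(steps):
        f = toy_force(state)
        state = tuple(s + fi * dt + math.sqrt(2 * D * dt) * random.gauss(0, 1)
                      for s, fi in zip(state, f))
        i = int((state[0] - lo) / width)
        j = int((state[1] - lo) / width)
        if 0 <= i < bins and 0 <= j < bins:
            counts[i][j] += 1
    total = sum(map(sum, counts))
    # unvisited bins get an arbitrary large potential (25 here)
    return [[-math.log(c / total) if c else 25.0 for c in row]
            for row in counts]
```

The resulting grid shows the doughnut directly: bins on the ring (e.g. near (1.5, 1.0)) carry low U, while the center of the ring (near (1, 1)) carries high U.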
In order to see the results clearly, we can integrate the three-dimensional probability P(M, Pc, P_{N}, t → ∞) to reduce the dimension to two. We use the formulas:

P(M, Pc) = ∫ P(M, Pc, P_{N}) dP_{N}, P(M, P_{N}) = ∫ P(M, Pc, P_{N}) dPc, P(Pc, P_{N}) = ∫ P(M, Pc, P_{N}) dM.
The integrated results are shown in Fig. 2. The red solid lines represent the deterministic solution of the system. We can see that the potential landscape is an irregular, inhomogeneous ring (the values of the potential are represented in different colors, with lower potentials in blue), or Mexican-hat-like shape, along the deterministic trajectory.
Figure 2. Integrated 2 dimensional potential landscape. The integrated two dimensional effective landscapes for the three dimensional system.
Fig. 3A (left panel) shows that the potential landscape U has a distinct, irregular, and inhomogeneous closed doughnut-like shape. For clarity, we plot only the region where U ≤ 25 and render U > 25 as transparent. The closed doughnut lies around the deterministic solution, which has lower potential and correspondingly higher probability along the oscillation trajectory. The potential is higher (and the probability lower) outside the doughnut, so the system is attracted to the doughnut. The potential landscape is distributed inhomogeneously along the oscillation ring: the potential is lower in states where the system stays longer, which is determined by the speed at which the averaged deterministic oscillation passes through each state. The potential landscape and the steady-state probability along the oscillation are therefore not uniform, because the time spent at each state (or the passing speed there) varies around the oscillation ring. As shown in Fig. 3B (right panel), when the diffusion coefficient D increases, the doughnut of the potential landscape becomes thicker or fatter, and the values of the potential along the limit cycle become comparable to the values outside the limit cycle. A further increase in the fluctuations eventually destroys the circadian rhythmicity, because the attraction of the limit cycle becomes weaker and the time spent on the limit cycle becomes shorter. The system transforms from a clear, robust oscillation under small fluctuations to no oscillation under large fluctuations.
Figure 3. 3 dimensional potential landscape. The three dimensional potential landscape and flux for D = 1 × 10^{-5} (A) on the left panel, and the potential landscape for D = 1 × 10^{-3} (B) on the right panel.
We can clearly see that the probability is not distributed evenly along the limit cycle. In order to understand the attractive nature of the limit cycle, we have to examine the dynamics of the network. The deterministic oscillations of the three variables M, Pc, and P_{N} over a period are shown in Fig. 4(A). The forces on M, Pc, and P_{N} over the period are shown in Fig. 4(B). The speed along the cycle is shown in Fig. 4(C), and Fig. 4(D) shows the corresponding limit cycle with time marks. The stars on the limit cycle indicate where the values of the force and the speed are shown at the given times. The speed along the limit cycle has two maxima, at which the amount of time spent is smaller than in other parts of the phase space. Thus, the steady-state probability is larger where the speed is slower [1].
Figure 4. Speed. A: The deterministic oscillations of the three variables M, Pc, and P_{N} over a period. B: The forces on M, Pc, and P_{N} over the period. C: The speed along the cycle with time. The 'star' time parameters are as follows: t_{1} = 3.8, t_{2} = 8.5, t_{3} = 15.5, t_{4} = 21.2. D: The speed along the limit cycle; the 'star' time parameters are the same as in Fig. 4C.
The divergence of the flux is equal to zero at the steady state. In an equilibrium system, the flux J = 0 (detailed balance); but in a nonequilibrium system, the flux is a curl field (J = ∇ × A in three dimensions, where A is a vector potential). Fig. 3(A) (left panel) shows the probability flux on the closed-ring landscape of the limit cycle. We can clearly see that the direction of the flux near the ring is parallel to the oscillation path, like a curl.
So the flux is a driving force for the oscillation: the potential landscape attracts the system to the closed ring, and the flux keeps the probability flowing along the ring, providing the driving force for the oscillation. The flux plays a more important role along the closed ring than outside it, where the large gradient (∇U-type) forces dominate. The interplay of the landscape and the flux is therefore the most important characteristic of a nonequilibrium system.
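The decomposition into an attracting gradient and a divergence-free curl flux can be checked explicitly on a small solvable example. The sketch below uses a rotational Ornstein-Uhlenbeck force (our toy assumption, not the clock model), whose exact steady state is Gaussian, and verifies numerically that J = FP - D∇P is nonzero yet essentially divergence-free.

```python
import math

# Toy field: F = (-g*x - w*y, w*x - g*y) with isotropic noise strength d.
# Its exact steady state is P_ss ∝ exp(-g(x² + y²)/(2d)).
g, w, d = 1.0, 1.0, 0.5
h, n = 0.05, 121                       # grid spacing and size over [-3, 3]²
xs = [-3.0 + h * i for i in range(n)]

def P(x, y):
    return math.exp(-g * (x * x + y * y) / (2.0 * d))

def flux(x, y):
    """J = F P - d ∇P, with ∇P evaluated by central differences."""
    dPx = (P(x + h, y) - P(x - h, y)) / (2 * h)
    dPy = (P(x, y + h) - P(x, y - h)) / (2 * h)
    p = P(x, y)
    return ((-g * x - w * y) * p - d * dPx,
            (w * x - g * y) * p - d * dPy)

def divergence(x, y):
    """∇·J by central differences of the flux components."""
    jx_p, _ = flux(x + h, y)
    jx_m, _ = flux(x - h, y)
    _, jy_p = flux(x, y + h)
    _, jy_m = flux(x, y - h)
    return (jx_p - jx_m) / (2 * h) + (jy_p - jy_m) / (2 * h)

max_J = max(abs(c) for x in xs[2:-2] for y in xs[2:-2] for c in flux(x, y))
max_div = max(abs(divergence(x, y)) for x in xs[2:-2] for y in xs[2:-2])
```

For this toy, J reduces analytically to the pure rotation w(-y, x)P, so the numerical flux is a curl of appreciable magnitude while its divergence is only finite-difference noise.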
We can explore the global stability and robustness of the circadian clock once we obtain the potential landscape. The barrier height measures how hard it is for the system to escape from the oscillation attractor. Fig. 5 shows the barrier height versus the diffusion coefficient D. Here Barrier1 = U_{fix} - U_{min} and Barrier2 = U_{fix} - U_{max}, where U_{fix} is the potential at the local maximum inside the limit cycle, U_{max} is the potential maximum along the limit cycle, and U_{min} is the potential minimum along the limit cycle. The barrier height becomes larger when the fluctuations decrease: it is harder for the system to go from the doughnut of attraction to the outside when the fluctuations are small. This means the doughnut shape of the landscape is robust, and a stable oscillation is essentially guaranteed for small fluctuations. It also implies that the barrier height can be used as a quantitative measure of the stability and robustness of the network oscillations.
Figure 5. Barrier. The barrier heights Barrier1 = U_{fix} - U_{min} and Barrier2 = U_{fix} - U_{max} versus the diffusion coefficient D.
In a nonequilibrium open system, there is a constant exchange of energy and information with the environment. This results in the dissipation of energy, which gives a global physical characterization of the nonequilibrium system. The circadian clock is such a nonequilibrium open system. In the nonequilibrium steady state, the system still dissipates energy and produces entropy, which can be determined globally from the landscape and the flux; in the steady state the entropy production rate is equal to the heat dissipation. The entropy of the system is given as [38]

S = -k_{B}∫ P(x, t) ln P(x, t) dx.

By differentiating this expression, the rate of change of the entropy at constant temperature T can be written as

T dS/dt = e_{p} - h_{d},

where e_{p} = ∫ (F - k_{B}T∇ ln P)·J dx is the entropy production rate [38], and h_{d} = ∫ F·J dx is the mean rate of heat dissipation. For a steady state, dS/dt = 0, so the entropy production rate equals the heat dissipation rate, e_{p} = h_{d}.
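The steady-state balance between the entropy production rate e_{p} and the heat dissipation h_{d} can be checked numerically. The sketch below evaluates both integrals on a grid for a solvable toy (a rotational Ornstein-Uhlenbeck force, an assumption of this sketch, with k_{B}T identified with the noise strength d); since ∇·J = 0 at steady state, the term ∫ ∇ln P · J dx vanishes and the two integrals agree.

```python
import math

# Toy field with exact Gaussian steady state: F = (-g*x - w*y, w*x - g*y),
# P_ss ∝ exp(-g(x² + y²)/(2d)), ∇ln P = -(g/d)(x, y), J = (F - d∇ln P)P.
g, w, d = 1.0, 1.0, 0.5
h = 0.05
xs = [-4.0 + h * i for i in range(161)]

e_p = h_d = 0.0
for x in xs:
    for y in xs:
        p = math.exp(-g * (x * x + y * y) / (2.0 * d))
        fx, fy = -g * x - w * y, w * x - g * y
        # F - d∇ln P = (fx + g*x, fy + g*y); flux J = (F - d∇ln P) P
        jx, jy = (fx + g * x) * p, (fy + g * y) * p
        e_p += ((fx + g * x) * jx + (fy + g * y) * jy) * h * h
        h_d += (fx * jx + fy * jy) * h * h
```

Both sums converge to the same positive value (analytically ∫ w²r²P dx), illustrating that at steady state all the entropy produced is dissipated as heat.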
Figure 6. EPR. A: The diffusion coefficient D versus the entropy production rate. B: The barrier heights Barrier1 = U_{fix} - U_{min} and Barrier2 = U_{fix} - U_{max} versus the entropy production rate.
The robustness of the oscillation with respect to the diffusion coefficient D can be quantified further by the phase coherence ξ, which measures the degree of periodicity of the time evolution of a given variable [39]. The phase coherence ξ quantitatively measures the degree of persistence of the oscillatory phase and is defined as follows. First, the vector N(t) = n_{1}(t)e_{1} + n_{2}(t)e_{2} + n_{3}(t)e_{3} is formed from the three concentrations, as shown in Fig. 7, with the three unit vectors spaced 120° apart in the plane: e_{1} = (0, 1), e_{2} = (√3/2, -1/2), and e_{3} = (-√3/2, -1/2). With φ(t) the signed angle by which N(t) rotates during one time step, ξ is the fraction of the total phase change accumulated in the positive (oscillation) direction,

ξ = Σ_{t}θ(φ(t))φ(t)/Σ_{t}|φ(t)|,
Figure 7. Definition of phase coherence.
where θ(φ) = 1 when φ(t) > 0 and θ(φ) = 0 when φ(t) ≤ 0, and the sums are taken over every time step of the simulated trajectory. ξ ≈ 0 means the system moves stochastically and has no coherence; the oscillation is most coherent when ξ is close to 1. The value of ξ becomes larger when the fluctuations are smaller, since the trajectories are then more periodic in their evolution. Fig. 8(A) shows that ξ decreases as the diffusion coefficient increases, implying that the coherence of the oscillation can be destroyed by fluctuations; conversely, less fluctuation yields a more coherent, robust, and stable system. We also see that ξ becomes larger with a lower heat loss or entropy production rate, and conclude that less dissipation leads to more coherence. We further see that ξ increases with the barrier height (Fig. 8(B)). This shows that a less dissipative network tends to preserve the coherence of the oscillations.
Figure 8. Coherence. A: The coherence versus the diffusion coefficient D and entropy production rate. B: The coherence versus the Barrier height.
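A minimal sketch of a phase-coherence measure in the spirit of ξ: accumulate the signed per-step rotation angle φ(t) of the trajectory projected onto a plane, and take the fraction of phase advance in the positive direction. Both the two-dimensional toy oscillator and this particular weighting by |φ| are assumptions of the sketch, not the exact construction of [39].

```python
import math, random

def coherence(D, steps=20000, dt=0.01, seed=1):
    """Fraction of phase advance in the positive rotation direction."""
    rng = random.Random(seed)
    x, y = 1.0, 0.0
    pos = tot = 0.0
    prev = math.atan2(y, x)
    for _ in range(steps):
        # toy limit-cycle force: relax to the unit circle while rotating
        r2 = x * x + y * y
        fx = -(r2 - 1.0) * x - y
        fy = -(r2 - 1.0) * y + x
        x += fx * dt + math.sqrt(2 * D * dt) * rng.gauss(0, 1)
        y += fy * dt + math.sqrt(2 * D * dt) * rng.gauss(0, 1)
        ang = math.atan2(y, x)
        # signed angle increment, wrapped into (-pi, pi]
        phi = math.atan2(math.sin(ang - prev), math.cos(ang - prev))
        prev = ang
        if phi > 0:
            pos += phi
        tot += abs(phi)
    return pos / tot if tot else 0.0
```

For weak noise the phase advances almost monotonically and the measure approaches 1; for strong noise the rotation direction flickers and the measure drops toward 1/2, mirroring the trend of Fig. 8(A).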
We can also use stochastic simulations for various values of D to illustrate the robustness of the circadian oscillation. We solve the chemical reaction network equations under fluctuations which reflect external noise. To assess the effect of molecular noise on the circadian oscillations, we used stochastic Brownian dynamics to perform stochastic simulations of the deterministic model governed by equations (1-3). Fig. 9(A and B) shows the distributions of the period of the oscillations calculated for 2000 successive cycles. When the fluctuations increase, the distribution becomes more spread out, but the mean period and mean amplitude remain close to the deterministic values. In Fig. 9(C), the standard deviation σ from the mean increases when the fluctuations increase [1,2]. This implies that smaller fluctuations lead to more coherent oscillations. We also see that the period distribution becomes less dispersed when the entropy production rate decreases, showing that a less dissipative network can lead to a more coherent oscillation with a well-defined period instead of a broad distribution of periods. In Fig. 9(D), we see that higher barrier heights lead to less dispersed period distributions.
Figure 9. Period. A: The period distribution for D = 0.02. B: The period distribution for D = 0.1. C: The diffusion coefficient D versus the standard deviation of period σ_{Period }and the entropy production rate. D: The standard deviation of period σ_{Period }versus the barrier height.
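The broadening of the period distribution with D can be reproduced in miniature with a Langevin simulation. The sketch below uses a toy planar limit cycle of period 2π (an assumption of this sketch, standing in for the clock model so the run stays short), detects upward zero crossings of one coordinate, and compares the spread of the cycle-to-cycle periods at two noise levels.

```python
import math, random

def period_stats(D, steps=60000, dt=0.01, seed=2):
    """Mean and standard deviation of cycle periods from zero crossings."""
    rng = random.Random(seed)
    x, y = 1.0, 0.0
    crossings = []
    t = 0.0
    y_prev = y
    for _ in range(steps):
        # toy limit-cycle force: relax to the unit circle while rotating
        r2 = x * x + y * y
        fx = -(r2 - 1.0) * x - y
        fy = -(r2 - 1.0) * y + x
        x += fx * dt + math.sqrt(2 * D * dt) * rng.gauss(0, 1)
        y += fy * dt + math.sqrt(2 * D * dt) * rng.gauss(0, 1)
        t += dt
        # upward crossing of y through 0 on the x > 0 side, with a
        # refractory window to suppress noise-induced double counts
        if (y_prev < 0.0 <= y and x > 0.0
                and (not crossings or t - crossings[-1] > 3.0)):
            crossings.append(t)
        y_prev = y
    periods = [b - a for a, b in zip(crossings, crossings[1:])]
    mean = sum(periods) / len(periods)
    var = sum((p - mean) ** 2 for p in periods) / len(periods)
    return mean, math.sqrt(var)
```

The mean period stays near the deterministic value 2π at both noise levels, while the standard deviation grows markedly with D, the same qualitative behavior as Fig. 9.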
We also show the distributions of the amplitude of M with increasing D. In Fig. 10(A), the distribution becomes more dispersed as the fluctuations increase, but stays centered close to the deterministic value. In Fig. 10(B), the standard deviation σ increases when D goes up, again showing that smaller fluctuations lead to more robust oscillations.
Figure 10. Amplitude. A: The amplitude distribution for different D. B: The standard deviation of amplitude σ versus D.
To explore the effects of the inherent chemical rate parameters on the robustness, we can try to find out which reactions are important and, further, which protein elements are crucial in maintaining the robustness. Fig. 11(A) shows the effects of the rate parameters on the robustness. The six rate parameters are increased by twenty percent (red) and decreased by twenty percent (green); the bars show the change in barrier height for the different parameters, and q is the percentage by which the rate constants are increased or decreased. Fig. 11(B) shows the barrier height (solid line) and the entropy production rate (dashed line) versus the six rate parameters. When the rate parameters k_{1} and v_{m} increase, the barrier height increases and the entropy production rate decreases, as the system becomes more stable and robust. When the other four rate parameters (k_{2}, k_{s}, v_{d}, v_{s}) increase, the barrier height decreases and the entropy production rate increases, as the system becomes less stable and robust.
Figure 11. Barrier height and entropy production versus chemical rate parameters. A: Barrier changes with respect to changes of rate parameters (red: rate increase. green: rate decrease) k_{1}, k_{2}, k_{s}, v_{m}, v_{d}, and v_{s}. B. Barrier height and entropy production rate versus chemical rate parameters k_{1}, k_{2}, k_{s}, v_{m}, v_{d}, and v_{s}.
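A ±20% parameter scan can be sketched for the deterministic period and amplitude (the barrier height itself requires the steady-state landscape, which this sketch does not compute). The model form below assumes the standard three-variable negative-feedback rate equations with the parameter values of the Results section; all names are our own.

```python
BASE = dict(vs=0.5, vm=0.3, vd=1.5, ks=2.0, k1=0.2, k2=0.2,
            km=0.2, KI=2.0, kd=0.1, n=4)

def deriv(s, p):
    """Deterministic rate equations for (M, Pc, PN)."""
    M, Pc, PN = s
    hill = p['KI'] ** p['n'] / (p['KI'] ** p['n'] + PN ** p['n'])
    return (p['vs'] * hill - p['vm'] * M / (p['km'] + M),
            p['ks'] * M - p['vd'] * Pc / (p['kd'] + Pc)
            - p['k1'] * Pc + p['k2'] * PN,
            p['k1'] * Pc - p['k2'] * PN)

def period_amplitude(p, dt=0.005, steps=120000, skip=60000):
    """Euler integration; period from spacing of local maxima of M after a transient."""
    s = (0.1, 0.1, 0.1)
    Ms, peaks = [], []
    for i in range(steps):
        s = tuple(v + dt * dv for v, dv in zip(s, deriv(s, p)))
        if i >= skip:
            Ms.append(s[0])
            if len(Ms) >= 3 and Ms[-3] < Ms[-2] >= Ms[-1]:
                peaks.append((i - 1) * dt)
    amp = max(Ms) - min(Ms)
    if len(peaks) < 2:
        return None, amp        # no sustained oscillation detected
    gaps = [b - a for a, b in zip(peaks, peaks[1:])]
    return sum(gaps) / len(gaps), amp

# scan one parameter, here ks, by +/-20 percent
scan = {}
for q in (0.8, 1.0, 1.2):
    p = dict(BASE)
    p['ks'] = BASE['ks'] * q
    scan[q] = period_amplitude(p)
```

The same loop applied to the other five rates gives the deterministic counterpart of the scan in Fig. 11; the stochastic period and amplitude spreads of Figs. 12 and 13 additionally require noise, as in the Langevin sketches above.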
We choose the rates k_{s} and v_{m} to explore the period and amplitude further using stochastic Brownian dynamics, since they produce the largest changes in barrier height when the rate parameters are increased. Fig. 12(A) shows the amplitude distribution for different k_{s} rates. Fig. 12(B) shows that the amplitude center and the standard deviation σ both decrease when the rate k_{s} increases. Fig. 12(C) shows the period distribution for different k_{s} rates, and Fig. 12(D) shows that the period center decreases and the standard deviation σ increases when k_{s} increases. This implies that the fluctuations in the period, measured by the variance, increase as k_{s} increases. Therefore the network becomes less stable and coherent owing to the trend of larger fluctuations.
Figure 12. Amplitude and period distribution changes with respect to rate k_{s}. A: Amplitude distribution for different k_{s} rates. B: Amplitude center and the standard deviation σ versus the chemical rate k_{s}. C: Period distribution for different k_{s} rates. D: Period center and the standard deviation σ versus the rate k_{s}.
Fig. 13(A) shows the amplitude distribution for different v_{m} rates. Fig. 13(B) shows that the amplitude center and the standard deviation σ both decrease when the rate v_{m} increases. Fig. 13(C) shows the period distribution for different v_{m} rates, and Fig. 13(D) shows that the period center and the standard deviation σ both decrease when v_{m} increases. This implies that the fluctuations in the period and amplitude, measured by the variance, decrease as v_{m} increases. Therefore the network becomes more stable and coherent owing to the trend of smaller fluctuations.
Figure 13. Amplitude and period distribution changes with respect to rate v_{m}. A: Amplitude distribution for different chemical rates v_{m}. B: Amplitude center and the standard deviation σ versus the rate v_{m}. C: Period distribution for different v_{m }rates. D: Period center and the standard deviation σ versus the rate v_{m}.
2.1. Mathematical material
We can study the network of chemical reactions in fluctuating environments:

dx/dt = F(x) + ζ(t),

where x = {x_{1}(t), x_{2}(t), ..., x_{n}(t)} is the concentration vector, each component of which represents a different protein species in the network, and F(x) = {F_{1}(x), F_{2}(x), ..., F_{n}(x)} is the chemical reaction rate flux vector of the chemical reactions, which are often nonlinear in the protein concentrations x (for example, enzymatic reactions). The fluctuation force ζ(t) is modeled as Gaussian white noise with zero mean, <ζ(t)> = 0.
Here δ(t) is the Dirac delta function and the diffusion matrix D is explicitly defined by <ζ_{i}(t)ζ_{j}(t')> = 2D_{ij}δ(t - t'). The average <...> is carried out over the Gaussian distribution of the noise. The resulting evolution trajectories of the protein concentrations are therefore stochastic, so instead of following a deterministic path of probability one, we need to quantify the probability of specific paths. The probabilistic description is more appropriate for a system in a fluctuating environment. The probabilistic evolution follows a Fokker-Planck diffusion equation, as discussed in the main text.
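In an Euler-Maruyama discretisation with time step dt, white noise with correlation <ζ_{i}(t)ζ_{j}(t')> = 2D_{ij}δ(t - t') is realised as independent Gaussian increments of variance 2D dt per step. A quick numerical check of this (the function name is ours):

```python
import math, random

def noise_increments(D, dt, n, seed=3):
    """Sample n Euler-Maruyama noise kicks for one component."""
    rng = random.Random(seed)
    return [math.sqrt(2 * D * dt) * rng.gauss(0, 1) for _ in range(n)]

D, dt, n = 0.25, 0.01, 200_000
w = noise_increments(D, dt, n)
# same-time correlation: empirical variance should be close to 2 D dt
var = sum(v * v for v in w) / n
# different-time correlation: lag-1 covariance should be close to 0
lag1 = sum(a * b for a, b in zip(w, w[1:])) / (n - 1)
```

The vanishing lag-1 covariance is the discrete counterpart of the delta-correlated (memoryless) noise assumed in the main text.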
We can explore the long-time steady-state properties and collect the statistics to obtain the steady-state distribution function P_{0}(x) for the state variable x (here representing the protein concentrations of the network). In equilibrium systems, where there is a potential U whose gradient gives the force, P_{0}(x) is exponentially related to the potential energy function U(x), so we obtain the steady-state probability from U. For a nonequilibrium system, we do not know the potential a priori, but we can obtain the steady-state probability by solving the probabilistic evolution equation and taking the long-time limit. In analogy with the equilibrium case, we can define the generalized potential U for the nonequilibrium system from the steady-state probability [4,6,7,9,10,22,28]:

U(x) = -ln P_{0}(x),
with the partition function Z = ∫ dx exp{-U(x)}. The rationale for defining the nonequilibrium potential in this way, via the decomposition of the driving force into the gradient of a potential and a curl flux, is given earlier in the main text. From the steady-state distribution function, we can therefore identify U as the generalized potential function of the network system. In this way we map out the potential landscape, and once we have the potential landscape, we can discuss the global stability of the protein cellular networks.
3. Conclusion
We have shown that we can explore the global features of this circadian rhythm model. Finding the potential landscape and the associated flux is the key to addressing the robustness of such networks. We have uncovered the underlying potential landscape of a circadian clock by explicitly constructing the probability of the states of the protein network, solving the corresponding probabilistic diffusion equation. The landscape of the oscillation has an irregular and inhomogeneous closed-ring valley or doughnut-like shape. We also found that the flux along the cycle path is the driving force for the coherent oscillation. The potential barrier height for escaping from the limit cycle attractor determines the robustness and stability of the network oscillations. As the diffusion coefficient becomes smaller, the potential barrier becomes higher and the statistical fluctuations are more strongly suppressed, leading to the robustness of the biological limit cycle basin of the protein network.
We observe that the global dissipation, in terms of the entropy production of the whole system, increases when the diffusion coefficient D increases. The period and amplitude distributions become more widely dispersed when D increases, and the phase coherence decreases. These are three ways of characterizing the robustness of the oscillation, in addition to the barrier height of the basin of attraction. Low entropy production might serve as a design principle for robust networks.
The robustness, coherence, and dissipation of the circadian oscillations with respect to changes in the rate parameters can be studied as well. We found that the protein synthesis rate k_{s} is crucial in maintaining the robustness of the network.
Acknowledgements
JW thanks the support from a National Science Foundation Career Award and the American Chemical Society Petroleum Research Fund. LX and EKW are supported by the National Natural Science Foundation of China (Grants 90713022 and 20735003).
References

Gonze D, Halloy J, Goldbeter A: Proc Natl Acad Sci USA 2002, 99:673-678.

J Chem Phys 2002, 116:10997-11010.

J Phys A Math Gen 2004, 37:L25-L30.

Funct Integr Genomics 2004, 4:188-195.

Biophys Chem 2005, 114:213-220.

Phys Rev Lett 2005, 94:028101.

Wang J, Huang B, Xia XF, Sun ZR: Biophys J 2006, 91:L54-L56.

Biophys J 2007, 92:3755.

Wang J, Huang B, Xia XF, Sun ZR: PLoS Comput Biol 2006, 2:e147.

PLoS Comput Biol 2007, 3(3):e60.

Proc Natl Acad Sci USA 2008, 105:6039.

Proc Natl Acad Sci USA 2008, 105:12271.

Proc Natl Acad Sci USA 1997, 94:814-819.

Nature 2000, 403:335-338.

Swain PS, Elowitz MB, Siggia ED: Proc Natl Acad Sci USA 2002, 99:12795-12800.

Proc Natl Acad Sci USA 2001, 98:8614-8619.

Vilar JMG, Guet CC, Leibler S: J Cell Biol 2003, 161:471-476.

Nature 2004, 427:415-418.

Hasty J, Pradines J, Dolnik M, Collins JJ: Proc Natl Acad Sci USA 2000, 97:2075-2080.

Hasty J, Isaacs F, Dolnik M, McMillen D, Collins JJ: Chaos 2001, 11:207-220.

Gardiner CW: Handbook of Stochastic Methods for Physics, Chemistry and the Natural Sciences. Berlin: Springer-Verlag; 1985.

van Kampen NG: Stochastic Processes in Chemistry and Physics. Amsterdam: North-Holland; 1992.

J Phys Chem 1977, 81:2340-2361.

Genetics 1998, 149:1633-1649.

Biophys J 2001, 81:3116-3136.

Qian H, Saffarian S, Elson EL: Proc Natl Acad Sci USA 2002, 99:10376-10381.

Proc Natl Acad Sci USA 2003, 100:2374-2379.

Walczak AM, Sasai M, Wolynes PG: Biophys J 2005, 88:828-850.

Austin RH, Beeson K, Eisenstein L, Frauenfelder H, Gunsalus I: Biochemistry 1975, 14:5355-5373.

Fisher RA: The Genetical Theory of Natural Selection. Oxford: Clarendon; 1930.

Wright S: The roles of mutation, inbreeding, crossbreeding and selection in evolution. Proceedings of the Sixth International Congress on Genetics 1932, 356-366.

Unites Biologiques Douees de Continuite Genetique. Colloques Internationaux du Centre National de la Recherche Scientifique, Paris: CNRS; 1949.

Waddington CH: The Strategy of the Genes. London: Allen and Unwin; 1957.

Frauenfelder H, Sligar SG, Wolynes PG: Science 1991, 254:1598-1603.

Wolynes PG, Onuchic JN, Thirumalai D: Science 1995, 267:1619-1622.

Phys Rev Lett 2003, 90:188101.

Yoda M, Ushikubo T, Inoue W, Sasai M: J Chem Phys 2007, 126:115101.

Science 2002, 295:1669-1673.

Proc Natl Acad Sci USA 1996, 93:10078-10082.

Eur J Biochem 2000, 267:1583-1593.

Science 2001, 292:929-933.

Nature 2000, 407:651-654.