# Robustness, dissipations and coherence of the oscillation of circadian clock: potential landscape and flux perspectives

Jin Wang^{1, 2}, Li Xu^{1, 3} and Erkang Wang^{1}

**1**:7

**DOI: **10.1186/1757-5036-1-7

© Wang et al 2008

**Received: **29 August 2008

**Accepted: **30 December 2008

**Published: **30 December 2008

## Abstract

Finding the global probabilistic nature of a non-equilibrium circadian clock is essential for addressing important issues of robustness and function. We have uncovered the underlying potential energy landscape of a simple cyanobacteria biochemical network, and the corresponding flux, which is the driving force for the oscillation. We found that the underlying potential landscape for the oscillation in the presence of small statistical fluctuations has an explicit closed-ring valley or doughnut shape in the three-dimensional protein concentration space. The barrier height separating the oscillation ring from the rest of the space provides a quantitative measure of the oscillation robustness, and it decreases when the fluctuations increase. We also found that the entropy production rate, which characterizes the dissipation or heat loss, decreases as the fluctuations decrease. In addition, as the fluctuations increase, the period and the amplitude of the oscillations become more dispersed, and the phase coherence decreases. We further explored the effects of the inherent chemical rate parameters on these measures of robustness. Our approach is quite general and can be applied to other oscillatory cellular networks.

**PACS Codes:** 87.18.-h, 87.18.Vf, 87.18.Yt

## 1. Introduction

Circadian rhythms are an intracellular timing mechanism, widespread in living organisms, with a period of about 24 h, which fits the day/night alternation of the Earth and allows adaptation to the fluctuating environment. In Neurospora, Arabidopsis, Drosophila, and mammals, transcription-translation-derived oscillations originating from negative feedback regulation of clock genes have been modeled at the molecular level. The study of this oscillation behavior in an integrated and coherent way is crucial in modern systems biology for understanding how these rhythms function biologically. The underlying nature of the rhythmic behavior has been explored by many experimental and theoretical methods [1, 2]. However, so far there have been only limited theoretical studies explaining biological oscillation behavior from global and physical perspectives.

In the model, the clock gene is transcribed into *mRNA* (*M*). Next, the *mRNA* is transported into the cytosol and translated into PER protein (*Pc*). The protein is then transported into the nucleus and becomes the nuclear form *P*_{N} in a reversible manner. Finally, *P*_{N} represses the transcription of the gene. Effectively, therefore, the network acts as a self-repression with time delay, in which oscillatory behavior is expected.

It is important to address the robustness and stability of the circadian system and its associated oscillation patterns. There are many possible states in such systems, and it is difficult to explore all of them and the associated global nature of the network [3–12]. Fortunately, not all the states have the same weights or probabilities of occurring, due to the intrinsic statistical fluctuations from the finite number of molecules in the cell and the external fluctuations from the highly dynamical and inhomogeneous environments in the cell [13–20].

Therefore, instead of the averaged deterministic network of chemical rate equations, we developed a probabilistic description to model the corresponding cellular process, taking into account the intrinsic and external fluctuations. This can be realized by constructing a master equation for the intrinsic fluctuations, or a diffusion equation for external fluctuations, for the time-dependent evolution of the probability rather than the average concentration of the corresponding deterministic chemical reaction network equations [16, 21–25]. Even for the intrinsic fluctuations, we can simplify the master equation into a Fokker-Planck diffusion-like equation in the weak noise limit, representing typical kinetic Markovian behavior with concentration-dependent diffusion coefficients [26]. So here we use the diffusion equation to approximate the system probabilistically under the influence of either internal or external fluctuations.

By solving the Fokker-Planck diffusion equation, we can obtain the probability distribution in protein concentrations evolving in time. We can also uncover the long-time steady-state probability of this chemical reaction network in analogy to the equilibrium system, where the global thermodynamic properties can be explored using the inherent interaction potentials. We will study the global stability by exploring the underlying potential landscape for the above-mentioned non-equilibrium protein network. The generalized potential energy can be shown to be closely associated with the steady-state probability of the non-equilibrium network in general and has been applied to a few systems [3–12, 21, 22, 27–29]. Once the network problem is formulated in terms of the generalized potential function or potential landscape, the issue of global stability or robustness is much easier to address [8, 30]. We notice that although the individual averaged deterministic trajectories of a non-linear chemical reaction system might be very chaotic and complex, the corresponding statistical probabilistic distributions, or the underlying landscapes, which are dictated by linear evolution equations (master equations or diffusion equations), are usually quite ordered and can often be characterized globally.

The adaptive landscape idea was first introduced into biology by S. Wright, Delbruck and Waddington [31–34]. However, the link between the dynamics and the probabilistic landscape is not clear in that work. Energy landscape ideas were pushed forward by Hans Frauenfelder [35] on protein dynamics and then P. G. Wolynes and J. N. Onuchic [36] on protein folding and interactions [37]. These ideas on proteins were based on an equilibrium approach and on knowing the potentials a priori. For a non-equilibrium system, the potential landscape is not known a priori and needs to be uncovered. In fact it is the purpose of this paper to study the global robustness of oscillation with respect to the fluctuations in the cell, directly using the properties of the non-equilibrium potential landscape, which is linked to the steady state probability of the network. This provides a basis for exploring the global and physical mechanism of biochemical oscillation.

The deterministic kinetics of the model are governed by the following rate equations (1–3), reconstructed here from the parameter definitions below:

$\frac{dM}{dt} = v_s \frac{k_I^n}{k_I^n + P_N^n} - v_m \frac{M}{k_m + M}$

$\frac{dPc}{dt} = k_s M - v_d \frac{Pc}{k_d + Pc} - k_1 Pc + k_2 P_N$

$\frac{dP_N}{dt} = k_1 Pc - k_2 P_N$

where *M* is the concentration of the clock gene *mRNA*, *Pc* is the concentration of the cytosolic protein, and *P*_{N} is the concentration of the nuclear form of the clock protein. The parameter *v*_{s} represents the maximum rate of transcription, and *v*_{m} is the maximum rate of *mRNA* degradation, with the Michaelis constant *k*_{m}. *kI* is the threshold beyond which the nuclear protein represses the transcription of the *per* gene. The Hill coefficient *n* characterizes this repression. *k*_{s} is the rate of protein synthesis, and *v*_{d} is the maximum rate of protein degradation, with Michaelis constant *k*_{d}. *k*_{1} and *k*_{2} are the first-order rate constants characterizing the transport of the protein into and out of the nucleus. The negative autoregulatory feedback is the origin of the oscillations.

The time evolution of the probability distribution *P* obeys a diffusion equation in the three concentration variables *M*, *Pc* and *P*_{N}:

$\frac{\partial P}{\partial t} = -\nabla \cdot (\mathbf{F} P) + \nabla \cdot (D \cdot \nabla P)$

*D* is the diffusion coefficient tensor (or matrix); here we use a uniform isotropic diagonal matrix for simplicity. We set the vector **x** = (*M*, *Pc*, *P*_{N}). We can rewrite the diffusion equation as $\frac{\partial P}{\partial t} + \nabla \cdot J(\mathbf{x}, t) = 0$ and define the flux vector of the system as:

$\mathbf{J}(\mathbf{x}, t) = \mathbf{F} P - D \cdot \frac{\partial}{\partial \mathbf{x}} P$

or in component notation, *J*_{1}(*M*, *Pc*, *P*_{N}, *t*) = *F*_{1}(*M*, *Pc*, *P*_{N})*P* - $D\frac{\partial}{\partial M}P$, *J*_{2}(*M*, *Pc*, *P*_{N}, *t*) = *F*_{2}(*M*, *Pc*, *P*_{N})*P* - $D\frac{\partial}{\partial Pc}P$, and *J*_{3}(*M*, *Pc*, *P*_{N}, *t*) = *F*_{3}(*M*, *Pc*, *P*_{N})*P* - $D\frac{\partial}{\partial P_N}P$. **J** is the steady-state probability flux when ∇·**J**(**x**, *t*) = 0. It is obvious that in the steady state the divergence of **J** must vanish. One cannot conclude, however, that **J** itself must vanish. Only in the equilibrium situation, where the system satisfies detailed balance, is **J** = 0. For a non-equilibrium system in general, the steady state contains a circulating flow with nonzero curl. This is because ${J}_{ss}=F{P}_{ss}-D\cdot \frac{\partial}{\partial \mathbf{x}}{P}_{ss}$. Therefore, $F=D\cdot \frac{\partial}{\partial \mathbf{x}}{P}_{ss}/{P}_{ss}+{J}_{ss}/{P}_{ss}=-D\cdot \frac{\partial}{\partial \mathbf{x}}(-\mathrm{ln}{P}_{ss})+{J}_{ss}/{P}_{ss}=-D\cdot \frac{\partial}{\partial \mathbf{x}}U+{J}_{ss}/{P}_{ss}$, where *P*_{ss} stands for the steady-state probability and *U* = -ln *P*_{ss}. Although **F** in general cannot be represented as a potential gradient, the driving force for the dynamics of non-equilibrium network systems can be decomposed into two terms: one associated with the gradient of a potential closely linked to the steady-state probability, and the other associated with a divergence-free field. The divergence-free field has no sources or sinks to start or end the force lines and is therefore recurrent or rotational in nature [12].

Once we solve for the steady state probability from the probabilistic diffusion equation, we can study the underlying properties of the potential (or potential landscape) by the relation: *U*(**x**) = -ln *P*(**x**) [4, 6–12, 22, 28]. We use this relationship for our non-equilibrium systems (with no detailed balance, or equivalently *F* ≠ -∇*U*) in analogy with equilibrium systems. However, unlike in equilibrium systems where only the steady-state probability is needed to characterize the global properties of the whole system, in non-equilibrium systems both the underlying landscape and the associated flux are essential for characterizing the global steady state properties as well as the dynamics of the protein network.
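As a concrete illustration of these relations, the potential *U* = -ln *P*_{ss} and the flux components *J*_{i} = *F*_{i}*P* - *D* ∂*P*/∂*x*_{i} can be evaluated numerically on a grid. The sketch below uses a hypothetical two-dimensional Gaussian distribution and a made-up force field with a rotational part, purely to show the bookkeeping:

```python
import numpy as np

# Toy grid and a hypothetical steady-state distribution (illustration only).
x = np.linspace(-3.0, 3.0, 61)
dx = x[1] - x[0]
X, Y = np.meshgrid(x, x, indexing="ij")

P = np.exp(-(X**2 + Y**2))
P /= P.sum() * dx * dx          # normalise the probability on the grid

U = -np.log(P)                  # generalised potential U(x) = -ln P_ss(x)

D = 0.1
Fx = -X + Y                     # hypothetical force: gradient part plus rotation
Fy = -Y - X
dPdx, dPdy = np.gradient(P, dx, dx)
Jx = Fx * P - D * dPdx          # flux components J = F P - D dP/dx
Jy = Fy * P - D * dPdy
```

The minimum of *U* coincides with the maximum of *P*_{ss}, and any residual rotational part of **J** at the steady state signals broken detailed balance.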

## 2. Results and discussion

The parameter values are *v*_{s} = 0.5 *nM·h*^{-1}, *v*_{m} = 0.3 *nM·h*^{-1}, *v*_{d} = 1.5 *nM·h*^{-1}, *k*_{s} = 2.0 *h*^{-1}, *k*_{1} = 0.2 *h*^{-1}, *k*_{2} = 0.2 *h*^{-1}, *k*_{m} = 0.2 *nM*, *kI* = 2.0 *nM*, *k*_{d} = 0.1 *nM*, and *n* = 4. The limit cycle for these values is attractive.
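The deterministic trajectory for these parameter values can be obtained by direct numerical integration of the rate equations. The following is a minimal sketch with a hand-rolled fourth-order Runge-Kutta step (the function and variable names are our own):

```python
import numpy as np

PAR = dict(vs=0.5, vm=0.3, vd=1.5, ks=2.0, k1=0.2, k2=0.2,
           km=0.2, kI=2.0, kd=0.1, n=4)

def F(x):
    """Rate equations (1-3) for (M, Pc, PN)."""
    M, Pc, PN = x
    return np.array([
        PAR["vs"] * PAR["kI"]**PAR["n"] / (PAR["kI"]**PAR["n"] + PN**PAR["n"])
            - PAR["vm"] * M / (PAR["km"] + M),
        PAR["ks"] * M - PAR["vd"] * Pc / (PAR["kd"] + Pc)
            - PAR["k1"] * Pc + PAR["k2"] * PN,
        PAR["k1"] * Pc - PAR["k2"] * PN,
    ])

def rk4(x, dt):
    """One classical fourth-order Runge-Kutta step."""
    k1 = F(x)
    k2 = F(x + 0.5 * dt * k1)
    k3 = F(x + 0.5 * dt * k2)
    k4 = F(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

dt, steps = 0.05, 8000          # 400 h of simulated time
traj = np.empty((steps, 3))
x = np.array([0.1, 0.1, 0.1])
for i in range(steps):
    x = rk4(x, dt)
    traj[i] = x
```

After discarding the initial transient, the oscillation can be read off from the peak-to-trough range of *M* over the remainder of the trajectory.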

We solve the Fokker-Planck equation using both the reflecting boundary condition **J** = 0 and the absorbing boundary condition. The results are similar; we choose the reflecting boundary condition in this paper. With certain initial conditions (both homogeneous and inhomogeneous), we obtain the steady-state probability distribution *P*_{ss} using the finite difference method in the long-time limit. Then we use *U*(**x**) = -ln *P*_{ss}(**x**) to obtain the generalized non-equilibrium potential function (landscape) of the circadian clock.

To visualize the landscape, we integrate out one of the three variables of the steady-state probability *P*(*M*, *Pc*, *P*_{N}, *t* → ∞) to reduce the dimension to two, using marginalization formulas of the form $P(M, Pc) = \int P(M, Pc, P_N)\, dP_N$.

Fig. 3 displays the potential landscape in the region *U* ≤ 25, while the region *U* > 25 is rendered transparent. The closed doughnut lies around the deterministic solution, which represents lower potential and correspondingly higher probability along the oscillation trajectories. The potential is higher (and the probability lower) outside the doughnut; this means the system is attracted to the doughnut. We found that the potential landscape is distributed inhomogeneously along the oscillation ring. The potential is lower at states where the system stays longer, which is determined by the speed at which the system passes through each state in the averaged deterministic oscillation. So the potential landscape and the steady-state probability along the oscillation are not uniform, due to the inhomogeneity of the time spent at each state (or the passing speed at that state) on the oscillation ring. As shown in Fig. 3B (right panel), we also observe that the doughnut of the potential landscape becomes thicker or fatter, and the values of the potential landscape along the limit cycle become comparable to or even smaller than those outside the limit cycle, when the diffusion coefficient *D* increases. A further increase in the fluctuations will eventually destroy circadian rhythmicity. This is because the attraction of the limit cycle becomes weaker and the time spent on the limit cycle becomes shorter. The system transforms from a clear, robust oscillation under small fluctuations to no oscillation under high fluctuations.

The concentrations of *M*, *Pc*, and *P*_{N} over a period are shown in Fig. 4(A). The forces on *M*, *Pc*, and *P*_{N} over the period are shown in Fig. 4(B). The speed along the cycle is shown in Fig. 4(C). Fig. 4(D) shows the corresponding limit cycle with the time marks. The stars on the limit cycle show where the values of the force and the speed have been recorded at given times. The speed along the limit cycle has two maxima, at which the amount of time spent is smaller than in other parts of the phase space. Thus, the steady-state probability distribution is larger where the speed is slower [1].

The divergence of the flux is equal to zero at steady state. In an equilibrium system, the flux **J** = 0 (detailed balance). But in a non-equilibrium system, the flux is a curl field (*J* = ∇ × *A* in three dimensions where A is a vector field). Fig. 3(A) (left panel) shows the probability flux on the closed ring landscape of the limit cycle. We can see clearly the direction of the flux near the ring is parallel to the oscillation path, like a curl.

So the flux force is the driving force for the oscillation. The potential landscape attracts the system to the closed ring, and the flux force keeps the probability flowing along the ring, providing the driving force for oscillation. The flux force plays a more important role along the closed ring than outside the ring, where large ∇*U*-type forces dominate. Therefore, the interplay of the landscape and the flux force is the most important characteristic of a non-equilibrium system.

Fig. 5 shows the barrier heights versus the diffusion coefficient *D*. *Barrier*1 is defined as *U*_{fix} minus *U*_{max}, and *Barrier*2 as *U*_{fix} minus *U*_{min}, where *U*_{fix} is the local potential maximum inside the limit cycle, *U*_{max} is the potential maximum along the limit cycle, and *U*_{min} is the potential minimum along the limit cycle. We can see that the barrier height becomes larger when the fluctuations decrease. It is harder for the system to escape from the doughnut of attraction when the fluctuations are small. This means the doughnut shape of the landscape is robust, and a stable oscillation is essentially guaranteed for small fluctuations. It also implies that the barrier height can be used as a quantitative measure of the stability and robustness of the network oscillations.
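In code, the two barrier measures are simple differences once *U* has been sampled along the ring and at the interior fixed point. The numbers below are made up purely for illustration:

```python
import numpy as np

# Hypothetical potential values (illustration only):
U_ring = np.array([4.0, 3.2, 2.5, 3.0, 4.5, 5.1, 4.8, 3.9])  # U sampled along the limit cycle
U_fix = 9.0                       # local potential maximum inside the ring

barrier1 = U_fix - U_ring.max()   # Barrier1 = U_fix - U_max
barrier2 = U_fix - U_ring.min()   # Barrier2 = U_fix - U_min
```

Larger barriers mean a deeper ring valley, and hence a more robust oscillation.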

The rate of entropy change of the system at temperature *T* can be written as $\dot{S} = e_p - h_d$, where $e_p = -\int (k_B T \nabla \mathrm{ln} P - \mathbf{F}) \cdot \mathbf{J}\, d\mathbf{x}$ is the entropy production rate [38], and $h_d = \int \mathbf{F} \cdot \mathbf{J}\, d\mathbf{x}$ is the mean rate of heat dissipation. For a steady state, $\dot{S}$ = 0, and the entropy production *e*_{p} is equal to the heat dissipation *h*_{d}. In Fig. 6(A), we can see that the dissipation (entropy production rate) decreases as the diffusion coefficient characterizing the fluctuations decreases; this shows that a robust oscillation with less fluctuation dissipates less energy and is more stable. From Fig. 6(B), we also find that less dissipation leads to higher barrier heights for escaping from the oscillation cycle, and therefore a more stable network. So minimization of the dissipation cost might serve as a design principle for the evolution of the network: because the entropy production is a global characterization of the circadian network, it is intimately related to the robustness of the network.

The effect of the fluctuations characterized by *D* can be quantified further by the phase coherence *ξ*, which measures the degree of periodicity of the time evolution of a given variable [39]. The phase coherence *ξ* quantitatively measures the degree of persistence of the oscillatory phase and is defined as follows. First, the vector *N*(*t*) = *n*_{1}(*t*)**e**_{1} + *n*_{2}(*t*)**e**_{2} + *n*_{3}(*t*)**e**_{3} is formed, as shown in Fig. 7. The unit vectors are **e**_{1} = (0, 1), **e**_{2} = (-$\sqrt{3}$/2, -1/2) and **e**_{3} = ($\sqrt{3}$/2, -1/2), and *n*_{1}(*t*), *n*_{2}(*t*), and *n*_{3}(*t*) are the concentrations of the three kinds of molecules at time *t*.

*φ*(*t*) is the phase angle between *N*(*t*) and *N*(*t* + *τ*), where *τ* should be smaller than the deterministic period and larger than the time scale of the fast fluctuations. We choose *τ* = 0.2 *k*^{-1}. The oscillation goes in the positive orientation (counterclockwise), so *φ*(*t*) > 0. The formula for *ξ* is then:

$\xi = \frac{{\sum}_{t} \phi(t)\, \theta(\phi(t))}{{\sum}_{t} |\phi(t)|}$

Here *θ*(*φ*) = 1 when *φ*(*t*) > 0, *θ*(*φ*) = 0 when *φ*(*t*) ≤ 0, and the sums are taken over every time step of the simulated trajectory.

*ξ* ≈ 0 means the system moves stochastically and has no coherence; the oscillation is most coherent when *ξ* is close to 1. The value of *ξ* becomes larger when the fluctuations are smaller, since the trajectories are more periodic in their evolution. Fig. 8(A) shows that *ξ* decreases as the diffusion coefficient increases, implying that the coherence of the oscillation can be destroyed by fluctuations. Conversely, less fluctuation yields a more coherent, robust, and stable system. We also see that *ξ* becomes larger with a lower heat loss or entropy production rate, and conclude that less dissipation leads to more coherence. We further see that *ξ* increases with barrier height (Fig. 8(B)). This shows that a less dissipative network tends to preserve the coherence of the oscillations.
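The phase-coherence computation can be sketched as follows. This example is our own construction: it applies the definition, with three unit vectors 120° apart, to a clean synthetic counterclockwise oscillation, for which *ξ* should come out at 1:

```python
import numpy as np

# Unit vectors 120 degrees apart: e1, e2, e3.
e = np.array([[0.0, 1.0],
              [-np.sqrt(3) / 2, -0.5],
              [np.sqrt(3) / 2, -0.5]])

# Synthetic concentrations n1, n2, n3: a clean oscillation with a 24 h period,
# with phases staggered by 120 degrees (standing in for M, Pc, PN).
t = np.arange(0.0, 240.0, 0.1)
shift = np.array([0.0, 2 * np.pi / 3, 4 * np.pi / 3])
n = 1.0 + 0.5 * np.cos(2 * np.pi * t[:, None] / 24.0 - shift)

N = n @ e                            # N(t) = n1 e1 + n2 e2 + n3 e3

lag = 2                              # tau = lag * dt, well below the period
a, b = N[:-lag], N[lag:]
cross = a[:, 0] * b[:, 1] - a[:, 1] * b[:, 0]
dot = (a * b).sum(axis=1)
phi = np.arctan2(cross, dot)         # signed phase angle between N(t) and N(t + tau)

xi = np.sum(phi * (phi > 0)) / np.sum(np.abs(phi))
```

For a noisy trajectory, negative phase increments appear and *ξ* drops below 1, which is exactly what Fig. 8 quantifies.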

We next vary the diffusion coefficient *D* to illustrate the robustness of the circadian oscillation. We solve the chemical reaction network equations under fluctuations reflecting external noise. To assess the effect of molecular noise on circadian oscillations, we have used stochastic Brownian dynamics to perform stochastic simulations of the deterministic model governed by equations (1–3). Fig. 9(A) and 9(B) show the distributions of the period of the oscillations calculated over 2000 successive cycles. When the fluctuations increase, the distribution becomes more spread out, but the mean period and mean amplitude remain close to the deterministic values. In Fig. 9(C), the standard deviation *σ* from the mean increases when the fluctuations increase [1, 2]. This implies that smaller fluctuations lead to more coherent oscillations. We also see that the period distribution becomes less dispersed when the entropy production rate decreases. This shows that a less dissipative network can lead to a more coherent oscillation with a unique period instead of a distribution of periods. In Fig. 9(D), we see that higher barrier heights lead to less dispersed period distributions.
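Measuring a period distribution from successive cycles of a noisy trajectory can be sketched as follows. For checkability, this example uses a synthetic noisy sinusoid with a 24 h period standing in for a stochastic trajectory; the smoothing window and the half-period filter are our own choices:

```python
import numpy as np

rng = np.random.default_rng(0)

dt, T0 = 0.05, 24.0                          # time step (h), deterministic period (h)
t = np.arange(0.0, 4800.0, dt)               # 200 cycles
sig = np.sin(2 * np.pi * t / T0) + 0.1 * rng.standard_normal(t.size)

# Light moving-average smoothing so noise does not create spurious crossings.
w = 41
s = np.convolve(sig, np.ones(w) / w, mode="same")

# Successive periods = spacing of upward zero crossings.
up = np.where((s[:-1] <= 0.0) & (s[1:] > 0.0))[0]
periods = np.diff(t[up])
periods = periods[periods > T0 / 2]          # drop jitter-induced double crossings

mean_T, sigma = periods.mean(), periods.std()
```

The histogram of `periods` and the standard deviation *σ* are the quantities plotted in Fig. 9; stronger noise broadens the histogram while leaving the mean near the deterministic period.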

Fig. 10 shows the distribution of the amplitude of *M* with increasing *D*. The distribution becomes more dispersed but stays close to the deterministic value as the fluctuations increase (Fig. 10(A)). The standard deviation *σ* increases as *D* goes up (Fig. 10(B)), again showing that smaller fluctuations lead to more robust oscillations.

We also explored the effects of the inherent chemical rate parameters on robustness; *q* is the percentage by which the rate constants are increased or decreased. Fig. 11(B) shows the barrier height (solid line) and the entropy production rate (dashed line) versus the six rate parameters. When the rate parameters *k*_{1} and *v*_{m} increase, the barrier height increases and the entropy production rate decreases, and the system becomes more stable and robust. When the other four rate parameters (*k*_{2}, *k*_{s}, *v*_{d}, *v*_{s}) increase, the barrier height decreases and the entropy production rate increases, and the system becomes less stable and robust.

We select *k*_{s} and *v*_{m} to further explore the period and amplitude using stochastic Brownian dynamics, since they show the largest changes of the barrier height as the rate parameters increase. Fig. 12(A) shows the amplitude distribution for different *k*_{s} rates. Fig. 12(B) shows that the amplitude center and the standard deviation *σ* both decrease when the rate *k*_{s} increases. Fig. 12(C) shows the period distribution for different *k*_{s} rates, and Fig. 12(D) shows that the period center decreases and the standard deviation *σ* increases when the rate *k*_{s} increases. This implies that the fluctuations in the period, measured by the variance, increase as *k*_{s} increases. Therefore the network becomes less stable and coherent due to the trend of larger fluctuations.

Fig. 13(A) shows the amplitude distribution for different *v*_{m} rates. Fig. 13(B) shows that the amplitude center and the standard deviation *σ* both decrease when the rate *v*_{m} increases. Fig. 13(C) shows the period distribution for different *v*_{m} rates, and Fig. 13(D) shows that the period center and the standard deviation *σ* both decrease when the rate *v*_{m} increases. This implies that the fluctuations in the period and amplitude, measured by the variance, decrease as *v*_{m} increases. Therefore the network becomes more stable and coherent due to the trend of smaller fluctuations.

### 2.1. Mathematical material

Here **x** = {*x*_{1}(*t*), *x*_{2}(*t*), ..., *x*_{n}(*t*)} is the concentration vector, with each component representing a different protein species in the network. **F**(**x**) = {*F*_{1}(**x**), *F*_{2}(**x**), ..., *F*_{n}(**x**)} is the chemical reaction rate flux vector for the chemical reactions, which are often non-linear in the protein concentrations **x** (for example, enzymatic reactions). The equations $\dot{x}$ = **F**(**x**) describe the averaged dynamical evolution of the chemical reaction network. As mentioned, in the cell the fluctuations from both internal and external sources can be very significant [40–44] and in general cannot be ignored. A term *ζ* mimicking these fluctuations is added, with an assumed Gaussian distribution (from the central limit theorem in statistics). The autocorrelations of the noise are then given by:

Here *δ*(*t*) is the Dirac delta function and the diffusion matrix *D* is explicitly defined by <*ζ*_{i}(*t*)*ζ*_{j}(*t'*)> = 2*D*_{ij}*δ*(*t* - *t'*). The average <...> is carried out over the Gaussian distribution of the noise. The resulting evolution trajectories of the protein concentrations are therefore stochastic. Instead of following a deterministic path with probability one, we now need to quantify the probability of specific paths. The probabilistic description is more appropriate for the system in a fluctuating environment. The probabilistic evolution follows a Fokker-Planck diffusion equation, as discussed in the main text.
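The stochastic dynamics $\dot{x}$ = **F**(**x**) + *ζ* with the above noise correlations can be simulated with the Euler-Maruyama scheme. For checkability, this sketch uses a one-dimensional linear force *F*(*x*) = -*kx* (our own stand-in, not the circadian model), whose stationary variance is known analytically to be *D*/*k*:

```python
import numpy as np

rng = np.random.default_rng(1)

k, D = 1.0, 0.5                  # relaxation rate and diffusion coefficient
dt, steps = 0.01, 200_000

x = 0.0
samples = np.empty(steps)
for i in range(steps):
    # Euler-Maruyama: deterministic drift plus Gaussian noise of variance 2*D*dt,
    # matching <zeta(t) zeta(t')> = 2 D delta(t - t').
    x += -k * x * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal()
    samples[i] = x

var = samples[steps // 10:].var()   # stationary variance, transient discarded
```

For the full network, the same update is applied componentwise with **F** taken from equations (1–3).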

The global nature of the network can be characterized by the steady-state probability *P*_{0}(**x**) of the state variable **x** (representing the protein concentrations of the protein network in this case). In equilibrium systems, where the force is the gradient of a potential *U*, *P*_{0}(**x**) is exponentially related to the potential energy function *U*(**x**), so the steady-state probability is obtained from *U*. For a non-equilibrium system, we do not know the potential a priori, but we can obtain the steady-state probability by solving the probabilistic evolution equation and taking the long-time limit. In analogy with the equilibrium case, we can define the generalized potential *U* for the non-equilibrium case from the steady-state probability [4, 6, 7, 9, 10, 22, 28]:

$P_0(\mathbf{x}) = \frac{1}{Z} \mathrm{exp}\{-U(\mathbf{x})\}$

with the partition function *Z* = ∫ *d* **x** exp{-*U*(**x**)}. The rationale for defining the non-equilibrium potential this way was given earlier in the main text: the driving force for the dynamics decomposes into the gradient of a potential and a curl flux. From the steady-state distribution function, we can therefore identify *U* as the generalized potential function of the network system. In this way, we map out the potential landscape. Once we have the potential landscape, we can discuss the global stability of the protein cellular networks.

## 3. Conclusion

We have shown that we can explore the global features of the circadian rhythm model. Finding the potential landscape and the associated flux is the key to addressing the robustness of the network. We have uncovered the underlying potential landscape of a circadian clock. This is realized by explicitly constructing the probability of the states of the protein network through solving the corresponding probabilistic diffusion equation. The landscape of the oscillation has an irregular and inhomogeneous closed-ring valley or doughnut-like shape. We also found that the flux along the cycle path is the driving force for coherent oscillation. The potential barrier height for escaping from the limit cycle attractor determines the robustness and stability of the network oscillations. We found that as the diffusion coefficient becomes smaller, the potential barrier becomes higher and the statistical fluctuations are more effectively suppressed. This leads to robustness of the biological limit cycle basin of the protein network.

We observe that the global dissipation, in terms of the entropy production of the whole system, increases when the diffusion coefficient *D* increases. The period and amplitude distributions become more widely dispersed when *D* increases, and the phase coherence decreases. These are three ways of characterizing the robustness of the oscillation, in addition to the barrier height measure from the basin of attraction. Low entropy production might serve as a design principle for robust networks.

The robustness, coherence, and dissipation of the circadian oscillations with respect to changes in the rate parameters can be studied as well. We found that the protein synthesis rate *k*_{s} is crucial in maintaining the robustness of the network.

## Declarations

### Acknowledgements

JW thanks the National Science Foundation Career Award and the American Chemical Society Petroleum Research Fund for support. LX and EKW are supported by the National Natural Science Foundation of China (Grant Nos. 90713022 and 20735003).

## References

- Gonze D, Halloy J, Goldbeter A: Proc Natl Acad Sci 2002, 99:673-678. 10.1073/pnas.022628299.
- Gonze D, Halloy J: J Chem Phys 2002, 116:10997-11010. 10.1063/1.1475765.
- Ao P: J Phys A Math Gen 2004, 37:L25-L30. 10.1088/0305-4470/37/3/L01.
- Zhu XM, Yin L, Hood L, Ao P: Funct Integr Genomics 2004, 4:188-195. 10.1007/s10142-003-0095-5.
- Qian H, Beard DA: Biophys Chem 2005, 114:213-220. 10.1016/j.bpc.2004.12.001.
- Qian H, Reluga TC: Phys Rev Lett 2005, 94:028101. 10.1103/PhysRevLett.94.028101.
- Wang J, Huang B, Xia XF, Sun ZR: Biophys J 2006, 91:L54-L56. 10.1529/biophysj.106.086777.
- Han B, Wang J: Biophys J 2007, 92:3755. 10.1529/biophysj.106.094821.
- Wang J, Huang B, Xia XF, Sun ZR: PLoS Comput Biol 2006, 2:e147. 10.1371/journal.pcbi.0020147.
- Kim KY, Wang J: PLoS Comput Biol 2007, 3(3):e60. 10.1371/journal.pcbi.0030060.
- Lapidus S, Han B, Wang J: Proc Natl Acad Sci 2008, 105:6039. 10.1073/pnas.0708708105.
- Wang J, Xu L, Wang EK: Proc Natl Acad Sci 2008, 105:12271. 10.1073/pnas.0800579105.
- McAdams HH, Arkin A: Proc Natl Acad Sci USA 1997, 94:814-819. 10.1073/pnas.94.3.814.
- Elowitz MB, Leibler S: Nature 2000, 403:335-338. 10.1038/35002125.
- Swain PS, Elowitz MB, Siggia ED: Proc Natl Acad Sci USA 2002, 99:12795-12800. 10.1073/pnas.162041399.
- Thattai M, van Oudenaarden A: Proc Natl Acad Sci USA 2001, 98:8614-8619. 10.1073/pnas.151588598.
- Vilar JMG, Guet CC, Leibler S: J Cell Biol 2003, 161:471-476. 10.1083/jcb.200301125.
- Paulsson J: Nature 2004, 427:415-418. 10.1038/nature02257.
- Hasty J, Pradines J, Dolnik M, Collins JJ: Proc Natl Acad Sci USA 2000, 97:2075-2080. 10.1073/pnas.040411297.
- Hasty J, Isaacs F, Dolnik M, McMillen D, Collins JJ: Chaos 2001, 11:207-220. 10.1063/1.1345702.
- Gardiner CW: Handbook of Stochastic Methods for Physics, Chemistry and the Natural Sciences. Berlin: Springer-Verlag; 1985.
- van Kampen NG: Stochastic Processes in Physics and Chemistry. Amsterdam: North-Holland; 1992.
- Gillespie DT: J Phys Chem 1977, 81:2340-2361. 10.1021/j100540a008.
- Arkin A, Ross J, McAdams HH: Genetics 1998, 149:1633-1649.
- Kepler TB, Elston TC: Biophys J 2001, 81:3116-3136. 10.1016/S0006-3495(01)75949-8.
- Qian H, Saffarian S, Elson EL: Proc Natl Acad Sci 2002, 99:10376-10381. 10.1073/pnas.152007599.
- Bialek W: Adv Neural Inf Process 2003, 13:103-109.
- Sasai M, Wolynes PG: Proc Natl Acad Sci USA 2003, 100:2374-2379. 10.1073/pnas.2627987100.
- Walczak AM, Sasai M, Wolynes PG: Biophys J 2005, 88:828-850. 10.1529/biophysj.104.050666.
- Austin RH, Beeson K, Eisenstein L, Frauenfelder H, Gunsalus I: Biochemistry 1975, 14:5355-5373. 10.1021/bi00695a021.
- Fisher RA: The Genetical Theory of Natural Selection. Oxford: Clarendon; 1930.
- Wright S: The roles of mutation, inbreeding, crossbreeding and selection in evolution. Proceedings of the Sixth International Congress on Genetics 1932, 356-366.
- Delbruck M: Discussion. In Unités Biologiques Douées de Continuité Génétique. Colloques Internationaux du Centre National de la Recherche Scientifique. Paris: CNRS; 1949.
- Waddington CH: The Strategy of the Genes. London: Allen and Unwin; 1957.
- Frauenfelder H, Sligar SG, Wolynes PG: Science 1991, 254:1598-1603. 10.1126/science.1749933.
- Wolynes PG, Onuchic JN, Thirumalai D: Science 1995, 267:1619-1622. 10.1126/science.7886447.
- Wang J, Verkhivker G: Phys Rev Lett 2003, 90:188101. 10.1103/PhysRevLett.90.188101.
- Qian H: Phys Rev E 2001, 65:016102.
- Yoda M, Ushikubo T, Inoue W, Sasai M: J Chem Phys 2007, 126:115101. 10.1063/1.2539037.
- Davidson EH: Science 2002, 295:1669-1673. 10.1126/science.1069883.
- Huang CY, Ferrell JE: Proc Natl Acad Sci 1996, 93:10078-10082. 10.1073/pnas.93.19.10078.
- Kholodenko BN: Eur J Biochem 2000, 267:1583-1593. 10.1046/j.1432-1327.2000.01197.x.
- Ideker T: Science 2001, 292:929-933. 10.1126/science.292.5518.929.
- Jeong H: Nature 2000, 407:651-654. 10.1038/35036627.

## Copyright

This article is published under license to BioMed Central Ltd. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.