Team:Edinburgh/Modelling

Modelling

Since we come from different backgrounds (our team consists of biologists, engineers, a designer, a social scientist and a computer scientist), our views on modelling differ. Each of us would prefer approaches and techniques close to their home disciplines.

We thought that we could turn our interdisciplinarity into a strength in modelling. Therefore, while collaborating, we tried to approach modelling from different perspectives:

  1. Deterministic modelling (MATLAB)
  2. Stochastic rule-based modelling (Kappa and Spatial Kappa)
  3. C

Additionally, we have developed other models / tools:

  1. Modelling phage dynamics
  2. Artificial selection
  3. Genetic stability

Comparison of different modelling tools

Within the reactor tasked with degrading cellulose into glucose in the biorefinery, temperature, enzyme concentration and substrate reactivity, as well as inhibition by xylose, cellobiose and glucose, all govern the amount of glucose produced. Deterministic modelling using a set of ordinary differential equations highlights the essential kinetic relationships among the enzymes: exo/endo-glucanase and β-glucosidase. By solving these governing equations numerically in MATLAB, the level of degradation can be predicted qualitatively.

However, we found that the equations we used for the deterministic modelling only gave sensible answers when the model parameters remained within certain limits. Outside those limits, results could be physically impossible; e.g. producing negative amounts of cellobiose.
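
To make the deterministic approach concrete, here is a minimal MATLAB sketch of this kind of model: cellulose is hydrolysed to cellobiose by exo/endo-glucanase, and cellobiose to glucose by β-glucosidase, with simple inhibition terms. The equations, rate constants and inhibition constants are illustrative placeholders rather than the values used in our actual model; the final check illustrates the negative-concentration problem described above.

  % Simplified cellulose -> cellobiose -> glucose model (illustrative only).
  % y = [cellulose; cellobiose; glucose] in g/L; all constants are made up.
  k1 = 0.05;  k2 = 0.1;  Ki_b = 5;  Ki_g = 10;

  r1 = @(y) k1 * y(1) / (1 + y(2)/Ki_b + y(3)/Ki_g);   % hydrolysis of cellulose
  r2 = @(y) k2 * y(2) / (1 + y(3)/Ki_g);               % hydrolysis of cellobiose
  dydt = @(t, y) [-r1(y); r1(y) - r2(y); r2(y)];

  y0 = [50; 0; 0];                      % 50 g/L cellulose, no sugars initially
  [t, y] = ode45(dydt, [0 100], y0);    % integrate over 100 hours

  % Flag physically impossible output such as negative cellobiose.
  if any(y(:) < 0)
      warning('Negative concentrations produced; parameters outside sensible limits.');
  end
  plot(t, y);  legend('cellulose', 'cellobiose', 'glucose');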

As an alternative, stochastic models were created in the rule-based language Kappa. These incorporate randomness in the evolution of the state of the system: rules are defined which describe how the model moves from one state to the next, and each rule fires stochastically at a rate determined by the current state.
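
Kappa has its own rule syntax and simulator (so the real models are written there), but the basic idea behind rule-based stochastic simulation can be illustrated with a minimal Gillespie-style sketch in MATLAB. The two "rules" and their rate constants below are placeholders, not our actual Kappa rules.

  % Minimal Gillespie-style stochastic simulation (illustrative only).
  % Two "rules": cellulose -> cellobiose (rate c1), cellobiose -> glucose (rate c2).
  c1 = 0.01;  c2 = 0.02;             % made-up stochastic rate constants
  x  = [1000; 0; 0];                 % molecule counts: [cellulose; cellobiose; glucose]
  t  = 0;  t_end = 500;
  history = [t, x'];

  while t < t_end
      a  = [c1 * x(1);  c2 * x(2)];  % propensity of each rule in the current state
      a0 = sum(a);
      if a0 == 0, break; end         % nothing left that can react
      t = t + log(1/rand) / a0;      % exponentially distributed waiting time
      if rand < a(1)/a0              % pick which rule fires
          x = x + [-1; 1; 0];        % cellulose -> cellobiose
      else
          x = x + [0; -1; 1];        % cellobiose -> glucose
      end
      history = [history; t, x'];    %#ok<AGROW>
  end
  stairs(history(:,1), history(:,2:4));
  legend('cellulose', 'cellobiose', 'glucose');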

What is modelling?

Modelling is, in our understanding, vital to the future of Synthetic Biology. Its function is to allow you to design biological circuits, systems etc. without having to go into the lab. This lets the designer abstract away from the biological details and reason at a higher level.

Right now, the process works like this:

  1. you get an idea
  2. you go to a lab and see whether it works

How it might work in the future:

  1. you get an idea
  2. you model the behaviour you want to achieve and see if the system might work
  3. you go to a lab and see how it works in reality
  4. you improve your idea and go back to step 2...

Compare a biologist with an aircraft designer: the latter has far more tools to simulate the behaviour of their product before having to build a costly prototype, while the former has to perform many experiments, which might not cost much but are time-consuming and tedious.

Synergy vs. non-synergy

Does the whole theory behind our projects - synergy - even work? What if enzymes were just secreted into the media instead?

We can create ordinary differential equations to compare and contrast synergy, with the enzymes attached to the cell or to the phage, against the non-synergy case where the enzymes are simply secreted. This would involve creating a control model. We could assume data/results for now, but the whole point of the mathematical modelling is to see whether using synergy is better than the status quo.

As engineers we have done MATLAB to death, so it is probably the best program to use, especially for the mathematical modelling, i.e. the ODEs. The next step is to write the equations and create the corresponding function and script files, so that when we have data it is just a matter of plugging it in; a possible skeleton is sketched below. An important point to note is the possibility of the system being stochastic, i.e. the behaviour being non-deterministic...
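
One possible skeleton for those function and script files, assuming a single lumped "synergy factor" that scales the effective hydrolysis rate when the enzymes are co-localised, is sketched here. The factor, rate constants and initial conditions are all placeholders to be replaced once we have data.

  % synergy_ode.m -- ODE function shared by both cases.
  % y = [cellulose; glucose]; s is a synergy factor (1 = free enzymes, >1 = co-localised).
  function dydt = synergy_ode(t, y, k, Ki, s)
      rate = s * k * y(1) / (1 + y(2)/Ki);   % hydrolysis with glucose inhibition
      dydt = [-rate; rate];
  end

  % compare_synergy.m -- script: run the control and the synergy case side by side.
  k = 0.05;  Ki = 10;  y0 = [50; 0];  tspan = [0 100];                % placeholder values
  [tc, yc] = ode45(@(t, y) synergy_ode(t, y, k, Ki, 1), tspan, y0);   % secreted control
  [ts, ys] = ode45(@(t, y) synergy_ode(t, y, k, Ki, 3), tspan, y0);   % assumed 3x boost
  plot(tc, yc(:,2), ts, ys(:,2));  legend('secreted (control)', 'synergy');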

Energy efficiency

Is display of enzymes any good, even in theory?

Consider this: for a bacterium to produce phage or INP requires energy. This energy could instead have been spent producing extra copies of the cellulases. For the phage and cell display projects to make sense, the benefits of synergy must outweigh the cost of producing all these extra proteins.

This question can probably be investigated using simple maths and back-of-envelope calculations...
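
For example, a first back-of-envelope version in MATLAB might compare the protein-synthesis cost of one displayed fusion against free cellulases, using the textbook figure of roughly four ATP equivalents per peptide bond. The protein lengths below are rough assumptions, not measurements.

  % Back-of-envelope energy comparison (all numbers are placeholders).
  atp_per_aa    = 4;          % ~4 ATP equivalents per peptide bond (textbook estimate)
  len_carrier   = 1200;       % assumed length (aa) of the carrier protein, e.g. INP
  len_cellulase = 450;        % assumed length (aa) of one cellulase

  cost_display  = atp_per_aa * (len_carrier + len_cellulase);   % one displayed fusion
  cost_secreted = atp_per_aa * len_cellulase;                   % one free secreted enzyme

  % How many extra free cellulases could be made for the price of one fusion?
  extra_copies = cost_display / cost_secreted;
  fprintf('One displayed fusion costs as much as %.1f free cellulases.\n', extra_copies);
  % In this simple picture, synergy must boost per-enzyme activity by at least
  % this factor for display to pay off.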

Spatial modelling

Since synergy is all about putting the right enzymes together in close proximity, we should model how the enzymes will move around in the medium. We can model expression of enzymes on INP and on M13 phages.
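
Spatial Kappa is the intended tool for this, but the question can be explored crudely with a random-walk sketch in MATLAB that compares freely diffusing enzymes with enzymes confined near a fixed anchor point (a stand-in for a cell or phage surface). Step sizes, particle numbers and the tether radius are arbitrary.

  % Crude 2D random walk: free vs. tethered enzymes (illustrative only).
  n_enzymes = 100;  n_steps = 1000;  step = 1.0;  tether_radius = 5.0;

  free_pos = zeros(n_enzymes, 2);   % every enzyme starts at the origin
  teth_pos = zeros(n_enzymes, 2);

  for s = 1:n_steps
      free_pos = free_pos + step * randn(n_enzymes, 2);   % unconstrained diffusion
      trial    = teth_pos + step * randn(n_enzymes, 2);   % proposed move for tethered enzymes
      inside   = sqrt(sum(trial.^2, 2)) <= tether_radius; % reject moves that leave the tether
      teth_pos(inside, :) = trial(inside, :);
  end

  % Tethered enzymes stay clustered while free ones spread out -- a rough proxy
  % for how often different enzymes remain close enough to act synergistically.
  fprintf('Mean distance from anchor: free %.1f, tethered %.1f\n', ...
          mean(sqrt(sum(free_pos.^2, 2))), mean(sqrt(sum(teth_pos.^2, 2))));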

Evolutionary analysis of cell-display vs. secretion?

This idea is broadly suggested by van Zyl et al. (2007).

One potential benefit of attaching enzymes to the cell surface, rather than secreting them into the media, is that any mutation that increases enzyme efficiency will specifically benefit the cell carrying it, since the extra sugar is released right at that cell's surface. The mutation will thus confer a fitness advantage, potentially allowing it to take over the culture.

By contrast, if a cell produces a secreted protein of higher efficiency, the protein will disperse and benefit random cells in the culture, diluting the mutant's advantage.
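
A toy discrete-generation simulation can make this argument concrete: under cell display the mutant's benefit is private and its frequency rises by standard selection, whereas under secretion the benefit is shared, so (ignoring drift) its frequency stays put. The fitness gain below is an invented number, not a measured one.

  % Toy model of mutant spread under cell display vs. secretion (illustrative only).
  generations = 200;  p0 = 0.01;           % initial mutant frequency
  benefit = 0.05;                          % assumed fitness gain from the better enzyme

  p_display = zeros(1, generations);  p_display(1) = p0;
  p_secrete = zeros(1, generations);  p_secrete(1) = p0;

  for g = 2:generations
      % Cell display: only mutants get the extra sugar, so they have higher fitness.
      w_mut = 1 + benefit;
      p = p_display(g-1);
      p_display(g) = p * w_mut / (p * w_mut + (1 - p));

      % Secretion: the extra sugar is shared, so relative fitness is unchanged
      % and the mutant frequency stays the same (ignoring drift).
      p_secrete(g) = p_secrete(g-1);
  end
  plot(1:generations, p_display, 1:generations, p_secrete);
  legend('cell display (private benefit)', 'secretion (shared benefit)');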

Synonymous codon usage to avoid recombination

Our projects involve expressing multiple fusion proteins, each of which uses a genetically identical carrier (e.g. ice-nucleation protein or the M13 pVIII coat protein). The presence of repeated sequences in the DNA (i.e. the same sequence in multiple locations) can lead to genetic instability through homologous recombination.

To combat this, it ought to be possible to design and synthesise different versions of the genes that code for the same amino acids but use different codons and so are as distinct as possible. This could be investigated by computer.
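
A minimal sketch of that computation: take one coding sequence and swap each codon for a synonymous alternative wherever one exists, so a second copy encodes the same protein while sharing as little DNA as possible with the first. The codon table here is deliberately truncated to four amino acids, and a real tool would also respect codon usage preferences.

  % Recode a gene with synonymous codons to reduce repeats (illustrative only).
  % syn maps each codon to one synonymous alternative; table truncated for brevity.
  syn = containers.Map( ...
      {'GCT','GCC','AAA','AAG','GAA','GAG','CTG','CTC'}, ...
      {'GCC','GCT','AAG','AAA','GAG','GAA','CTC','CTG'});

  gene = 'GCTAAAGAACTG';                  % toy coding sequence (Ala-Lys-Glu-Leu)
  recoded = gene;
  for i = 1:3:length(gene)
      codon = gene(i:i+2);
      if isKey(syn, codon)
          recoded(i:i+2) = syn(codon);    % swap for a synonymous codon
      end
  end
  identity = mean(gene == recoded) * 100;
  fprintf('%s\n%s\nnucleotide identity: %.0f%%\n', gene, recoded, identity);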

Hooray, a solution is here!

References