Same Again Please

 

It’s often said that every family has its secret — Uncle Fred’s fondness for the horses, Cousin Bertha’s promiscuity, etc. — whatever it is that ‘we don’t talk about.’ If that’s true, the scientific community is no exception. For us the unutterable is reproducibility — meaning you’ve done an experiment, new in some way, but the key questions are: ‘Can you do it again with the same result?’ and, even more important: ‘Can someone else repeat it?’

Once upon a time in my lab we had a standing joke: whoever came bounding along shouting about a new result would be asked ‘How reproducible is it?’ Reply: ‘100%!’ Next question: ‘How often have you done the experiment?’ Reply: ‘Once!!’ Boom, boom!!!

Not a rib-tickler but it did point to the knottiest problem in biological science namely that, when you start tinkering with living systems, you’re never fully in control.

How big is the problem?

But, as dear old Bob once put it, The Times They Are a-Changin’. Our problem was highlighted in the cancer field by the Californian biotechnology company Amgen, which announced in 2012 that, over a 10-year period, it had selected 53 ‘landmark’ cancer papers — and failed to replicate 47 of them! Around the same time a study by Bayer HealthCare found that only about one in four published studies on potential drug targets was sufficiently strong to be worth following up.

More recently the leading science journal Nature found that almost three quarters of over 1,500 research scientists surveyed had tried to replicate someone else’s experiment and failed. It gets worse! More than half of them owned up to having failed to repeat one of their own experiments! Hooray! We have a result!! If you can’t repeat your own experiment either you’re sloppy (i.e., you haven’t done exactly what you did the first time) or you’ve highlighted the biological variability in the system you’re studying.

If you want an example of biological variation you need look no further than human conception and live births. Somewhere in excess of 50% of fertilized human eggs don’t make it to birth. In other words, if you do a ‘thought experiment’ in which a group of women carry some sort of gadget that flags when one of their eggs is fertilized, only between one in two and one in five of those ‘flagged’ will actually produce an offspring.

However you look at it, whether it’s biological variation, incompetence or plain fraud, we have a problem and Nature’s survey revealed that, to their credit, the majority of scientists agreed that there was a ‘significant crisis.’

The results of the Nature survey (from Baker, 2016).

Predictably, but disturbingly for us in the biomedical fields, the greatest confidence in published results was shown by the chemists and physicists whereas only 50% of data in medicine were thought to be reproducible. Oh dear!

Tackling the problem in cancer

The Reproducibility Project: Cancer Biology, launched in 2013, is a collaboration between the Center for Open Science and Science Exchange.

The idea was to take 50 cancer papers published in leading journals and to attempt to replicate their key findings in the most rigorous manner. The number was reduced from 50 to 29 papers due to financial constraints and other factors but the aim remains to find out what affects the robustness of experimental results in preclinical cancer research.

It is a formidable project. Before even starting an experiment, the replication teams devised detailed plans, based on the original reports and, as the result of many hours’ effort, came up with a strategy that both they and the original experimenters considered was the best they could carry out. The protocols were then peer reviewed and the replication plans were published before the studies began.

Just to give an idea of the effort involved, a typical replication plan comprises many pages of detailed protocols describing reagents, cells and (where appropriate) animals to be used, statistical analysis and any other relevant items, as well as incorporating the input from referees.

The whole endeavor is, in short, a demonstration of scientific practice at its best.

To date ten of these replication studies have been published.

How are we doing?

The critical numbers are that 6 of the 10 replications ‘substantially reproduced’ the original findings, although in 4 of these some results could not be replicated. In 4 of the 10 replications the original findings were not reproduced.

The first thing to say is that a 60% rate of ‘substantial’ successful replication is a major improvement on the 11% to 25% obtained by the biotech companies. The most obvious explanation is that the massive, collaborative effort to tighten up the experimental procedures paid dividends.

The second point to note is that even when a replication attempt fails it cannot be concluded that the original data were wrong. The discrepancy may merely have highlighted how fiendishly tricky biological experimentation can be. The problem is that with living systems, be they cells or animals, you never have complete control. Ask anyone who has a cat.

More likely, however, than biological variation as a cause of discrepancies between experiments is human variation, aka personal bias.

This may come as a surprise to some but, rather than being ‘black and white’, much of scientific interpretation is subjective. Try as I might, can I be sure that in, say, counting stained cells I don’t include some marginal ones because that fits my model? OK: the solution to that is to get someone else to do the count ‘blind’ — but I suspect that quite often that’s not done. However, there are even trickier matters. I do half a dozen repeats of an experiment and one gives an odd result (i.e., differs from the other five). Only I can really go through everything involved (from length of coffee breaks to changes in reagent stocks) and decide if there are strong enough grounds to ignore it. I do my best to avoid personal bias but … scientists are only human (fact!).

A closer look at failure

One of the failed replications is a particularly useful illustration for this blog. The replication study tackled a 2012 report that bacterial infection (specifically a bacterium, Fusobacterium nucleatum, that occurs naturally in the human oral cavity) is present in human colon cancers but not in non-cancerous colon tissues. It hit the rocks. They couldn’t detect F. nucleatum in most tumour samples and, when they did, the number of bugs was not significantly different to that in adjacent normal tissue.

Quite by chance, a few months ago, I described some more recent research into this topic in Hitchhiker or Driver?

I thought this was interesting because it showed that not only was F. nucleatum part of the microbiome of bowel cancer but that when tumour cells spread to distant sites (i.e., underwent metastasis) the bugs went along for the ride — raising the key question of whether they actually helped the critical event of metastasis.

So this latest study was consistent with the earlier result and extended it — indeed they actually showed that antibiotic treatment to kill the bugs slowed the growth of human tumour cells in mice.

Where does that leave us?

Well, helpfully, the Reproducibility Project also solicits comments from independent experts to help us make sense of what’s going on. Step forward Cynthia Sears of The Johns Hopkins Hospital. She takes the view that, although the Replication Study didn’t reproduce the original results, the fact that numerous studies have already found an association between F. nucleatum and human colon cancer means there probably is one — consistent with the work described in Hitchhiker or Driver?

One possible explanation for the discrepancy is that the original report studied colon tissue pairs (i.e., tumour and tumour-adjacent tissues) from colon cancer patients but did not report possibly relevant factors like age, sex and ethnicity of patients. In contrast, the replication effort included samples from patients with cancer (tumour and adjacent tissue) and non-diseased control tissue samples from age-, sex- and ethnicity-matched individuals.

So we now know, as Dr. Sears helpfully remarks, that the association between F. nucleatum bugs and human colon cancer is more complicated than it first appeared! Mmm. And, just in case you were in any doubt, she points out that we need to know more about the who (which Fusobacterium species: there are 12 of them known), the where (where in the colon, where in the world) and the how (the disease mechanisms).

Can we do better?

In the light of all that the obvious question is: what can we do about the number of pre-clinical studies that are difficult if not impossible to reproduce? Answer, I think: not much. Rather than defeatist this seems to me a realistic response. There’s no way we could put in place the rigorous scrutiny of the Reproducibility Project across even a fraction of cancer research projects. The best we can do is make researchers as aware as possible of the problems and encourage them towards the very best practices — and assume that, in the end, the solid results will emerge and the rest will fall by the wayside.

Looking at the sharp end, it’s worth noting that, if you accept that some of the variability in pre-clinical experiments is down to the biological variation we mentioned above, it would at least be consistent with the wide range of patient responses to some cancer treatments. The reason for that, as Cynthia Sears didn’t quite put it, is that we just don’t know enough about how the humans we’re tinkering with actually work.

References

Baker, M. (2016). Is There a Reproducibility Crisis? Nature 533, 452-454.

Jarvis, G.E. (2017). Early embryo mortality in natural human reproduction: What the data say [version 2; referees: 1 approved, 2 approved with reservations] F1000Research 2017, 5:2765 (doi: 10.12688/f1000research.8937.2).

Baker, M. and Dolgin, E. (2017). Cancer reproducibility project releases first results. Nature 541, 269–270. doi:10.1038/541269a.

Begley, C.G. and Ellis, L.M. (2012). Drug development: Raise standards for preclinical cancer research. Nature 483, 531–533.

Prinz, F., Schlange, T. and Asadullah, K. (2011). Believe it or not: how much can we rely on published data on potential drug targets? Nature Rev. Drug Discov. 10, 712.


Hitchhiker Or Driver?

 

It’s a little while since we talked about what you might call our hidden self — the vast army of bugs that colonises our nooks and crannies, especially our intestines, and that is essential to our survival.

In Our Inner Self we noted that these little guys outnumber the human cells that make up the body by about ten to one. Actually that estimate has recently been revised — downwards you might be relieved to hear — to about 1.3 bacterial cells per human cell but it doesn’t really matter. They are a major part of what’s called the microbiome — the vast community of microorganisms that call our bodies home but on which we also depend for our very survival.

In our personal army there’s something like 700 different species of bacteria, with thirty or forty making up the majority. We upset them at our peril. Artificial sweeteners, widely used as food additives, can change the proportions of types of gut bacteria. Some antibiotics that kill off bacteria can make mice obese — and they probably do the same to us. Obese humans do indeed have reduced numbers of bugs and obesity itself is associated with increased cancer risk.

In it’s a small world we met two major bacterial sub-families, Bacteroidetes and Firmicutes, and noted that their levels appear to affect the development of liver and bowel cancers. Well, the Bs & Fs are still around you’ll be glad to know but in a recent piece of work the limelight has been taken by another bunch of Fs — a sub-group (i.e. related to the Bs & Fs) called Fusobacterium.

It’s been known for a few years that human colon cancers carry enriched levels of these bugs compared to non-cancerous colon tissues — suggesting, though not proving, that Fusobacteria may be pro-tumorigenic. In the latest, pretty amazing, instalment Susan Bullman and colleagues from Harvard, Yale and Barcelona have shown that not merely is Fusobacterium part of the microbiome that colonises human colon cancers but that when these growths spread to distant sites (i.e. metastasise) the little Fs tag along for the ride!

Bacteria in a primary human bowel tumour. The arrows show tumour cells infected with Fusobacteria (red dots).

Bacteria in a liver metastasis of the same bowel tumour. Though more difficult to see, the red dot (arrow) marks the presence of bacteria from the original tumour. From Bullman et al., 2017.

In other words, when metastasis kicks in it’s not just the tumour cells that escape from the primary site but a whole community of host cells and bugs that sets sail on the high seas of the circulatory system.

But doesn’t that suggest that these bugs might be doing something to help the growth and spread of these tumours? And if so might that suggest that … of course it does and Bullman & Co did the experiment. They tried an antibiotic that kills Fusobacteria (metronidazole) to see if it had any effect on F–carrying tumours. Sure enough it reduced the number of bugs and slowed the growth of human tumour cells in mice.

Growth of human tumour cells in mice. The antibiotic metronidazole slows the growth of these tumours by about 30%. From Bullman et al., 2017.

We’re still a long way from a human therapy but it is quite a startling thought that antibiotics might one day find a place in the cancer drug cabinet.

Reference

Bullman, S. et al. (2017). Analysis of Fusobacterium persistence and antibiotic response in colorectal cancer. Science 358, 1443-1448. DOI: 10.1126/science.aal5240.

Through the Smokescreen

For many years I was lucky enough to teach in a cancer biology course for third year natural science and medical students. Quite a few of those guys would already be eyeing up research careers and, within just a few months, some might be working on the very topics that came up in lectures. Nothing went down better, therefore, than talking about a nifty new method that had given easy-to-grasp results clearly of direct relevance to cancer.

Three cheers then for Mikhail Denissenko and friends who in 1996 published the first absolutely unequivocal evidence that a chemical in cigarette smoke could directly damage a bit of DNA that provides a major protection against cancer. The compound bound directly to several guanines in the DNA sequence that encodes P53 – the protein often called ‘the guardian of the genome’ – causing mutations. A pity poor old Fritz Lickint wasn’t around for a celebratory drink – it was he, back in the 1930s, who first spotted the link between smoking and lung cancer.

The story was all the better for illustrating how proteins switch on genes – and how that switch can be perturbed by mutations – because, just a couple of years earlier, Yunje Cho’s group at the Memorial Sloan-Kettering Cancer Center in New York had made crystals of P53 stuck to DNA and used X-rays to reveal the structure. This showed that six sites (amino acids) in the centre of the P53 protein poked like fingers into the groove of double-stranded DNA.

Central core of P53 (grey ribbon) binding to the groove in double-stranded DNA (blue). The six amino acids (residues) most commonly mutated in P53 are shown in yellow (from Cho et al., 1994).

So that was how P53 ‘talked’ to DNA to control the expression of specific genes. What could be better then, in a talk on how DNA damage can lead to cancer, than the story of a specific chemical doing nasty things to a gene that encodes perhaps the most revered of anti-cancer proteins?

The only thing baffling the students must have been the tobacco companies insisting, as they continued to do for years, that smoking was good for you.

And twenty-something years on …?

Well, it’s taken a couple of revolutions (scientific, of course!) but in that time we’ve advanced to being able to sequence genomes at a fantastic speed for next to nothing in terms of cost. In that period too more and more data have accumulated showing the pervasive influence of the weed. In particular that not only does it cause cancer in tissues directly exposed to cigarette smoke (lung, oesophagus, larynx, mouth and throat) but it also promotes cancers in places that never see inhaled smoke: kidney, bladder, liver, pancreas, stomach, cervix, colon, rectum and white blood cells (acute myeloid leukemia). However, up until now we’ve had very little idea of what, if anything, these effects have in common in terms of molecular damage.

Applying the power of modern sequencing, Ludmil Alexandrov of the Los Alamos National Lab, along with the Wellcome Trust Sanger Institute’s Michael Stratton and their colleagues have pieced together whole-genome sequences and exome sequences (those are just the DNA that encode proteins – about 1% of the total) of over 5,000 tumours. These covered 17 smoking-associated forms of cancer and permitted comparison of tobacco smokers with never-smokers.

Let’s hear it for consistent science!

The most obvious question, then, is: do the latest results confirm the efforts of Denissenko & Co., now some 20 years old? The latest work found that smoking could increase the mutation load in the form of multiple, distinct ‘mutational signatures’, each contributing to different extents in different cancers. And indeed in lung and larynx tumours they found the guanine-to-thymine base-pair change that Denissenko et al. had observed as the result of a specific chemical attaching to DNA.

For lung cancer they concluded that, all told, about 150 mutations accumulate in a given lung cell as a result of smoking a pack of cigarettes a day for a year.
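As a back-of-envelope check on that arithmetic, here is a minimal sketch in Python. It assumes, purely for illustration, that the mutation load scales linearly with ‘pack-years’ (one pack a day for one year = 150 extra mutations per lung cell, the figure quoted above); the function name and the linear-scaling assumption are mine, not taken from the paper.

```python
# Rough mutation-load estimate, assuming linear scaling with pack-years.
# 150 mutations per lung cell per pack-year is the figure quoted in the
# text for lung cancer (Alexandrov et al., 2016).

MUTATIONS_PER_PACK_YEAR = 150

def extra_mutations(packs_per_day: float, years: float) -> float:
    """Estimated smoking-induced mutations per lung cell.

    A simple linear pack-year model (an assumption for illustration):
    mutations = 150 * packs_per_day * years.
    """
    return MUTATIONS_PER_PACK_YEAR * packs_per_day * years

# A pack a day for 20 years:
print(extra_mutations(1, 20))  # -> 3000
```

On this crude model, a 20-pack-year smoker carries roughly 3,000 extra mutations in each lung cell; whether the dose-response really is linear is exactly the sort of question the sequencing studies address.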

Turning to tissues that are not directly exposed to smoke, things are a bit less clear. In liver and kidney cancers smokers have a bigger load of mutations than non-smokers (as in the lung). However, and somewhat surprisingly, in other smoking-associated cancer types there were no clear differences. And even odder, there was no difference in the methylation of DNA between smokers and non-smokers – that’s the chemical tags that can be added to DNA to tune the process of transforming the genetic code into proteins. Which was strange because we know that such ‘epigenetic’ changes can occur in response to external factors, e.g., diet.

What’s going on?

Not clear beyond the clear fact that tissues directly exposed to smoke accumulate cancer-driving mutations – and the longer the exposure the bigger the burden. For tissues that don’t see smoke its effect must be indirect. A possible way for this to happen would be for smoke to cause mild inflammation that in turn causes chemical signals to be released into the circulation that in turn affect how efficiently cells repair damage to their DNA.

Sir Walt showing off on his return to England.

Whose fault is it anyway?

So tobacco-promoted cancers still retain some of their molecular mystery as well as presenting an appalling and globally growing problem. These days a popular pastime is to find someone else to blame for anything and everything – and in the case of smoking we all know who the front-runner is. But although Sir Walter Raleigh brought tobacco to Europe (in 1578), it had clearly been in use by American natives long before he turned up and, going in the opposite direction (à la Marco Polo), the Chinese had been at it since at least the early 1500s. To its credit, China had an anti-smoking movement by 1639, during the Ming Dynasty. One of their Emperors decreed that tobacco addicts be executed and the Qing Emperor Kangxi went a step further by beheading anyone who even possessed tobacco.

And paying the price

If you’re thinking maybe we should get a touch more Draconian in our anti-smoking measures, it’s worth pointing out that the Chinese model hasn’t worked out too well so far. China’s currently heading for three million cancer deaths annually. About 400,000 of these are from lung cancer and the smoking trends mean this figure will be 700,000 annual deaths by 2020. The global cancer map is a great way to keep up with the stats of both lung cancer and the rest – though it’s not for those of a nervous disposition!

References

Denissenko, M.F. et al. (1996). Preferential Formation of Benzo[a]pyrene Adducts at Lung Cancer Mutational Hotspots in P53. Science 274, 430–432.

Cho, Y. et al. (1994). Crystal Structure of a p53 Tumor Suppressor-DNA Complex: Understanding Tumorigenic Mutations. Science 265, 346-355.

Alexandrov, L.D. et al. (2016). Mutational signatures associated with tobacco smoking in human cancer. Science 354, 618-622.

Fancy that?

Seeing as they started 28 years ago, we can hardly blame members of the Harvard School of Public Health for publishing the results of their labours in tracking 120,000 people, asking them every few years what they’ve eaten and seeing what happened to them (a ‘prospective’ study). About one in five of the subjects died while this was going on but the message to emerge was that eating red meat contributes to cardiovascular disease, cancer and diabetes. The diabetes is non-insulin-dependent diabetes mellitus (NIDDM) or adult-onset diabetes – about 90% of diabetes cases. The cancers weren’t specified, although the evidence for a dietary link is generally strongest for colon carcinoma. The risk is a little higher for processed red meat than unprocessed.

How much?

Massive, if you mean the amount of data they accumulated from such a huge sample size followed over many years. If you mean on a plate, their standard serving size was 85 grams (3 ounces) for unprocessed beef, pork or lamb, and 2 slices of bacon or a hot dog for processed red meat. One of those a day and your risk of dying from heart disease is increased by about 20 per cent and from cancer by about 10 per cent – and the risks are similar for men and women. Just to be clear, that is a daily consumption – and the authors very honestly acknowledge that ‘measurement errors inherent in dietary assessments were inevitable’. They also mentioned that one or two things other than steak can contribute to our demise.

Are we any wiser?

If you recall from Rasher Than I Thought?, the risk of pancreatic cancer is increased by just under 20 per cent if you eat 50 grams of processed meat every day. This report suggests that a limit of 1.5 ounces (42 grams) a day of red meat (one large steak a week) could prevent around one in 10 early deaths. So does it tell us anything new? Not really. Was it worth doing? Yes, because it adds more solid data to that summarized in Are You Ready To Order?

And the message?

Unchanged. Do some exercise and eat a balanced diet – just in case you’ve forgotten, that means limit the amount of red meat (try fish, poultry, etc.), stick with the ‘good carbs’ (vegetables, fruits, whole grains, etc.), cut out the ‘bad’ (sugar – see Biting the Bitter Bullet), eat fishy fats not sat. fats and, to end on a technical note, don’t pig out.

References

Pan, A., Sun, Q., Bernstein, A.M. et al. (2012). Red meat consumption and mortality: results from 2 prospective cohort studies. Arch Intern Med. Published online March 12, 2012. doi:10.1001/archinternmed.2011.2287.

Pan, A., Sun, Q., Bernstein, A.M. et al. (2011). Red meat consumption and risk of type 2 diabetes: 3 cohorts of US adults and an updated meta-analysis. Am J Clin Nutr 94, 1088-1096.