Not just loading and age: the dynamics of osteoarthritis, obesity and inflammation

Body fat is not an inert structure

Obesity is a well recognised risk factor for osteoarthritis (OA).1 It is commonly believed that obesity affects joints through loading. However, there must be additional mechanisms since, for decades, obesity has been known to be a strong risk factor for hand OA. Given that we do not walk on our hands, an effect of obesity through loading of the joints cannot be the whole explanation. An understanding of the potential mechanisms by which obesity affects joints is important for optimising the treatment and prevention of OA.

Work over the past decade using magnetic resonance imaging has enabled the assessment of factors affecting joints across the spectrum of the disease, from normal asymptomatic joints to symptomatic OA.1 This has made it possible to examine the effect of obesity on joints, and to untangle the issue of whether obesity causes OA or whether OA-related pain causes obesity through modification of lifestyle behaviours and consequent weight gain. This work has shown that obesity is a causative factor in the development of OA, with increased weight being associated with early articular cartilage damage, well before symptoms develop.1 Obesity is an important risk factor for OA across a wide range of joints, including hands, back, hip and knee.

Having established the importance of obesity as a causative factor for OA, it is important to consider potential mechanisms and to recognise that measures of obesity, such as weight and body mass index, have limited usefulness because they do not provide information regarding body composition. For example, body composition may be very different in two men with an identical body mass index of 30 kg/m2: one may have a very high proportion of muscle, while the other may have a very high proportion of fat. Several studies have examined the effect of body composition, particularly fat mass, on joint health.1 A large body of evidence has shown that an increase in fat mass is associated with pre-clinical OA.1 Increased fat mass is also associated with faster loss of knee cartilage and an increased likelihood of joint replacement. An increase in fat mass is also associated with more back pain and disability2 and foot pain.3

The findings that increased fat mass is associated with early through to late OA, independent of obesity, suggest that the effect of obesity on the joint may be via metabolically driven inflammation. It is well recognised that body fat is not an inert structure but rather a highly metabolically active tissue that produces inflammatory molecules, including cytokines and adipokines, that have been shown to damage joints.4 Circulating levels of inflammatory cytokines5 and low-grade synovitis are associated with cartilage loss.6 Higher levels of the adipokine leptin are also independently related to increased cartilage loss, suggesting a systemic mechanism for the effect of obesity on knee cartilage.7 Thus the old paradigm of OA being a degenerative, wear-and-tear disease of older age, and not an inflammatory disease, has been challenged.

So what are the implications of these findings? For weight-bearing joints, as obesity affects joints through both mechanical loading and metabolically driven inflammation, the effects are synergistic; the joint that is being loaded, rather than being healthy, is also subjected to low-grade inflammation — a double “hit”. Thus if a patient is carrying 20 kg extra weight, they are not carrying 20 kg of inert fat. The individual is carrying 20 kg of metabolically active tissue that is not only overloading the joint, but also producing inflammatory molecules resulting in a more vulnerable joint being loaded. The inflammatory mechanisms may also contribute to some of the obesity-related risk for non-weight-bearing joints.

The inflammatory mechanism for obesity-related damage to joints highlights the importance of preventing obesity in early life to avoid early joint damage. Such damage sets up a vicious cycle of further joint damage through both inflammation and loading. It may also contribute to the increased risk of cardiovascular disease seen in those with OA.9 Preventing early weight gain is potentially a more achievable and effective option than weight loss in later life.9,10 Once disease is established, weight maintenance may be a more feasible goal than weight loss for minimising pain and structural progression in joints such as the knee.1,9 With our increasingly obese population and its associated burden of osteoarthritis, novel therapies aimed at targeting inflammatory pathways warrant further investigation.

[Perspectives] Medicinal plants—the next generation

Medicinal plants have long had a role in supporting the health of human populations. Our Palaeolithic hunter-gatherer ancestors possessed extensive knowledge of the nutritional-medicinal properties of surrounding vegetation. Archaeological evidence suggests that bands of prehistoric people may have commonly reserved a role for shamans who had knowledge of the location and use of medicinal plants. The beginnings of the shift to agriculture around 12 000 years ago altered human relations with the natural landscape, reducing the biodiversity of plant species used by people.

Low stress resistance leads to type 2 diabetes: study

A recent study published in Diabetologia (the journal of the European Association for the Study of Diabetes) has found that 18-year-old men with low stress resistance have about a 50% higher risk of developing type 2 diabetes later in life.

The population-based study examined all 1,534,425 military conscripts in Sweden during 1969–1997 who underwent psychological assessment to determine stress resilience and had no previous diagnosis of diabetes.

They were followed up for type 2 diabetes from 1987 to 2012, to a maximum attained age of 62 years.

During follow-up, 34,008 men were diagnosed with type 2 diabetes.

After adjusting for body mass index, family history of diabetes, and individual and neighbourhood socioeconomic factors, the study found that the 20% of men with the lowest resistance to stress were 51% more likely to have been diagnosed with diabetes than the 20% with the highest resistance to stress.

Authors Dr Casey Crump of the Department of Medicine, Stanford University, Stanford, CA, USA, and colleagues in Sweden and the USA acknowledge that lifestyle behaviours related to stress, including smoking, unhealthy diet and lack of physical activity, could contribute to the increased risk of diabetes. The study also could not draw any conclusions about women, as it included only male conscripts.

The authors conclude: “These findings suggest that psychosocial function and ability to cope with stress may play an important long-term role in aetiological pathways for type 2 diabetes. Additional studies will be needed to elucidate the specific underlying causal factors, which may help inform more effective preventive interventions across the lifespan.”

Potato consumption linked to gestational diabetes

A study published in the BMJ has found a link between a woman’s pre-pregnancy consumption of potatoes and her risk of developing gestational diabetes.

The researchers from the Eunice Kennedy Shriver National Institute of Child Health and Human Development and Harvard University tracked 15,632 women over a 10-year period, which resulted in 21,693 singleton pregnancies.

Of these pregnancies, 854 were affected by gestational diabetes.

After taking into account risk factors such as age, family history of diabetes, diet quality, physical activity and BMI, researchers found that higher total potato consumption was significantly associated with an increased risk of gestational diabetes.

The researchers found that substituting two servings of potatoes a week with other vegetables, wholegrains or legumes was associated with a 9–12% lower risk of gestational diabetes.

They say one explanation for the findings is that potatoes, because of their high starch content, have a high glycaemic index and can trigger a rise in blood glucose levels.

The most recent Australian dietary guidelines, released in 2015, say Australians need to eat fewer starchy vegetables.

The authors acknowledge that the observational nature of their study means no definite conclusions can be drawn about cause and effect.

However, they conclude: “Higher levels of potato consumption before pregnancy are associated with greater risk of GDM, and substitution of potatoes with other vegetables, legumes, or whole grain foods might lower the risk.”

[Case Report Comment] Vitamin A deficiency in adolescents: rare or underdiagnosed?

In The Lancet, Samantha Simkin and colleagues1 report a case of preventable progressive blindness in an adolescent, caused mainly by vitamin A deficiency. Vitamin A deficiency is considered rare in high-income countries, and although the patient had concurrent systemic infections and mononeuropathy, he was diagnosed only 2 years after the appearance of his first symptoms. After extensive and expensive investigations, a repeated, detailed medical history uncovered long-term dietary restrictions. These nutritional disturbances, together with corneal signs and recurrent severe infections, raised the possibility of vitamin A deficiency.

Chromium supplements linked to carcinogens: research

An Australian research team has raised concerns about the long-term use of nutritional supplements containing chromium.

UNSW and University of Sydney researchers say chromium partially converts into a carcinogenic form when it enters cells.

The findings are published in the chemistry journal Angewandte Chemie.

Chromium exists primarily in two forms: trivalent chromium (III), which is sold in nutritional supplements such as chromium (III) picolinate, and hexavalent chromium (VI), its ‘carcinogenic cousin’.

The team was led by Dr Lindsay Wu from UNSW’s School of Medical Sciences and Professor Peter Lay from the University of Sydney’s School of Chemistry. The researchers treated animal fat cells with chromium (III) in the laboratory and, using a synchrotron’s X-ray beam, created a map of every chemical element contained within the cell.

“The high energy X-ray beam from the synchrotron allowed us to not only see the chromium spots throughout the cell but also to determine whether they were the carcinogenic form,” said Dr Wu.

“We were able to show that oxidation of chromium inside the cell does occur, as it loses electrons and transforms into a carcinogenic form.

“This is the first time this was observed in a biological sample,” Dr Wu said.

Professor Lay said the finding raises concerns about the possible carcinogenic effects of chromium supplements.

“With questionable evidence over the effectiveness of chromium as a dietary supplement, these findings should make people think twice about taking supplements containing large doses of chromium,” Professor Lay said.

“However, additional research is needed to ascertain whether chromium supplements significantly alter cancer risk.”

There is controversy over whether the dietary form of chromium is essential.

Chromium supplements are sometimes used to treat metabolic disorders; however, they are also commonly used for weight loss and body building.

Australia’s current National Health and Medical Research Council Nutrient Reference Values, which are currently under review, recommend 25-35 micrograms of chromium daily as an adequate intake for adults.

Trace amounts of chromium (III) can be found in some foods; however, these findings are unlikely to apply to such small dietary intakes.

[Series] Stopping tuberculosis: a biosocial model for sustainable development

Tuberculosis transmission and progression are largely driven by social factors such as poor living conditions and poor nutrition. Increased standards of living and social approaches helped to decrease the burden of tuberculosis before the introduction of chemotherapy in the 1940s. Since then, management of tuberculosis has been largely biomedical. More funding for tuberculosis since 2000, coinciding with the Millennium Development Goals, has yielded progress in tuberculosis mortality but smaller reductions in incidence, which continues to pose a risk to sustainable development, especially in poor and susceptible populations.

Studying the Thenar Eminence of Amateur cooKs (STEAK) study: a double-blinded, cross-sectional study

Steak browning is the result of the protein myoglobin being denatured by heat, and is strongly correlated with heterocyclic amine formation.1 Heterocyclic amines are suspected to be a risk factor in colorectal cancer because of their association with oxidative stress, so that overcooked meats may be carcinogenic.2 On the other hand, the levels of potentially toxic bacteria, including Campylobacter jejuni, Escherichia coli O157:H7, Salmonella spp. and Listeria monocytogenes, rapidly decline the more a steak is cooked.3 At the same time, lean beef has been found to have positive cardiovascular health benefits in that it reduces low-density lipoprotein-cholesterol levels, and should thus not be excluded from a balanced diet.4 As a result, the importance of determining the doneness of a steak is not limited to the fancy of gastronomes, but is an important health question.

Several methods have been developed to assess the doneness of a steak, including the invasive techniques of internal steak temperature monitoring and visual assessment.5 A third, non-invasive technique is the “finger test”, using the thenar eminence of the human hand. The thenar eminence is made up of the abductor pollicis brevis, flexor pollicis brevis and opponens pollicis muscles. This method for determining steak doneness compares the tension of the surface of the steak with that of the thenar eminence while the hand is in different positions. The objective of this study was to determine the accuracy of the finger test.

Methods

This was a double-blinded, cross-sectional study. Ethics approval was obtained from the Monash University Human Research Ethics Committee (MUHREC, approval CF15/441 – 2015000216). Our reporting of this research conforms with the STROBE statement on cross-sectional studies.6

Participant selection

The researchers advertised the research sessions by word of mouth in Melbourne, Australia. Participants were included in the study if they were over 18 years of age, were not a professional cook or enrolled in a course leading to a qualification as a professional cook, and were able to attend a research session during a 14-week study period.

All participants provided written consent before participating in the study. Participant sex and age were recorded, as well as data on how often the participant cooked a hot meal each week and their self-rated steak-cooking ability (on a scale of 0, unable to cook steak, to 10, a master steak cook).

Participants were then provided with written instructions and photographs, and a demonstration of how to conduct the finger test to determine the doneness of the sample steaks. A steak was considered rare if it had the same tenseness as the thenar muscles during a gentle pinch between the thumb and index finger. In a similar manner, a medium-rare, medium or well-cooked steak has the same tenseness as the thenar muscles during a gentle pinch between the thumb and the middle, ring or little fingers respectively (Box 1).

Steak preparation

We used Australian beef porterhouse steaks, purchased from Aldi in packs of four and stored at 4°C. How well the steak was to be cooked was determined by a computer-based random number generator. Each participant in a research session tested the same six steaks in the same order. We collected data on steak weight, cooking time and internal temperature immediately before and after cooking.

All steaks were cooked by one of the authors (TV) in a Crofton non-stick cooking pan (Aldi). The stoves used included gas burner and induction-heated models. Oil and seasoning were not used during cooking. Steak doneness was monitored by assessing the internal steak temperature during cooking, recorded by a wireless grilling thermometer (Bar B Chek, model ET-2213AU, Maverick Industries) with the temperature skewer passing through the long axis of the steak. Steaks were cooked to 40°C and then turned onto the uncooked side. After the steak reached the predetermined temperature, it was removed from the pan. A rare steak was cooked to 53°C, a medium-rare steak to 58°C, a medium steak to 63°C, and a well-done steak to 75°C. All steaks rested for at least 2 minutes before being tested by a participant.
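
For readers who want the protocol at a glance, the cooking targets and the corresponding finger-test positions can be expressed as a simple lookup table. The sketch below (Python; all names and the structure are our own, not part of the study) encodes the temperatures from this section and the pinch positions from Box 1.

```python
# Illustrative sketch of the STEAK cooking protocol (our own encoding;
# temperatures are those reported in the Methods, pinches from Box 1).

TURN_TEMP_C = 40      # steaks were turned onto the uncooked side at 40 deg C
REST_MINUTES = 2      # minimum rest before a participant tested the steak

# Target internal temperature for each doneness level, and the
# finger-test pinch whose thenar tension it is compared against.
DONENESS = {
    "rare":        {"target_c": 53, "pinch": "thumb to index finger"},
    "medium-rare": {"target_c": 58, "pinch": "thumb to middle finger"},
    "medium":      {"target_c": 63, "pinch": "thumb to ring finger"},
    "well-done":   {"target_c": 75, "pinch": "thumb to little finger"},
}

def ready_to_remove(doneness: str, internal_temp_c: float) -> bool:
    """True once the steak has reached its predetermined target temperature."""
    return internal_temp_c >= DONENESS[doneness]["target_c"]
```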

Data collection

Participants were isolated before being asked to sequentially estimate the doneness of three steaks using the finger test. After each participant had rated the first three steaks (pre-results), all participants were provided with written feedback on how well each steak had been cooked and how this compared with the participant’s estimates. Participants were then isolated again and asked to sequentially rate three more steaks using the finger test (post-results). Participants were given one steak at a time by the supervising researcher, and were not allowed to alter their response after moving on to the next steak. The supervising researcher and participant were both blinded as to how well the steak had been cooked. All steaks were presented with the first cooked surface face-up to minimise visual cues that may have confounded results.

Outcomes

Our primary aim was to determine whether participants could estimate, better than chance, how well a steak had been cooked, and whether their estimates improved with experience.

Our secondary aims were to determine whether a participant’s estimates were correlated with their age, sex, cooking experience or self-rated steak-cooking ability. We also examined steak-related variables, including weight and total cooking time, and whether participants generally over- or underestimated how well the steaks had been cooked.

Statistical methods

For our primary outcome, we used a χ2 goodness-of-fit test to assess whether participants successfully estimated steak doneness more frequently than would be expected by chance (25%). We compared pre-result and post-result outcomes using the McNemar test. We assumed a Gaussian distribution for our participants’ demographic characteristics and the steak-related variables, and therefore used Pearson correlation coefficients to quantify the correlation between these variables and the proportion of successful estimates.
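
As a rough illustration of this analysis plan, the sketch below (Python with scipy; our own reconstruction, not the authors’ code) shows how the goodness-of-fit test against 25% chance and the Pearson correlations could be computed. The McNemar comparison would additionally require the paired pre/post 2 × 2 table, which is not published here.

```python
# Sketch of the analysis plan (our reconstruction, not the authors' code).
from scipy.stats import chisquare, pearsonr

def chance_test(correct: int, total: int):
    """Chi-squared goodness-of-fit of correct/incorrect counts
    against the 25% success rate expected by chance."""
    observed = [correct, total - correct]
    expected = [0.25 * total, 0.75 * total]
    return chisquare(f_obs=observed, f_exp=expected)

def accuracy_correlation(participant_variable, proportion_correct):
    """Pearson correlation of a participant or steak variable with the
    proportion of correct doneness estimates (assumes Gaussian data)."""
    return pearsonr(participant_variable, proportion_correct)
```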

After collecting the data, we decided to also evaluate whether participants repeatedly under- or overestimated the doneness of the tested steaks. To do this, we assigned values of −3 to +3 to each estimate, expressing the relationship of the estimated doneness with our internal control (eg, an estimate of well-done for a steak that was cooked medium was scored as +1). We then used a one-sample t test to compare pre- and post-results with a theoretical mean of 0 (ie, no difference), and compared pre-result and post-result differences with unpaired t tests.
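
To make this scoring concrete, a minimal sketch follows (our own encoding of the doneness levels; the example data are hypothetical, not the study’s).

```python
# Post-hoc under/overestimation scoring (our reconstruction).
from scipy.stats import ttest_1samp

# Ordinal encoding of doneness; the -3..+3 score is estimated minus actual,
# e.g. estimating "well-done" for a medium steak scores +1.
LEVELS = {"rare": 0, "medium-rare": 1, "medium": 2, "well-done": 3}

def signed_error(estimated: str, actual: str) -> int:
    return LEVELS[estimated] - LEVELS[actual]

# Hypothetical example data: compare the mean signed error with 0.
errors = [signed_error("medium", "well-done"),   # -1: underestimate
          signed_error("well-done", "medium"),   # +1: overestimate
          signed_error("rare", "medium")]        # -2: underestimate
t_stat, p_value = ttest_1samp(errors, popmean=0)
```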

All analyses were performed with GraphPad PRISM (version 6.0g, GraphPad Software); P < 0.05 (two-tailed) was defined as statistically significant.

Results

Participants

We recruited 27 participants, but one was unable to commence data collection and was excluded from the final analysis. Each participant assessed the doneness of six steaks, resulting in 156 data points. Of our 26 participants, 10 were men (38%) and the median age was 26 years (range, 24–79 years); they each cooked a median of three hot meals per week (range, 0–7) and their median self-rated steak-cooking ability was 5 out of 10 (range, 0–10).

Primary outcomes

Participant accuracy in determining steak doneness is summarised in Box 2. For the pre-result assessments, participants did not correctly estimate the doneness of steaks more frequently than by chance (χ2[1, n = 78] = 2.07; P = 0.15), but were able to estimate doneness better than chance in the post-result stage (χ2[1, n = 78] = 9.04; P < 0.01); the same applied to the overall results (χ2[1, n = 156] = 9.88; P < 0.01). The McNemar test indicated there was no significant improvement between pre- and post-result assessments (P = 0.14).
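
These χ2 statistics follow directly from the counts in Box 2; as a quick check (our own verification in Python with scipy, not part of the original analysis):

```python
# Reproducing the chi-squared statistics from the Box 2 counts.
from scipy.stats import chisquare

for label, correct, total in [("pre", 25, 78), ("post", 31, 78), ("overall", 56, 156)]:
    res = chisquare(f_obs=[correct, total - correct],
                    f_exp=[0.25 * total, 0.75 * total])
    print(f"{label}: chi2 = {res.statistic:.2f}, P = {res.pvalue:.3f}")
# pre: chi2 = 2.07, P = 0.150; post: chi2 = 9.04, P = 0.003;
# overall: chi2 = 9.88, P = 0.002 -- matching the reported values
```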

Secondary outcomes

Correlations of the accuracy in determining steak doneness using the finger test with sex, age, how many times a week a person prepared a hot meal, and the participant’s self-rated steak-cooking ability were not statistically significant (Box 3). The steak’s initial weight and cooked doneness (rare v medium-rare v medium v well-done) were also not statistically correlated with the participant’s probability of correctly estimating doneness (Box 3).

Participants underestimated the doneness of pre-result steaks by an average of 0.56 points (95% CI, −0.85 to −0.28 points; P < 0.001) and post-result steaks by 0.08 points (95% CI, −0.36 to +0.21 points; P = 0.60). The pre-result versus post-result finger test difference was +0.49 points (95% CI, +0.08 to +0.89 points; P < 0.05).

Discussion

Key results

Participants in our study were able to use the finger test to determine, better than chance, how well a steak had been cooked. There was a trend to improvement with practice, as shown by the difference between the pre- and post-result assessments, but this difference was not statistically significant. We did not identify any participant demographic characteristics or steak variables that were correlated with greater accuracy in using the finger test.

Although participants underestimated the doneness of the steaks by 0.56 points during the pre-result stage, this difference did not amount in practice to a full whole-number doneness interval. The 95% CI for this calculation did not include −1.00, so it is unlikely that participants were routinely underestimating steak doneness by an entire level during this stage.

While participants were able to use the finger test to improve the probability that they could determine how well their steak had been cooked, an overall accuracy of only 36% (56 of 156 assessments) shows that its practical application is likely to be limited. In particular, we recommend against readers using the finger test to determine the doneness of steaks for the purpose of returning the steak to the cook for further preparation. In such cases, the reader might find their steak returned, the degree of doneness unchanged, but the steak newly marinated in excess juices from the cook’s anger-provoked sialorrhea.

Limitations

Most of our participants (23 of 26, 88%) were 30 years old or younger. A broader range of ages, particularly one weighted towards older participants, might yield a different outcome, as older participants are likely to have cooked, on average, many more steaks in their lifetime, a factor that our analysis would not have captured.

Many participants in our study commented that it was difficult to determine how well the steak had been cooked because they felt different degrees of doneness in different parts of the steak. Participants were uniformly asked to provide their best estimate using the finger test, but uncontrolled variables, particularly fat content and its distribution, may have reduced the sensitivity of the finger test.

Recommendations

The finger test has shown a small benefit for amateur cooks, and future research should look at its applicability to other types of meat (eg, pork or lamb) and cooking techniques (eg, boiling or grilling).

Given the results of our study, we suggest that amateur cooks and those wishing to reduce their risk of acute food poisoning or potential carcinogen intake continue to use the invasive tests (ie, internal steak temperature or visual assessment) to determine steak doneness.

Box 1 –
Hand positions for determining the doneness of steak using the thenar eminence: A, raw; B, rare; C, medium-rare; D, medium; E, well-done

Box 2 –
Accuracy of steak doneness assessment by the 26 participants

Result of assessment    Pre-test    Post-test    Overall
Incorrect                  53          47        100 (64%)
Correct                    25          31         56 (36%)
Total                      78          78        156 (100%)


Box 3 –
Correlation of participant characteristics and steak variables with the proportion of correct estimates of steak doneness

                                     Pearson r      P
Participant demographics
  Sex                                   0.28      0.16
  Age                                   0.02      0.94
  Hot meals per week                    0.02      0.93
  Self-rated steak-cooking ability      0.03      0.90
Steak variables
  Steak weight                          0.21      0.32
  Steak doneness                       −0.30      0.15


[Correspondence] Tackling preventable diseases in Yemen

The health-care system in Yemen has deteriorated since the start of the war in March, 2015. Impairment exists at all levels of health services, from the improper functioning of health-care facilities to shortages of basic and life-saving needs, such as drugs, water, and fuel. This continuous, unresolved crisis has led to a rise in preventable diseases and other health problems, such as infectious diseases, malnutrition, diarrhoea, and unnecessary organ loss.1,2

Oversleeping linked to increased mortality

It’s not just smoking and high alcohol consumption that we should advise our patients to avoid if they want to live a long life.

A Sydney University study has found that regularly sleeping longer than nine hours a night can also increase the risk of mortality.

The study, published in PLOS One, found that on its own, regular oversleeping was associated with a 44% increase in risk of death over the six-year study period.

It also found that sitting in a chair for more than seven hours in a 24-hour period is a significant health risk.

The researchers gave a lifestyle questionnaire to 231,048 Australians aged 45 years or older who were participating in the Sax Institute’s 45 and Up Study. Participants were scored on six health behaviours.

The six risk behaviours are:

  • High alcohol consumption
  • Poor diet
  • Physical inactivity
  • Smoking
  • Spending more than seven hours a day sitting down
  • Sleeping for more than nine hours a night

Over 90% of the participants had at least one of the 30 most commonly occurring risk factors and combinations, including physical inactivity, sedentary behaviour and/or long sleep duration. Combinations involving smoking and high alcohol consumption were the most strongly associated with all-cause mortality.

Dr Melody Ding, one of the study authors, told ABC Radio: “The most intriguing was the 44% risk increase for those who are sleeping more than 9 hours a night. When you combine too much sleep with physical inactivity… then you find the risk for death has increased 149%.

“People who are sleeping too much, sitting a lot and also not being physically active then you’re looking at a combined risk increase of four times.”

Another author, Associate Professor Emmanuel Stamatakis told Fairfax Media: “One of the possible explanations is ‘reverse causality’. Long sleeping times could be indicative of an underlying, undiagnosed disease.”

However he also said the way the survey was written could be a possible explanation:  “In the survey, people were asked ‘How long did you sleep?’ This most likely elicits an answer to the question: ‘How long were you in bed?’

“This says nothing about the quality of the sleep,” Dr Stamatakis said. “So, reported long sleep duration could in fact be indicative of fragmented, restless and poor-quality sleep.”

The results showed that a person who has all six bad habits is more than five times as likely to die during a six-year period as one who is very clean-living.

Interestingly, high alcohol consumption on its own was the least risky behaviour, associated with just an 8% increase in mortality risk.

Dr Stamatakis said this shouldn’t give people “licence to drink”.

“General population studies show exactly the opposite result. These show that harmful effects from alcohol start from moderate consumption levels,” he said.
