Breast-feeding and preventing obesity
Claim: One of the arguments often made in favor of breast-feeding is that breast-fed babies are less likely to grow into chunky children—and chunky adults—than babies who are formula fed. That’s the official position of several U.S. government agencies, including the Centers for Disease Control and Prevention and the Department of Health and Human Services.
New research: Unfortunately, recent research shows that it’s just not true. The study reported late last month, conducted by Karin Michels of Harvard Medical School and her co-workers, looked at the weight status—normal, overweight, or obese—of about 35,000 registered nurses, each weighed several times since 1989. The researchers also asked the women’s mothers whether the women studied were fully breast-fed, fully formula-fed, or partially breast-fed. Michels’ team then looked to see if there was a correlation between duration of breast-feeding and protection from obesity later in life.
Findings: The results showed a small tendency toward leanness at age 5 in the women who had been exclusively breast-fed for more than six months. But this small tendency did not persist into adulthood, when there was no relationship between weight and having been breast-fed as an infant: The breast-fed women were just as likely to be overweight as the formula-fed ones.
Reflections: Pediatricians won’t be surprised at the finding that 5-year-olds who had been exclusively breast-fed for an extended period were a little leaner than girls who had been partly or completely formula-fed. The early growth curves show that formula-fed babies almost always gain weight a little faster (which sometimes makes breast-feeding mothers a bit anxious). Many explanations have been advanced for this; it likely has to do with a natural limitation in the amount of calories a nursing mother can supply from breast milk alone. There is no evidence of either harm or benefit from the observed difference in growth rates.
The surprise for doctors is that the leanness doesn’t carry beyond childhood, since for years we have all been taught otherwise. Also, mothers who breast-feed exclusively and mothers who formula-feed tend to differ socioeconomically, and it’s surprising those differences weren’t reflected in their daughters’ weight status (since obesity rates are higher among the poor).
Conclusion: I feel bad that once again I am in the awkward position of pointing out that a long-cherished belief about breast-feeding doesn’t appear to hold up. I favor breast-feeding, and I know that some people worry that this study may weaken some women’s resolve to do it (that was the reason I got some really mean mail the last time). But I have a strong belief that everyone deserves access to all the facts so they can make properly informed decisions. Hiding or disguising inconvenient ones in support of an agenda, even one that I agree with, is unacceptable.
The new no-period Pill
Question: The ambivalent feelings that many women have toward their periods are brightly illuminated in the debate about Lybrel, a soon-to-be-released oral contraceptive that eliminates menstruation. I have no intention of weighing in on the attendant psychological, cultural, or endocrinological issues (for that, read this Slate piece by Sarah Richards). I’ll answer just one question: How do you square taking Lybrel with the warning that you need to get your period every month to decrease your risk of uterine cancer? That’s what some gynecologists (well, at least one) instruct their patients, I’ve been told.
Higher risk: The answer is simple: The directive is wrong. The risk of uterine cancer (usually cancer of the endometrium, the lining of the uterus) has nothing to do with whether you have a period every month. Instead, the risk seems to depend mainly on the degree to which the lining of the uterus is exposed to estrogen, with more total exposure correlating with more risk. Women whose periods began early in life and women who reached menopause later are at a slightly increased risk because their uterine lining is exposed to estrogen for a somewhat longer-than-average time. Women who were never pregnant are at somewhat greater risk because, during pregnancy, higher levels of progesterone decrease the body’s level of estrogen and balance its effect. Overweight women are also at somewhat greater risk because excess fat increases estrogen levels.
Lower risk: Taking birth-control pills for at least five years, on the other hand, decreases the risk for uterine cancer. This may be because the estrogen dose in most oral contraceptives is pretty low and because it is balanced by the progesteronelike hormone also contained in the Pill. (Similarly, women who use post-menopausal hormone-replacement treatment that contains only estrogen are at much higher risk for developing uterine cancer than post-menopausal women who take pills combining estrogen and some balancing progesteronelike hormone.)
Conclusion: My guess is that, whatever the other pros and cons, women using Lybrel will probably turn out to be at slightly lower risk for uterine cancer—but I have to stress that this is only a guess.
Preventing schizophrenia
History: Schizophrenia, which often develops or first manifests late in adolescence, is typically characterized by dramatic misperceptions of reality. Often, there is an early “prodromal” period in which suggestive signs of the illness are seen but the actual disease has not yet appeared. In the past decade, researchers have tried to abort the development of schizophrenia with medication during the prodromal phase. This approach was first reported about five years ago by Alison Yung and Patrick McGorry at the University of Melbourne in Australia and shortly after by Thomas McGlashan at Yale. Both research groups identified patients with prodromal symptoms, then treated some of them early on with second-generation (often called “atypical”) antipsychotic drugs. Follow-up studies showed that the early-treated patients had a much lower rate of development of schizophrenia than patients not treated with active antipsychotic medication.
Problem: But there was a catch. (Isn’t there always one?) Some of the patients with prodromal symptoms never went on to develop active schizophrenia, with or without treatment. And some of those treated developed the typical significant side effects from the drugs—large weight gain, for example, and a risk of diabetes or other problems with sugar metabolism. This raised an ethical quandary: Was it right to expose patients to a potentially risky treatment if they might not need it and might suffer side effects? Faced with this dilemma, most researchers stopped advocating treatment for patients with prodromal symptoms and instead tried to establish which prodromal symptoms most accurately predict the development of the disorder.
New research: But what if we were able to treat prodromal patients with a medication free of the side effects associated with antipsychotics? Just that possibility has been raised by some research recently reported by Barbara Cornblatt of the Albert Einstein College of Medicine. Her team studied a group of 48 adolescents who had prodromal symptoms. They were treated by psychiatrists who were not involved in the research and who had decided to give them either antidepressants or one of the newer “atypical” antipsychotic drugs. Because the prescribing psychiatrists were free to pick the kind of drug they thought would work best, we can compare results only by class of drug, not by individual medication. Nine different antidepressants—mostly SSRIs—were used for patients in the antidepressant group, and four different second-generation antipsychotics were used to treat patients in the other group.
Findings: The study was not designed to compare the value of these two kinds of medication, since the assignment to the treatment groups was not random. Still, it is hard not to notice that none of the 20 patients treated with antidepressants went on to develop a psychotic illness, whereas 12 of the 28 patients treated with antipsychotics (43 percent) developed schizophrenia despite the treatment. We need to be careful—perhaps all this means is that the patients already verging on schizophrenia were more likely to be started on an antipsychotic drug. But if this finding holds up in future studies in which patients are randomly assigned to treatment groups, it would be tremendously exciting. Perhaps, for the first time, we’ll have a safe and effective method to avert the development of schizophrenia.