Against Sunscreen Absolutism

Moderate sun exposure can be good for you. Why won’t American experts acknowledge that?

Australia is a country of abundant sunshine, but the skin of most Australians is better adapted to gloomy England than to the beaches of Brisbane. The country’s predominantly white population has by far the world’s highest rate of skin cancer, and for years the public-health establishment has warned residents about the dangers of ultraviolet light. A 1980s ad campaign advised Australians to “Slip, Slop, Slap”—if you had to go out in the sun, slip on a shirt, slop on some sunscreen, and slap on a hat. The only safe amount of sun was none at all.

Then, in 2023, a consortium of Australian public-health groups did something surprising: It issued new advice that takes careful account, for the first time, of the sun’s positive contributions. The advice itself may not seem revolutionary—experts now say that people at the lowest risk of skin cancer should spend ample time outdoors—but the idea at its core marked a radical departure from decades of public-health messaging. “Completely avoiding sun exposure is not optimal for health,” read the groups’ position statement, which extensively cites a growing body of research. Yes, UV rays cause skin cancer, but for some, too much shade can be just as harmful as too much sun.

It’s long been known that sun exposure triggers vitamin D production in the skin, and that low levels of vitamin D are associated with increased rates of stroke, heart attack, diabetes, cancer, Alzheimer’s, depression, osteoporosis, and many other diseases. It was natural to assume that vitamin D was responsible for these outcomes. “Imagine a treatment that could build bones, strengthen the immune system and lower the risks of illnesses like diabetes, heart and kidney disease, high blood pressure and cancer,” The New York Times wrote in 2010. “Some research suggests that such a wonder treatment already exists. It’s vitamin D.” By 2020, more than one in six adults were on that wonder treatment in the form of daily supplements, which promise to deliver the sun’s benefits without its dangers.

But sunlight in a pill has turned out to be a spectacular failure. In a large clinical trial that began in 2011, some 26,000 older adults were randomly assigned to receive either daily vitamin D pills or placebos, and were then followed for an average of five years. The study’s results were published in The New England Journal of Medicine two years ago. An accompanying editorial, with the headline “A Decisive Verdict on Vitamin D Supplementation,” noted that no benefits whatsoever had been found for any of the health conditions that the study tracked. “Vitamin D supplementation did not prevent cancer or cardiovascular disease, prevent falls, improve cognitive function, reduce atrial fibrillation, change body composition, reduce migraine frequency, improve stroke outcomes, decrease age-related macular degeneration, or reduce knee pain,” the editorial concluded. “People should stop taking vitamin D supplements to prevent major diseases or extend life.”

Australia’s new guidance is in part a recognition of this reality. It’s also the result of our improved understanding of the disparate mechanisms through which sunlight affects health. Some of them are intuitive: Bright morning light, filtered through the eyes, helps regulate our circadian rhythms, improving energy, mood, and sleep. But the systemic effects of UV light operate through entirely different pathways, ones less well understood by the public and even by many health professionals. In recent years, that science has received more attention, strengthening the conviction that sunlight’s benefits may be irreplaceable. In 2019, an international group of researchers issued a call to arms under the headline “Insufficient Sun Exposure Has Become a Real Public Health Problem.”

Health authorities in some countries have begun to follow Australia’s lead, or at least to explore doing so. In the United Kingdom, for example, the National Health Service is reviewing the evidence on sun exposure, with a report due this summer. Dermatology conferences in Europe have begun to schedule sessions on the benefits of sun exposure after not engaging with the topic for years.

In the United States, however, there is no sign of any such reconsideration. Both the CDC and the American Academy of Dermatology still counsel strict avoidance, recommending that everyone but infants wear sunscreen every day, regardless of the weather. When I asked the AAD about Australia’s new guidelines, a spokesperson offered only that “because ultraviolet rays from the sun can cause skin cancer, the Academy does not recommend getting vitamin D from sun exposure.”

Such a stance surely reflects understandable concerns about mixed messaging. But it also seems more and more outdated, and suggests a broader problem within American public-health institutions.

More than a century ago, scientists began to notice a mysterious pattern across the globe, which they came to call the “latitude effect.” Once you adjust for confounding variables—such as income, exercise, and smoking rates—people living at high latitudes suffer from higher rates of many diseases than people living at low or middle latitudes. The pattern plays out in many conditions, but it’s most pronounced in autoimmune disorders, especially multiple sclerosis. Throughout Europe, Australia, New Zealand, and the U.S., populations at higher latitudes are much more likely to develop MS than those closer to the equator. Over the years, scientists have offered many theories to explain this phenomenon: differences in diet, something in the water. But MS research pointed to a perhaps more obvious answer: sunlight. The higher the latitude, the lower the angle of the sun and the more its rays are filtered by the atmosphere. A number of studies have found links between sun exposure and the disease. Kids who spend less than 30 minutes a day outside on weekends and holidays are much more likely to develop MS than kids who are outside for more than one hour on these same days. Relapse rates for the disease are higher in early spring, after months of sun scarcity. People who were born in the spring (whose mothers received little sun exposure during their third trimester of pregnancy) are more likely to develop MS than people born in the fall.

Here, too, scientists first assumed that vitamin D was the key. But vitamin D supplementation proved useless for MS. Could something else about sun exposure protect against the condition?

A hint came from another disease, psoriasis, a disorder in which the immune system mistakes the patient’s own skin cells for pathogens and attacks them, producing inflammation and red, scaly skin. Since ancient times, people have observed that sunlight seems to alleviate the condition, and doctors have long recommended “phototherapy” as a treatment. But only in the late 20th century, with the recognition that psoriasis was an autoimmune disease, did they start to understand why it works.

It turns out that UV light essentially induces the immune system to stop attacking the skin, reducing inflammation. This is unfortunate when it comes to skin cancer—UV rays not only damage DNA, spurring the formation of cancerous cells; they also retard the immune system’s attack on those cells. But in the case of psoriasis, the tamping-down of a hyperactive response is exactly what’s needed. Moreover, to the initial surprise of researchers, this effect isn’t limited to the site of exposure. From the skin, the immune system’s regulatory cells migrate throughout the body, soothing inflammation elsewhere as well.

This effect is now believed to be the reason sun exposure helps prevent or ameliorate many autoimmune diseases, including MS, type 1 diabetes, and rheumatoid arthritis. It also explains why other conditions that involve a hyperinflammatory response, such as asthma and allergies, seem to be alleviated by sun exposure. It may even explain why some other diseases now believed to be connected to chronic inflammation, including cardiovascular disease and Alzheimer’s, are often less prevalent in regions with more sun exposure.

The consortium of Australian public-health groups had those potential benefits in mind when it drafted its new guidelines. “There’s no doubt at all that UV hitting the skin has immune effects,” Rachel Neale, a cancer researcher and the lead author of the guidelines, told me. “There’s absolutely no doubt.” But as to what to do with that knowledge, Neale isn’t certain. “This is likely to be both harmful and beneficial. We need to know more about that balance.”

What does one do with that uncertainty? The original “Slip, Slop, Slap” campaign was easy to implement because of its simplicity: Stay out of the sun; that’s all you need to know. It was, in a sense, the equivalent of the “Just Say No” campaign against drugs, launched in the U.S. around the same time. But the simplicity also sometimes runs afoul of common sense. Dermatologists who tell their patients to wear sunscreen even indoors on cloudy winter days seem out of touch.

Australia’s new advice is, by comparison, more scientific, yet also more complicated. It divides its recommendations into three groups, according to people’s skin color and susceptibility to skin cancer. Those with pale skin, or olive skin plus other risk factors, are advised to practice extreme caution: Keep slip-slop-slapping. Those with “olive or pale-brown skin” can take a balanced approach to sun exposure, using sunscreen whenever the UV index is at least a 3 (which is most days of the year in Australia). Those with dark skin need sunscreen only for extended outings in the bright sun.

In designing the new guidelines, Neale and her colleagues tried to be faithful to the science while also realizing that whatever line is set on sun exposure, many people will cross it, intentionally or not. Even though skin cancer is rarely fatal when promptly diagnosed, it weighs heavily on the nation’s health-care system and on people’s well-being. “We spend $2 billion a year treating skin cancer in Australia,” Neale said. “It’s bonkers how much we spend, apart from the fact that people end up with bits of themselves chopped out. So at a whole-population level, the messaging will continue to be very much about sun protection.”

That said, we now know that many individuals at low risk of skin cancer could benefit from more sun exposure—and that doctors are not yet prepared to prescribe it. A survey Neale conducted in 2020 showed that the majority of patients in Australia with vitamin D deficiencies were prescribed supplements by their doctors, despite the lack of efficacy, while only a minority were prescribed sun exposure. “We definitely need to be doing some education for doctors,” she told me. In support of the new position statement, Neale’s team has been working on a website where doctors can enter information about their patients’ location, skin color, and risk factors and receive a document with targeted advice. In most cases, people can meet their needs with just a few minutes of exposure a day.

That sort of customized approach is sorely needed in the United States, Adewole Adamson, a dermatologist who directs the Melanoma and Pigmented Lesion Clinic at the University of Texas, told me. “A one-size-fits-all approach isn’t productive when it comes to sun-exposure recommendations,” he said. “It can cause harm to some populations.” For years, Adamson has called for more rational guidelines for people of color, who have the lowest risk of skin cancer and also higher rates of many of the diseases that sunlight seems to ameliorate. Adamson finds it disheartening that mostly white Australia now has “a better official position” than organizations in the U.S., “where nonwhite Americans will outnumber white Americans in the next 20 years.”

To some degree, one can sympathize with the desire to keep things simple. People have limited bandwidth, and some may misunderstand or tune out overly complicated health messages. Others will inevitably turn a little information into a dangerous thing. A fringe segment of the alt-health crowd is already suggesting that skin-cancer dangers have been exaggerated as a way to get us all to buy more sunblock. But knowing that some people will draw strange conclusions from the facts is not a good-enough reason to withhold those facts, as we saw during the pandemic, when experts looking to provide simple guidance sometimes implied that the science was more settled than it was. This is not the 1950s. When public authorities spin or simplify science in an attempt to elicit a desired behavior, they are going to get called on it. Conspiracy-minded conclusions, among other bad ones, are likely to gain more credence, not less. And the public is going to have less faith in national institutions and the positions they espouse the next time.

Besides, in this case, the news being withheld is incredibly good. It’s not every day that science discovers a free and readily accessible intervention that might improve the health of so many people. That’s the real story here, and it’s most compelling when conveyed honestly: Science feels its way forward, one hesitant step at a time, and backtracks almost as often. Eventually, that awkward but beautiful two-step leads us to better ground.


This article appears in the June 2024 print edition with the headline “Against Sunscreen Absolutism.”
