Medpage Today. March 2013
In the Nurses' Health Study (NHS), which began in 1976, women ages 30 to 55 living in states with the highest ultraviolet B (UVB) intensity had a 21% lower risk for rheumatoid arthritis (RA) compared with those living in states with low UVB levels (hazard ratio 0.79, 95% CI 0.66 to 0.94, P=0.005 for trend), according to Elizabeth Arkema, PhD, and colleagues from Harvard University.
But in NHSII, the second Nurses' Health Study cohort, initiated in 1989 among women ages 25 to 42, no significantly lower risk was seen (HR 1.12, 95% CI 0.87 to 1.44, P=0.37 for trend), the researchers reported online in Annals of the Rheumatic Diseases.
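The percentage figures quoted above follow directly from the hazard ratios: a hazard ratio below 1 corresponds to a proportional reduction in risk. A minimal sketch of that conversion (the function name is illustrative, not from the study):

```python
def pct_risk_change(hazard_ratio):
    """Convert a hazard ratio to a percent change in risk.

    Positive values indicate lower risk, negative values higher risk.
    """
    return round((1 - hazard_ratio) * 100)

# NHS: HR 0.79 corresponds to a 21% lower risk
print(pct_risk_change(0.79))   # 21

# NHSII: HR 1.12 points the other way (a nonsignificant 12% increase)
print(pct_risk_change(1.12))   # -12
```

Note that the confidence interval, not the point estimate alone, determines whether the reduction is statistically significant; the NHSII interval (0.87 to 1.44) crosses 1.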
"The later birth cohort of NHSII participants (born between 1946 and 1964) were more likely aware of the dangers of sun exposure and, perhaps, had more sun-protective behavior, making residential UVB not as good a proxy for actual sun exposure in NHSII," they suggested.
Epidemiologic studies have found a correlation between higher latitude of residence and an increased incidence of RA and other autoimmune diseases.
In addition, experimental studies have demonstrated immunosuppressive effects of UVB, such as through influences on T-cells and cytokines.
Exposure to UVB also increases vitamin D synthesis in the skin, which, in turn, has immunomodulatory properties.
To examine a possible role for sunlight exposure in RA risk, Arkema and colleagues averaged cumulative UVB flux data for 106,368 women in NHS and 115,561 women in NHSII according to the state in which they lived.
UVB flux is a measure that reflects exposure intensity based on altitude, latitude, and typical cloud cover patterns, and is expressed in Robertson-Berger units.
This measure shows considerable variability in the U.S., ranging from 196 R-B units in sunny states such as Arizona and Hawaii to only 93 units in Oregon and Alaska.
Information on residence, health, diet, and lifestyle was acquired every 2 years from participants in both cohorts.
A total of 933 women in NHS were diagnosed with RA, as were 381 in NHSII.
The mean age at RA diagnosis was younger in NHSII, at 47 years, compared with 58.7 years in NHS.
The absolute risk over 20 years for RA in NHS among women in states with the highest levels of UVB was 0.7%, compared with 1.2% among those in states with the lowest levels, for a risk difference of 0.5% (95% CI 0.2 to 0.8).
In contrast, the absolute risk was 0.5% for both highest and lowest levels in NHSII, giving a risk difference of 0% (95% CI −0.2 to 0.2).
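The risk differences reported for the two cohorts are simple subtractions of the absolute risks; a short sketch of that arithmetic (function and variable names are illustrative, not from the study):

```python
def risk_difference(low_uvb_risk_pct, high_uvb_risk_pct):
    """Absolute risk difference, in percentage points.

    A positive value means lower-UVB states carried the higher risk.
    """
    return round(low_uvb_risk_pct - high_uvb_risk_pct, 1)

# NHS: 1.2% (lowest-UVB states) vs 0.7% (highest-UVB states)
print(risk_difference(1.2, 0.7))   # 0.5

# NHSII: 0.5% at both exposure extremes, so no difference
print(risk_difference(0.5, 0.5))   # 0.0
```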
Similar findings, with decreased risk at high exposure in NHS but not in NHSII, were seen for UVB exposure estimated both at birth and at age 15.
It thus remains unclear if the important window for UVB exposure is in childhood or adulthood.
Further analyses found no significant heterogeneity according to skin type, vitamin D intake, physical activity, or body mass index.
There also was no greater effect for the highest level of cumulative UVB exposure, above 164 R-B units, or the highest level of vitamin D intake, at 400 IU per day or more.
And while NHSII was a younger cohort, when risks were calculated for women in that cohort ages 52 and above, the hazard ratio was similar to that seen in NHS.
Additional adjustment for regional socioeconomic status and access to care had little effect on the risk estimates, but the researchers acknowledged that other environmental or behavioral factors could have influenced their results.
These findings add to the increasing evidence that more intense sun exposure lowers the risk of RA, the researchers stated.
"The mechanisms are not yet understood, but could be mediated by cutaneous production of vitamin D and attenuated by use of sunscreen or sun avoidant behavior," Arkema and colleagues wrote.
They called for additional research to explore UVB dose intensity and timing of exposure.
The study was supported by the NIH and a Nutritional Epidemiology of Cancer Training grant.
Primary source: Annals of the Rheumatic Diseases