| Abstract |
Background: Vitamin D deficiency is associated with a higher mortality risk, and serum vitamin D levels are known to vary seasonally in dialysis populations. However, whether vitamin D levels measured in a specific season, rather than at any time, are associated with mortality has not yet been studied. The aim of the present study was to explore the relationship between seasonal variations in vitamin D levels and mortality in hemodialysis (HD) patients.
Methods: This was a prospective observational study conducted at a single dialysis center. We analyzed 221 patients receiving HD, who were followed for 41 months. Serum 25-hydroxyvitamin D [25(OH)D] levels were measured at the end of spring (June), the end of summer (September), the end of autumn (December), and the end of winter (March). We explored whether seasonal differences in 25(OH)D levels were associated with mortality in HD patients.
Results: The mean age was 56.3 ± 13.4 years, and 51.1% of patients were male. The 25(OH)D level was highest in summer (13.0 ± 5.9 ng/mL) and lowest in winter (9.7 ± 4.9 ng/mL; p<0.001). Upon univariate analysis, higher serum 25(OH)D in winter (HR, 0.884; 95% CI, 0.786-0.994) and summer (HR, 0.921; 95% CI, 0.852-0.996) was associated with lower all-cause mortality. After adjustment for confounding factors, the winter 25(OH)D level remained significantly associated with all-cause mortality (HR, 0.866; 95% CI, 0.758-0.990).
Conclusion: When seasonal variation in vitamin D is taken into account, winter serum vitamin D levels may be useful for predicting mortality and monitoring vitamin D deficiency in HD patients in countries with the four distinct seasons typical of temperate climates. |