The Vitamin D Dilemma - an Essay by L. Hoover

Posted by Garnet71 on March 13, 2009, at 17:28:16

(Reposted from above)

To properly answer the question about the RDA for Vitamin D, I think we have to look at what is meant by RDA in the first place. I'm going to start from first principles, then provide some history of nutrition policy, proceed to some recent examples that call such policy into question, and then later return to vitamin D specifically.

The definition of RDA has changed substantially over the sixty years or so since it was first proposed. It was originally a military concept, aimed at the basic nutrition needed to keep troops on the battlefields of WW II from dropping from malnutrition rather than from enemy action. That was called the Minimum Daily Requirement. Soon after, the concept was broadened and two new definitions were put into use: EAR (Estimated Average Requirement) and RDA (Recommended Daily Allowance). Both were defined in terms of clinical deficiency. I've lost my exact definitions, so I can't quote them precisely, but here is my best attempt at reproducing them. EAR is that level of nutrient intake at which 50% of normal healthy people exhibit overt deficiency symptoms. RDA is that level of nutrient intake which will prevent overt deficiency symptoms in normal healthy people 97.5% of the time. In all my years of research in this field, I have never seen a definition of what is meant by normal healthy people, but even they are not all protected at this intake level. And what about everyone else?

Over the years, the definitions have been re-written and totally altered. The nutrient intakes they refer to haven't changed (much), and the people they refer to haven't changed much either (they're now adjusted for age and gender), but the message (deficiency) is now totally obscured. The most recent definitions I've found are: Estimated Average Requirement (EAR), that intake expected to satisfy the needs of 50% of the people in that age group; and Recommended Dietary Allowance (RDA), the daily dietary intake level of a nutrient considered sufficient to meet the requirements of nearly all (97-98%) healthy individuals in each life-stage and gender group. Satisfy needs? Meet requirements? I consider the changes to be political, and I'll show you why soon enough.

Let's look at a simple graph of some theoretical nutrient at varying intake levels, with EAR and RDA plotted. http://jn.nutrition.org/cgi/content/full/131/2/361S/F6 If you want the graph enlarged, just click on it. The y-axis is risk of inadequacy. Notice that it doesn't go to 100%. That's because this is actually only a segment of a more complex graph that we'll consider in a moment. EAR sits at 50% inadequacy. RDA is not at zero inadequacy. The x-axis is increasing intake, from left to right. Way over to the right is the plot of risk of excess, and to its left a newer concept, UL. You might see UL (Upper Limit) or TI (Tolerable Intake) in the literature, but they both refer to the same concept. The Tolerable Upper Intake Level is that level of chronic intake at which even the most sensitive of individuals are unlikely to experience any measurable adverse effects. Do you notice how the adverse effects of high intakes are conceptualized and treated differently than the adverse effects of low intakes? And where is the optimal intake level? It must be above the RDA, right? Logically, it seems it must be so. But before this most recent decade, optimal intake was not even considered to be a part of the formal process of nutritional assessment. Even now, it's only creeping in. More on that soon.
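
If you like numbers, here's the statistical convention behind those two markers, as a little Python sketch. I'm assuming individual requirements follow a normal distribution (which is the convention behind setting RDA about two standard deviations above EAR), and all the numbers are invented, but it shows why EAR sits at 50% risk and RDA at roughly 2.5%, not zero:

# Toy sketch of how EAR and RDA relate, assuming individual requirements
# for a hypothetical nutrient are normally distributed. Numbers invented.
from scipy.stats import norm

ear = 100.0   # EAR: the mean requirement (units/day)
sd = 15.0     # assumed spread of individual requirements

# By convention the RDA sits about 2 standard deviations above the EAR,
# covering roughly 97.5% of "normal healthy" people.
rda = ear + 2 * sd

for intake in (70, 100, 130, 160):
    risk = 1 - norm.cdf(intake, loc=ear, scale=sd)   # risk of inadequacy
    print(f"intake {intake}: risk of inadequacy {risk:.1%}")
# At the EAR the risk is 50%; at the RDA (130 here) it is ~2.3%, not zero.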

Let's look at a more complete conceptualization of the effect of varying the intake of some theoretical nutrient. http://jn.nutrition.org/content/vol133/issue5/images/large/1563sf02.jpeg This is the larger image, as the small inset was not legible.

The y-axis on the left is in percent of the population, and the x-axis along the bottom goes from zero intake increasing progressively to the right. There are eight lines plotted, and we'll consider them from left to right. The first one is labeled death. At zero intake, 100% are dead. But at some slight increase from zero intake, some people survive. As intake increases, more and more people survive, until at some point everybody is alive. Not well, but alive.

Line 2 is clinical effects (overt deficiency). This is the line that was originally used to define EAR and RDA, and is the one represented in the first graph. Now this is a little tricky to grasp, but if you follow line 1 (death) down to near zero incidence, you're at an intake at which line 2 is already dropping from 100% incidence. At the same intake level at which some people are dying, there are some people no longer showing overt deficiency symptoms. Although this is a theoretical representation, it is actually true in that regard. We're not all the same. Individual needs vary substantially.

Line 3 is labeled subclinical biomarkers of functional impairment. We're starting to see some of this creeping into medical-think, but by and large, it's not official. This might be a blood test that shows that there's still room for improvement, for example, but the overt deficiency syndrome is not clinically evident. This is what we're talking about with vitamin D when we say the RDA is inadequate, but we'll come back to that later.

Line 4 is labeled bioclinical markers without functional significance. It's in dashes because they aren't really giving it much credence, even as a theoretical construct, but I would label this curve as bioclinical markers with unknown functional significance. Just because we might not yet know why something is changing doesn't mean it doesn't matter.

Line 5 is also bioclinical markers without functional significance, but it's framed in terms of toxicity. And so on for lines 6 through 8, with the definitions mirroring those found on the left, toxicity versus deficiency.

And here's the point of this exercise: the range of intakes lying between the zero-incidence points for line 3 (subclinical biomarkers of functional impairment by deficiency) and line 6 (subclinical biomarkers of functional impairment by toxicity) defines the range of optimal intake of that nutrient. That range has been defined as the AROI (Acceptable Range of Oral Intake). Please note that the RDA, part way up line 2, lies significantly to the left (the deficiency side) of the lowest intake defined by the AROI.

This argument is not new. It was discussed in an essay by Abram Hoffer some years ago. Here's a link to that essay: http://www.internetwks.com/pauling/hoffer.html

Ya, so what, right? A balanced diet meets everyone's nutritional requirements, so what's the big deal? That is a myth. It is flat-out false. There is no such thing as a balanced diet that meets nutritional requirements.

Some years ago, after I had already had some success in treating my own symptoms with nutritional supplements, I sought out explicit descriptions of what constituted a balanced diet. Forget the food pyramid. That's totally bogus, and I don't want to get off on that tangent. Anyway, I wanted to know what a balanced diet really was, so I could compare it to my own. I couldn't find a description anywhere. So I tried to work out my own. At the time, the USDA had the most amazingly detailed nutrient analyses online. Virtually any food you could think of was broken down to describe everything it contained. You could search on a nutrient and rank-order the results to see which foods contained the most of it, and so on. But that got hugely complicated, so I went back to searching the net. I found a university nutrition program that had one of its assignments online. The assignment was to use the USDA database to design a balanced diet, defined as meeting the RDA for the 17 nutrients that then had defined intake thresholds, while simultaneously staying within reasonable caloric limits. The take-home message from this exercise, set by a very enlightened member of the nutrition establishment, was that it was not possible. The point of the exercise was to prove that it could not be done.

I then posted to the newsgroup sci.med.nutrition, and challenged anyone to show me a diet that met the RDAs within caloric targets. One poster said such a task was easy-peasy, and proceeded to download the entire USDA database onto a mainframe computer and start crunching the data. The bizarre output of this analysis yielded but one diet, which included about 25 pounds of lettuce per day (to meet the folate RDA), and some other absurd things that just don't come to mind. I don't think you could eat 25 pounds of lettuce in a day, even if you started the moment you awoke. And I can't imagine the digestive impact would be without consequences. Or that I'd want to do that day after day. I'm left to conclude that it can't be done, but I welcome anyone showing me otherwise. I don't want to be right. I want to get it right.
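
Incidentally, that kind of search is a linear programming problem. Here's a minimal Python sketch over a toy, completely invented food table (the real USDA data had thousands of foods and 17 nutrient constraints, which is exactly how you end up with answers like 25 pounds of lettuce):

# Toy version of the balanced-diet search: choose servings of each food
# so that every nutrient hits at least 100% of its RDA without blowing
# a calorie cap. All of the food data below is invented for illustration.
from scipy.optimize import linprog

foods = ["lettuce", "salmon", "milk", "bread"]
calories = [15, 200, 150, 80]            # kcal per serving (made up)
nutrients = [                            # % of RDA per serving (made up)
    [10, 2, 1, 8],                       # folate
    [0, 40, 10, 0],                      # vitamin D
    [2, 5, 30, 4],                       # calcium
]

# linprog minimizes c @ x subject to A_ub @ x <= b_ub, so the constraint
# "at least 100% of each RDA" is written with the signs flipped.
A_ub = [[-n for n in row] for row in nutrients] + [calories]
b_ub = [-100] * len(nutrients) + [2500]  # calorie cap: 2500 kcal/day

res = linprog(c=calories, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None)] * len(foods))
if res.success:
    for food, servings in zip(foods, res.x):
        print(f"{food}: {servings:.1f} servings/day")
else:
    print("No diet meets all the RDAs within the calorie cap.")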

Interestingly enough, within a year of my online challenge, the USDA website was totally revamped, and the detailed nutrient data is no longer available. I don't know if that's a coincidence or not.

Setting aside whether the balanced diet is a myth or not, what are people really consuming? What nutrients are they getting? The bulk of the data is U.S., so unless I say otherwise, we're talking about Americans. There are a few ways of determining nutrient intake, but they typically rely on the USDA database for actual nutrient content of specific foods. Here are some intake summaries (table format) that I pulled from nutrition journals. If you want to read the full-text paper that I pulled these tables from, just knock off the /T4 or whatever at the end of the URL.

One method is to ask people what they ate over a three-day period, and to estimate their nutrient intake from that. This survey included nearly 12,000 people. Here's a table with vitamin A, C, and folate intake, stratified by gender and age: http://jn.nutrition.org/cgi/content/full/131/8/2177/T4 At the 50th percentile, half the people took in less, and half took in more. Wherever a table value is below 100%, that group's intake fell short of the RDA.

Here's another table from the same paper, for iron and zinc. http://jn.nutrition.org/cgi/content/full/131/8/2177/T5

This graph of zinc intake is also from a food intake survey, this one of about 30,000 people, and it included any supplements taken. But it is presented in terms of an arbitrarily defined concept called Adequate Intake, which is only 77% of the RDA.
http://jn.nutrition.org/cgi/content/full/130/5/1367S/T4

Before I move away from zinc as a topic, I want to draw attention to something I'm going to address over and over again in this document. The hazards of high doses are given great weight in deliberations about safe exposure, whereas the hazards of insufficient doses are treated quite differently. Here is how the safety thresholds are derived. Rodents, for example, are exposed to large doses of a substance over a period of time, to determine what are known as the LOAEL (Lowest Observed Adverse Effect Level) and the NOAEL (No Observed Adverse Effect Level). The LOAEL gives a threshold of concern, while the NOAEL gives a margin of safety. The NOAEL exposure is then converted to a per-body-weight basis, and a variety of safety factors are applied to account for the fact that the study wasn't done in humans. Those safety factors can be as much as 1000-fold. In other words, they divide the NOAEL exposure, on a mg/kg body weight basis, by 1000 to give a safe chronic human exposure level. If you do that with zinc, the derived safe exposure for infants and adolescents falls below the RDA intake level: young people should never be exposed to even so much as the RDA, which is obviously quite absurd.
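
To make the arithmetic concrete, here's the general shape of that derivation in Python. The NOAEL, the factors, and the body weight are all invented for illustration; real assessments choose them case by case:

# Sketch of how a "safe" chronic exposure gets derived from animal data.
# All numbers here are invented; real assessments pick them per substance.
noael_mg_per_kg = 50.0         # NOAEL from a hypothetical rodent study

uncertainty_factors = {
    "animal to human": 10,     # interspecies extrapolation
    "human variability": 10,   # protect the most sensitive individuals
    "study limitations": 10,   # e.g., short study duration
}

total_factor = 1
for factor in uncertainty_factors.values():
    total_factor *= factor     # here: 10 * 10 * 10 = the 1000-fold case

safe_mg_per_kg = noael_mg_per_kg / total_factor
body_weight_kg = 20.0          # a child, say
print(f"derived 'safe' chronic intake: {safe_mg_per_kg * body_weight_kg:.2f} mg/day")
# If that lands below the RDA for an essential nutrient like zinc,
# the derivation has produced an absurdity.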

Here's a recent dietary analysis showing that there is no identified nutrient for which even multivitamin users all achieve the AI (Adequate Intake).
http://www.ajcn.org/cgi/content/full/85/1/280S/T1

I believe that the most common deficiency in the U.S. is in magnesium intake: fewer than 22% of Americans get the RDA.
http://www.jacn.org/cgi/content/full/24/3/166/T2

When I take all of this data together and combine it with the definitions themselves, I am absolutely convinced that the balanced diet meeting nutritional requirements is a myth. Some people can do well on it, nonetheless, and I fear it is those people preaching to all the rest of us. Kind of like: if I can do it, you can too, and if you can't do it, there's something wrong with you. Well, maybe there is something wrong with me. Maybe I'm just more sensitive to nutrient intake levels than you are. Than most of us are, for that matter.

But again, so what? Are people dying of nutrient deficiencies in the wealthiest country on Earth? Maybe. Bruce Ames seems to think so. And he's not the only one. I'm going to use numbered references from this point forward. You'll find the abstracts at the end of this essay. Look at (1), (2), (3), and (4).

(Aside: Somebody recently posted an article describing a study of self-selected post-menopausal women and the effect of multivitamins on cancer and cardiac mortality. Maybe the group they looked at isn't representative of all women, let alone humans. And maybe the dose they took was too small to have any effect. And maybe the dose came too late, in lifespan terms, to influence the outcome significantly over the brief observation period. And maybe the women who were most vulnerable had already died, or didn't have stable homes, or didn't have time to participate. And non-rejection of the null hypothesis (that there was no real difference to be found) is not evidence that there was no real difference. It simply can't be interpreted; absence of evidence is not evidence of absence.)

Are nutrient deficiencies being identified elsewhere in the world? Yes, they are, but they are also based on concepts defined the same way as the RDA, in terms of clinical deficiency symptoms. Worse yet, in these examples they use the EAR (the intake at which half the population is deficient). Look at (5) and (6). The EAR isn't a very high threshold, yet substantial numbers fail to meet even that amount.

And, back to politics. I think they don't want you to know that the science actually shows how bad our diets are. They don't want you to know that they define adequate intake solely in terms of clinical deficiency, and even then at a level that doesn't bring all people under its protection. They don't want you to know that those intake levels are defined for normal healthy people. I think I hear a lot of sheep, too.

Boy, that Lar can just go off on tangents, can't he? Weren't we supposed to be talking about vitamin D?

Okay, here it comes. I thought it was important to lay some groundwork, so that when I refer to the basic concepts, they can be more meaningfully applied to vitamin D.

Vitamin D can be consumed in foods in two common forms: ergocalciferol (D2) and cholecalciferol (D3). Both are pro-hormones, which is to say that they are not biologically active. Each must be converted to an active form, and it is control of those conversion rates that controls the biological activity within the body. Vitamin D2 is plant-derived, and is not commonly found in high concentration in our food supply. In supplement form, it has been shown to be less effective than D3 (ref 7), so this is the last I will mention it. The form synthesized in our skin on exposure to UV light is D3. In winter, anyone at latitudes greater than 45 degrees cannot synthesize D3, because the atmosphere absorbs the UV-B wavelengths that trigger the synthesis reaction. A simple guideline is that you need a UV Index of at least 3 for there to be any chance of sufficient UV exposure to produce vitamin D3 in your skin. The other variable is obviously the amount of clothing you wear in winter. Vitamin D3 is also the form found in fish and dairy, which represent the dominant dietary sources for this nutrient. Some non-fish, non-dairy foods are fortified, so there are other sources, but those fortification programs vary by jurisdiction, so it's hard to generalize.

Vitamin D is fat-soluble, so it is stored throughout the body. Reserves can last for months, which is how people in the higher latitudes make it through the winter. I would suggest that if this was not the case, humans would only exist near the equator.

Vitamin D3 is, as I mentioned, a pro-hormone. It needs to be chemically modified before it takes on its hormone activity. This is a two-step process. The liver is the primary site of the first step, which converts D3 into 25-hydroxyvitamin-D3, also called 25-hydroxycholecalciferol, calcidiol, or just 25(OH)D3. This is the substance that matters when a vitamin D blood level is taken. The next conversion is to 1,25-dihydroxyvitamin-D3, also known as 1,25-dihydroxycholecalciferol, calcitriol, or 1,25(OH)2D3. You may find any of these terms in the literature, or on a lab report, and you need to know which substance is being discussed to make any sense of it.
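
To help keep the names straight, here are the synonyms from this paragraph arranged as a simple lookup table (Python, purely as a memory aid):

# The same two metabolites go by several names in the literature and on
# lab reports. Step 1 happens mainly in the liver; step 2, classically,
# in the kidney (more on that below).
METABOLITE_SYNONYMS = {
    "25(OH)D3": [             # step 1 product; what blood tests should measure
        "25-hydroxyvitamin-D3",
        "25-hydroxycholecalciferol",
        "calcidiol",
    ],
    "1,25(OH)2D3": [          # step 2 product; the active hormone
        "1,25-dihydroxyvitamin-D3",
        "1,25-dihydroxycholecalciferol",
        "calcitriol",
    ],
}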

It is the blood concentration of 25(OH)D3 that is considered the critical value for determining your vitamin D status. Depending on where in the world you are, the minimum cut-off for an acceptable 25(OH)D3 concentration is quoted as 75 nmol/L or as 30 ng/mL; there isn't one global standard for units, unfortunately. And it is not possible to reach that minimum threshold with intakes at the RDA. (I bet you were wondering when the heck I was going to bring that back around full circle.) I've seen some people mentioning their own lab results, but it's really important to give the units in which they're measured. 1 ng/mL is equal to about 2.5 nmol/L, so it matters a great deal which unit applies to your test results.
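
Here's the conversion spelled out, in case you want to check your own lab report. The factor of roughly 2.5 comes from the molecular weight of 25(OH)D, about 400.6 g/mol:

# Converting 25(OH)D lab values between the two common unit systems.
NG_PER_ML_TO_NMOL_PER_L = 2.496   # = 1000 / 400.6 (molecular weight of 25(OH)D)

def to_nmol_per_l(ng_per_ml):
    """Convert a 25(OH)D result from ng/mL to nmol/L."""
    return ng_per_ml * NG_PER_ML_TO_NMOL_PER_L

def to_ng_per_ml(nmol_per_l):
    """Convert a 25(OH)D result from nmol/L to ng/mL."""
    return nmol_per_l / NG_PER_ML_TO_NMOL_PER_L

print(to_nmol_per_l(30))   # ~74.9 nmol/L: right at the cut-off
print(to_ng_per_ml(75))    # ~30.0 ng/mL: the same cut-off in the other unit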

What we have right now is a strange disconnect between the official RDA for vitamin D and the accepted blood concentrations of the half-activated form of the vitamin. The last revision of the RDA, in the U.S., occurred in 1997, when it was set at 200 IU/day for most adults; if you're older than 50, it's 400 IU, and at 71 and older, 600 IU. The most recent population survey data indicate that, depending on the population group (age/gender), 1-9% had serum 25(OH)D levels <11 ng/mL (<27.5 nmol/L), 8-36% had levels <20 ng/mL (<50 nmol/L), and the majority (50-78%) had levels <30 ng/mL (<75 nmol/L). Notwithstanding the official RDAs, the 2005 Dietary Guidelines for Americans indicated that the obese, people with fat malabsorption, the dark-skinned, older adults, people with limited sun exposure, and breast-fed infants ought to take supplements of 1000 IU/day (400 IU for the infants). Here's the full document: http://ods.od.nih.gov/factsheets/vitamind.asp

I want to take you back to that graph with the eight lines on it. If you will recall, the line with the RDA on it was line 2. It is defined in terms of clinical deficiency. For vitamin D, that would be rickets and osteomalacia, I have to presume. The document I just mentioned does discuss other health end-points as potential measures of deficiency, yet it does not adjust the deficiency threshold in light of those findings (that would be line 3). I'll cover some of those details in a moment, but I wanted to point out how much resistance there is to taking into account markers of sufficiency other than simply avoiding rickets or osteomalacia. I'm going to quote from this document:

Laboratory and animal evidence as well as epidemiologic data suggest that vitamin D status could affect cancer risk. Strong biological and mechanistic bases indicate that vitamin D plays a role in the prevention of colon, prostate, and breast cancers. ... Further research is needed to determine whether vitamin D inadequacy in particular increases cancer risk, whether greater exposure to the nutrient is protective, and whether some individuals could be at increased risk of cancer because of vitamin D exposure.

A growing body of research suggests that vitamin D might play some role in the prevention and treatment of type 1 [65] and type 2 diabetes [66], hypertension [67], glucose intolerance [68], multiple sclerosis [69], and other medical conditions [70-71]. However, most evidence for these roles comes from in vitro, animal, and epidemiological studies, not the randomized clinical trials considered to be more definitive. Until such trials are conducted, the implications of the available evidence for public health and patient care will be debated. Although vitamin D supplements above recommended levels given in clinical trials have not shown harm, most trials were not adequately designed to assess harm [6]. ... A recent meta-analysis found that use of vitamin D supplements was associated with a reduction in overall mortality from any cause by a statistically significant 7%.

It boggles my mind how much proof is required to move these people from their pre-determined ideology. Dogma. I hear a dogma barking.

I've just raised the broader issue of other health benefits accruing from vitamin D, apart from achieving healthy bone. How does this take place? Well, it turns out that the classic view that the kidney is the sole site of the final activation of 25(OH)D to 1,25(OH)2D is quite false. The kidney, acting as an endocrine gland distributing active vitamin D to the rest of the body, accounts for only about 20% of vitamin D activity. (I had a reference for that, but I failed to copy it. Oops.) The rest is paracrine, autocrine, and intracrine activity. Of course, I'll define these terms for you.

An endocrine gland secretes a hormone that acts throughout the body. A paracrine gland (or cell) secretes a hormone that is active nearby, within an organ or tissue compartment, for example. An autocrine gland or cell secretes a hormone that activates its own surface receptors. As you can imagine, paracrine and autocrine activity can overlap. And finally, intracrine activation occurs within the cell itself: there are vitamin D receptors (VDRs) in the cell nucleus, which regulate gene transcription. The dominant effect of kidney-derived 1,25(OH)2D is to regulate intestinal absorption of calcium and phosphate, and to indirectly modulate the parathyroid gland. It turns out that bone, for example, has its own capacity to produce 1,25(OH)2D from 25(OH)D (ref 8). And when researchers started looking for the RNA for the specific enzyme involved, they found it in many different tissues. So there may be no tissue in the body that lacks some self-regulatory capacity with respect to vitamin D.

So, what other health parameters are influenced by 25(OH)D levels? Mood was the one that started this thread. In women, low levels were associated with higher incidences of premenstrual syndrome, seasonal affective disorder, non-specific mood disorder, and major depressive disorder (ref 9). In a cross-sectional sample of older subjects, half with Alzheimer's and half nondemented, vitamin D deficiency was associated with low mood and with impairment on two of four measures of cognitive performance (ref 10). Cognitive function and mood have also been shown to depend on a complex interaction between parathyroid function and low vitamin D (ref 11).

What about other disorders? One paper reports low vitamin D status in tuberculosis, rheumatoid arthritis, multiple sclerosis, inflammatory bowel diseases, hypertension, and specific types of cancer, and notes that intervention trials have shown decreases in blood pressure, improved insulin levels in diabetics, and symptomatic relief in rheumatoid arthritis and multiple sclerosis (ref 12). In the context of specific cancers, colorectal cancer incidence can be significantly reduced by a daily dose of 1000-2000 IU (ref 13). In Norway, the season of diagnosis of lung cancer has a significant effect on survival, which the authors relate to UV-light exposure (ref 14). And the autocrine 1,25(OH)2D capacity of prostate cells suggests that vitamin D supplementation might help prevent cancer in this gland (ref 15). So far, the powers that be do not consider these health outcomes to be requirements needing to be met through vitamin D intake, expressed as RDA.

So, if the RDA is grossly inadequate, how much should one take? How much can one take?

Way back, I mentioned that a blood concentration of 75 nmol/L, or 30 ng/mL, was the minimum for optimal function. Over a six-month period, the average intake required to meet that threshold was 3440 IU/day (ref 16). Using other primates as comparators, and in the context of the absence of an effect threshold in humans, one researcher argues that 15 mcg per day does not meet the criterion of an RDA (ref 17). That was a Canadian paper, and 15 mcg/day is 600 IU/day, but it's still a very direct conclusion. The same author, in another paper, argues that the tolerable upper intake level should be at least 10,000 IU/day (it's currently set at 2,000) (ref 18).
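
Ref 17 also offers a rule of thumb: each mcg/day of D3 (1 mcg = 40 IU) raises serum 25(OH)D by roughly 1 nmol/L. Treating that as linear (which it only approximately is; see ref 16 for the measured variability), you can rough out a dose in a few lines. The starting and target values below are just examples:

# Back-of-envelope D3 dose estimate from ref 17's rule of thumb:
# each mcg/day of D3 raises serum 25(OH)D by roughly 1 nmol/L.
IU_PER_MCG = 40               # vitamin D unit conversion
RISE_PER_MCG_NMOL_L = 1.0     # ref 17's rule of thumb

def estimated_extra_dose_iu(current_nmol_l, target_nmol_l=75.0):
    """Rough extra daily D3 intake needed to raise 25(OH)D to the target."""
    shortfall_nmol_l = max(0.0, target_nmol_l - current_nmol_l)
    return shortfall_nmol_l / RISE_PER_MCG_NMOL_L * IU_PER_MCG

# Example: a winter level of 40 nmol/L, aiming for the 75 nmol/L cut-off:
print(estimated_extra_dose_iu(40.0))   # -> 1400.0 IU/day on top of current intake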

Solely because it indicates that there are people who go beyond these doses, and not because I in any way recommend such conduct: there is an anecdotal report of a man whose escalating dose reached 88,000 IU/day over four years. Some biochemical parameters started to slip outside the normal range (that's line 5 on the AROI graph), but they normalized within a few months after he discontinued. (ref 19)

So, how can I bring this all together? I believe the definition of RDA is based on an inappropriate premise, that avoiding overt deficiency symptoms is all that a nutrient might have to offer. I believe that a balanced diet cannot provide the nutrients defined even by the RDAs. I believe that malnutrition is rampant. I believe that the RDA for vitamin D is grossly inadequate, and that supplementing intake to approach 4000 IU/day total intake can be done without risk of adverse effects.

Lar

References:

(1) Arch Biochem Biophys. 2004 Mar 1;423(1):227-34.
A role for supplements in optimizing health: the metabolic tune-up.
Ames BN.
Children's Hospital Oakland Research Institute, CA 94609, USA. bruceames@chori.org

An optimum intake of micronutrients and metabolites, which varies with age and genetic constitution, would tune up metabolism and give a marked increase in health, particularly for the poor, young, obese, and elderly, at little cost. (1) DNA damage. Deficiency of vitamins B-12, folic acid, B-6, C or E, or iron or zinc appears to mimic radiation in damaging DNA by causing single- and double-strand breaks, oxidative lesions or both. Half of the population may be deficient in at least one of these micronutrients. (2) The Km concept. Approximately 50 different human genetic diseases that are due to a poorer binding affinity (Km) of the mutant enzyme for its coenzyme can be remedied by feeding high-dose B vitamins, which raise levels of the corresponding coenzyme. Many polymorphisms also result in a lowered affinity of enzyme for coenzyme. (3) Mitochondrial oxidative decay. This decay, which is a major contributor to aging, can be ameliorated by feeding old rats the normal mitochondrial metabolites acetyl carnitine and lipoic acid at high levels. Many common micronutrient deficiencies, such as iron or biotin, cause mitochondrial decay with oxidant leakage leading to accelerated aging and neural decay.

(2) J Nutr. 2003 Aug;133(8):2543-8.
Zinc deficiency induces oxidative DNA damage and increases p53 expression in human lung fibroblasts.
Ho E, Courtemanche C, Ames BN.
University of California, Berkeley, CA 94720, USA.

Poor zinc nutrition may be an important risk factor in oxidant release and the development of DNA damage and cancer. Approximately 10% of the United States population ingests <50% of the recommended daily allowance for zinc, a cofactor in proteins involved in antioxidant defenses, electron transport, DNA repair and p53 protein expression. This study examined the effects of zinc deficiency on oxidative stress, DNA damage and the expression of DNA repair enzymes in primary human lung fibroblasts by the use of DNA microarrays and functional assays. Cellular zinc was depleted by 1) growing cells in a zinc-deficient medium and 2) exposing cells to an intracellular zinc chelator, N,N,N',N'-tetrakis-(2-pyridylmethyl)ethylenediamine. Array data revealed upregulation of genes involved in oxidative stress and DNA damage/repair and downregulation of other DNA repair genes. Zinc deficiency in cells caused an increase in oxidant production (dichlorofluorescein fluorescence) and a significant induction of single-strand breaks (Comet assay) and p53 protein expression (Western blot analysis). Thus, zinc deficiency not only caused oxidative stress and DNA damage, but also compromised the cells' ability to repair this damage. Zinc adequacy appears to be necessary for maintaining DNA integrity and may be important in the prevention of DNA damage and cancer.

(3) Toxicol Lett. 1998 Dec 28;102-103:5-18.
Micronutrients prevent cancer and delay aging.
Ames BN.
University of California, Berkeley 94720-3202, USA. bnames@uclink4.berkeley.edu

Approximately 40 micronutrients are required in the human diet. Deficiency of vitamins B12, folic acid, B6, niacin, C, or E, or iron, or zinc, appears to mimic radiation in damaging DNA by causing single- and double-strand breaks, oxidative lesions, or both. The percentage of the US population that has a low intake (< 50% of the RDA) for each of these eight micronutrients ranges from 2% to > or = 20%; half of the population may be deficient in at least one of these micronutrients. Folate deficiency occurs in approximately 10% of the US population, and in a much higher percentage of the poor. Folate deficiency causes extensive incorporation of uracil into human DNA (4 million/cell), leading to chromosomal breaks. This mechanism is the likely cause of the increased cancer risk, and perhaps the cognitive defects associated with low folate intake. Some evidence, and mechanistic considerations, suggest that vitamin B12 and B6 deficiencies also cause high uracil and chromosome breaks. Micronutrient deficiency may explain, in good part, why the quarter of the population that eats the fewest fruits and vegetables (five portions a day is advised) has approximately double the cancer rate for most types of cancer when compared to the quarter with the highest intake. Eighty percent of American children and adolescents and 68% of adults do not eat five portions a day. Common micronutrient deficiencies are likely to damage DNA by the same mechanism as radiation and many chemicals, appear to be orders of magnitude more important, and should be compared for perspective. Remedying micronutrient deficiencies is likely to lead to a major improvement in health and an increase in longevity at low cost. Aging appears to be due, in good part, to the oxidants produced by mitochondria as by-products of normal metabolism. In old rats mitochondrial membrane potential, cardiolipin levels, respiratory control ratio, and overall cellular O2 consumption are lower than in young rats, and the level of oxidants (per unit O2) is higher. The level of mutagenic aldehydes from lipid peroxidation is also increased. Ambulatory activity declines markedly in old rats. Feeding old rats the normal mitochondrial metabolites acetyl carnitine and lipoic acid for a few weeks, restores mitochondrial function, lowers oxidants to the level of a young rat, and increases ambulatory activity. Thus, these two metabolites can be considered necessary for health in old age and are therefore conditional micronutrients. This restoration suggests a plausible mechanism: with age-increased oxidative damage to proteins and lipid membranes causes a deformation of structure of key enzymes, with a consequent lessening of affinity (Km) for the enzyme substrate; an increased level of the substrate restores the velocity of the reaction, and thus restores function.

(4) Med Hypotheses. 1999 May;52(5):417-22.
Toward a new definition of essential nutrients: is it now time for a third 'vitamin' paradigm?
Challem JJ.
Aloha, Oregon 97006, USA. 74543.2122@compuserve.com

The concepts of vitamin 'deficiency' diseases and the recommended dietary allowances (RDAs) have not kept pace with the growing understanding of the cellular and molecular functions of vitamins and other micronutrients. As a consequence, many researchers and clinicians rely on outdated signs and symptoms in assessing nutritional deficiencies. A new paradigm, presented here, proposes that: (1) deficiencies can be identified on biochemical and molecular levels long before they become clinically visible; (2) the definition of essential micronutrients be broadened to include some carotenoids and flavonoids, as well as various human metabolites, such as coenzyme Q10, carnitine, and alpha-lipoic acid, which are also dietary constituents; (4) individual nutritional requirements are partly fixed by genetics but also dynamically influenced by variations in the body's biochemical milieu and external stresses; and (5) the distinction between nutritional and pharmacological doses of vitamins is meaningless, since high doses of micronutrients may be required to achieve normal metabolic processes in some people.

(5) Int J Vitam Nutr Res. 2006 Nov;76(6):343-51.
Vitamin and mineral inadequacy in the French population: estimation and application for the optimization of food fortification.
Touvier M, Lioret S, Vanrullen I, Boclé JC, Boutron-Ruault MC, Berta JL, Volatier JL.
AFSSA (French Food Safety Agency), DERNS, 27-31 Avenue du Général Leclerc, 94701 Maisons-Alfort, France. m.touvier@afssa.fr

The objective was to assess the prevalence of inadequate micronutrient intake in a representative sample of the French population, which to our knowledge, had never been done before, and to use this concept to optimize efficiency and safety of food fortification. The INCA survey collected food intake data using a 7-day record, for 2373 subjects (4-92 years). The prevalence of inadequacy for calcium, magnesium, iron, vitamins C, A, B6, and B12, thiamin, riboflavin, niacin, pantothenic acid, and folate was estimated by the proportion of subjects with intake below the Estimated Average Requirement (EAR). We also calculated daily consumption of 44 food groups by age and gender. This paper shows how the combination of both data sets, i.e., inadequacy and food consumption data, allows a preliminary screening of potential food vehicles in order to optimize fortification. The prevalence of inadequacy was particularly high for the following groups: for calcium, females aged 10-19 years (73.5%) or aged 55-90 years (67.8%), and males aged 15-19 years (62.4%) or aged 65-92 years (65.4%); for magnesium, males aged 15-92 years (71.7%) and females aged 10-90 years (82.5%); for iron, females aged 15-54 years (71.1%); and for vitamin C, females aged 15-54 years (66.2%). Two examples are provided to illustrate the proposed method for the optimization of fortification.

(6) Can J Diet Pract Res. 2007 Spring;68(1):23-9.
Widespread micronutrient inadequacies among adults in Prince Edward Island.
Taylor JP, Maclellan DL, van Til L, Sweet L.
Department of Family and Nutritional Sciences, University of Prince Edward Island, Charlottetown, PE.

PURPOSE: The prevalence of micronutrient inadequacies was assessed among adult residents of Prince Edward Island (PEI) in the PEI Nutrition Survey. METHODS: A peer-reviewed protocol was used in this cross-sectional survey, in which 24-hour recalls were administered during in-home interviews. A stratified random sample of 1,995 adults aged 18 to 74 participated. Median nutrient intakes with and without supplements were calculated; intakes were adjusted for day-to-day variability. Chi-square testing was used to assess differences in prevalence of inadequacy by age and sex. RESULTS: Most of the sample (more than 90%) had folate intakes below the Estimated Average Requirement (EAR). Magnesium and vitamin C intakes were low in more than 50% of the sample. Iron intakes were adequate in all groups except women aged 19 to 50, 29% of whom had intakes below the EAR. Women were more likely than men to have inadequate intakes. Median calcium intakes fell below recommendations for all age and sex groups. Supplement use had little impact on dietary adequacy in this sample. CONCLUSIONS: This study underscores the need for public health interventions designed to reduce the very high prevalence of nutrient inadequacies in the PEI adult population. In addition, education is needed on the selection of appropriate vitamin and mineral supplements.

(7) J Clin Endocrinol Metab. 2004 Nov;89(11):5387-91.
Vitamin D2 is much less effective than vitamin D3 in humans.
Armas LA, Hollis BW, Heaney RP.
Creighton University, 601 North 30th Street, Suite 4841, Omaha, Nebraska 68131, USA.

Vitamins D(2) and D(3) are generally considered to be equivalent in humans. Nevertheless, physicians commonly report equivocal responses to seemingly large doses of the only high-dose calciferol (vitamin D(2)) available in the U.S. market. The relative potencies of vitamins D(2) and D(3) were evaluated by administering single doses of 50,000 IU of the respective calciferols to 20 healthy male volunteers, following the time course of serum vitamin D and 25-hydroxyvitamin D (25OHD) over a period of 28 d and measuring the area under the curve of the rise in 25OHD above baseline. The two calciferols produced similar rises in serum concentration of the administered vitamin, indicating equivalent absorption. Both produced similar initial rises in serum 25OHD over the first 3 d, but 25OHD continued to rise in the D(3)-treated subjects, peaking at 14 d, whereas serum 25OHD fell rapidly in the D(2)-treated subjects and was not different from baseline at 14 d. Area under the curve (AUC) to d 28 was 60.2 ng.d/ml (150.5 nmol.d/liter) for vitamin D(2) and 204.7 (511.8) for vitamin D(3) (P < 0.002). Calculated AUC(infinity) indicated an even greater differential, with the relative potencies for D(3):D(2) being 9.5:1. Vitamin D(2) potency is less than one third that of vitamin D(3). Physicians resorting to use of vitamin D(2) should be aware of its markedly lower potency and shorter duration of action relative to vitamin D(3).

(8) Clin Biochem Rev. 2003;24(1):13-26.
Vitamin D metabolism: new concepts and clinical implications.
Anderson P, May B, Morris H.
Hanson Institute, Adelaide, SA 5000 and School of Molecular and Biomedical Science, University of Adelaide, South Australia 5005.

The vitamin D endocrine system plays a primary role in the maintenance of calcium homeostasis as well as exerting a wider range of biological activities including the regulation of cellular differentiation and proliferation, immunity, and reproduction. Most of these latter activities have been demonstrated using in vitro techniques. A major issue is to place such in vitro findings into their physiological context. Vitamin D exerts its genomic effects through a nuclear gene transcription factor, the vitamin D receptor (VDR), while metabolism of vitamin D both to its biologically active form, as well as to its excretory product, plays a major role in determining biological activity at the tissue level. Considerable information has become available recently concerning the metabolism of vitamin D both in the kidney and in non-renal tissues. These data confirm the endocrine action of vitamin D through renal metabolism which provides 1,25 dihydroxyvitamin D (1,25D) to the circulation. The major organ responding to the endocrine action of 1,25D is the intestine where it controls absorption of calcium and phosphate. Preliminary information regarding the contribution of tissue-specific production of 1,25D to its paracrine/autocrine activity is now becoming available. In bone cells, these data provide evidence for the modulation of cell proliferation and stimulation of bone cell maturation. The relevance of these concepts to the clinical laboratory is discussed in the context of vitamin D insufficiency and the increased risk of hip fracture amongst the elderly.

(9) J Midwifery Womens Health. 2008 Sep-Oct;53(5):440-6.
Vitamin D and mood disorders among women: an integrative review.
Murphy PK, Wagner CL.
Medical University of South Carolina, 169 Ashley Ave., P.O. Box 250347, Charleston, SC 29425, USA. murphypa@musc.edu

This integrative review evaluates research studies that investigated the association between vitamin D and mood disorders affecting women to determine whether further research comparing these variables is warranted. A literature search using CINAHL, PsycINFO, MEDLINE, and PubMed databases was conducted to locate peer-reviewed mood disorder research studies that measured serum 25-hydroxyvitamin D (25[OH]D) levels. Four of six studies reviewed imparted significant results, with all four showing an association between low 25(OH)D levels and higher incidences of four mood disorders: premenstrual syndrome, seasonal affective disorder, non-specified mood disorder, and major depressive disorder. This review indicates a possible biochemical mechanism occurring between vitamin D and mood disorders affecting women, warranting further studies of these variables using rigorous methodologies.

(10) Am J Geriatr Psychiatry. 2006 Dec;14(12):1032-40
Vitamin D deficiency is associated with low mood and worse cognitive performance in older adults.
Wilkins CH, Sheline YI, Roe CM, Birge SJ, Morris JC.
Department of Medicine, Division of Geriatrics and Nutritional Science, Alzheimer's Disease Research Center, Washington University School of Medicine, St. Louis, MO 63108, USA. cwilkins@im.wustl.edu

BACKGROUND: Vitamin D deficiency is common in older adults and has been implicated in psychiatric and neurologic disorders. This study examined the relationship among vitamin D status, cognitive performance, mood, and physical performance in older adults. METHODS: A cross-sectional group of 80 participants, 40 with mild Alzheimer disease (AD) and 40 nondemented persons, were selected from a longitudinal study of memory and aging. Cognitive function was assessed using the Short Blessed Test (SBT), Mini-Mental State Exam (MMSE), Clinical Dementia Rating (CDR; a higher Sum of Boxes score indicates greater dementia severity), and a factor score from a neuropsychometric battery; mood was assessed using clinician's diagnosis and the depression symptoms inventory. The Physical Performance Test (PPT) was used to measure functional status. Serum 25-hydroxyvitamin D levels were measured for all participants. RESULTS: The mean vitamin D level in the total sample was 18.58 ng/mL (standard deviation: 7.59); 58% of the participants had abnormally low vitamin D levels defined as less than 20 ng/mL. After adjusting for age, race, gender, and season of vitamin D determination, vitamin D deficiency was associated with presence of an active mood disorder (odds ratio: 11.69, 95% confidence interval: 2.04-66.86; Wald chi(2) = 7.66, df = 2, p = 0.022). Using the same covariates in a linear regression model, vitamin D deficiency was associated with worse performance on the SBT (F = 5.22, df = [2, 77], p = 0.044) and higher CDR Sum of Box scores (F = 3.20, df = [2, 77], p = 0.047) in the vitamin D-deficient group. There was no difference in performance on the MMSE, PPT, or factor scores between the vitamin D groups. CONCLUSIONS: In a cross-section of older adults, vitamin D deficiency was associated with low mood and with impairment on two of four measures of cognitive performance.

(11) J Neurol. 2006 Apr;253(4):464-70. Epub 2005 Nov 14.
Neuropsychological function in relation to serum parathyroid hormone and serum 25-hydroxyvitamin D levels. The Tromsø study.
Jorde R, Waterloo K, Saleh F, Haug E, Svartberg J.
Department of Internal Medicine, Institute of Clinical Medicine, University of Tromsø, Tromsø, Norway. rolf.jorde@unn.no

There are receptors for parathyroid hormone (PTH) and 1,25-dihydroxyvitamin D in the brain, and there are clinical and experimental data indicating that PTH and vitamin D may affect cerebral function. In the present study 21 subjects who both in the 5th Tromsø study and at a follow-up examination fulfilled criteria for secondary hyperparathyroidism (SHPT) without renal failure (serum calcium < 2.40 mmol/L, serum PTH > 6.4 pmol/L, and normal serum creatinine) and 63 control subjects were compared with tests for cognitive and emotional function. Those in the SHPT group had significantly impaired performance in 3 of 14 cognitive tests (Digit span forward, Stroop test part 1 and 2, and Word association test (FAS)) as compared with the controls, and also had a significantly higher depression score at the Beck Depression Inventory (BDI) (items 1-13). In a multiple linear regression model, a high serum PTH level was significantly associated with low performance at the Digit span forward, Stroop test part 1 and 2, and Digit Symbol tests. A low level of serum 25-hydroxyvitamin D was significantly associated with a high depression score. In conclusion, a deranged calcium metabolism appears to be associated with impaired function in several tests of neuropsychological function.

(12) Br J Nutr. 2003 May;89(5):552-72.
Vitamin D in preventive medicine: are we ignoring the evidence?
Zittermann A.
Department of Nutrition Science, University of Bonn, Endenicher Allee 11-13, 53115 Bonn, Germany. a.zittermann@uni-bonn.de

Vitamin D is metabolised by a hepatic 25-hydroxylase into 25-hydroxyvitamin D (25(OH)D) and by a renal 1alpha-hydroxylase into the vitamin D hormone calcitriol. Calcitriol receptors are present in more than thirty different tissues. Apart from the kidney, several tissues also possess the enzyme 1alpha-hydroxylase, which is able to use circulating 25(OH)D as a substrate. Serum levels of 25(OH)D are the best indicator to assess vitamin D deficiency, insufficiency, hypovitaminosis, adequacy, and toxicity. European children and young adults often have circulating 25(OH)D levels in the insufficiency range during wintertime. Elderly subjects have mean 25(OH)D levels in the insufficiency range throughout the year. In institutionalized subjects 25(OH)D levels are often in the deficiency range. There is now general agreement that a low vitamin D status is involved in the pathogenesis of osteoporosis. Moreover, vitamin D insufficiency can lead to a disturbed muscle function. Epidemiological data also indicate a low vitamin D status in tuberculosis, rheumatoid arthritis, multiple sclerosis, inflammatory bowel diseases, hypertension, and specific types of cancer. Some intervention trials have demonstrated that supplementation with vitamin D or its metabolites is able: (i) to reduce blood pressure in hypertensive patients; (ii) to improve blood glucose levels in diabetics; (iii) to improve symptoms of rheumatoid arthritis and multiple sclerosis. The oral dose necessary to achieve adequate serum 25(OH)D levels is probably much higher than the current recommendations of 5-15 microg/d.

(13) Am J Prev Med. 2007 Mar;32(3):210-6.
Optimal vitamin D status for colorectal cancer prevention: a quantitative meta-analysis.
Gorham ED, Garland CF, Garland FC, Grant WB, Mohr SB, Lipkin M, Newmark HL, Giovannucci E, Wei M, Holick MF.
University of California San Diego, Department of Family and Preventive Medicine, School of Medicine, La Jolla, California, USA. gorham@nhrc.navy.mil

BACKGROUND: Previous studies, such as the Women's Health Initiative, have shown that a low dose of vitamin D did not protect against colorectal cancer, yet a meta-analysis indicates that a higher dose may reduce its incidence. METHODS: Five studies of serum 25(OH)D in association with colorectal cancer risk were identified using PubMed. The results of all five serum studies were combined using standard methods for pooled analysis. The pooled results were divided into quintiles with median 25(OH)D values of 6, 16, 22, 27, and 37 ng/mL. Odds ratios were calculated by quintile of the pooled data using Peto's Assumption-Free Method, with the lowest quintile of 25(OH)D as the reference group. A dose-response curve was plotted based on the odds for each quintile of the pooled data. Data were abstracted and analyzed in 2006. RESULTS: Odds ratios for the combined serum 25(OH)D studies, from lowest to highest quintile, were 1.00, 0.82, 0.66, 0.59, and 0.46 (p(trend)<0.0001) for colorectal cancer. According to the DerSimonian-Laird test for homogeneity of pooled data, the studies were homogeneous (chi(2)=1.09, df=4, p=0.90). The pooled odds ratio for the highest quintile versus the lowest was 0.49 (p<0.0001; 95% confidence interval, 0.35-0.68). A 50% lower risk of colorectal cancer was associated with a serum 25(OH)D level > or =33 ng/mL, compared to < or =12 ng/mL. CONCLUSIONS: The evidence to date suggests that daily intake of 1000-2000 IU/day of vitamin D(3) could reduce the incidence of colorectal cancer with minimal risk.

(14) Lung Cancer. 2007 Mar;55(3):263-70. Epub 2007 Jan 17.
Seasonal and geographical variations in lung cancer prognosis in Norway. Does Vitamin D from the sun play a role?
Porojnicu AC, Robsahm TE, Dahlback A, Berg JP, Christiani D, Bruland OS, Moan J.
Department of Radiation Biology, Institute for Cancer Research, Montebello, 0310 Oslo, Norway. a.c.porojnicu@usit.uio.no <a.c.porojnicu@usit.uio.no>

Vitamin D derivatives can modulate proliferation and differentiation of cancer cells. Our main source of Vitamin D is ultraviolet (UV) radiation-induced synthesis in skin following sun exposure. UV measurements show that the ambient annual UV exposures increase by about 50% from north to south in Norway. As judged from the incidence rates of squamous cell carcinoma, the same is true for the average personal UV exposures. Solar ultraviolet B (UVB) (280-320 nm) exhibits a strong seasonal variation with a minimum during the winter months. The present work aims at investigating the impact of season of diagnosis and residential region, both influencing the Vitamin D level, on the risk of death from lung cancer in patients diagnosed in Norway. Data on all incident cases of lung cancer between 1964 and 2000 were collected. Risk estimates were calculated as relative risk (RR), with 95% confidence intervals, using a Cox regression model. The seasonal variation of 25-hydroxyvitamin D was assessed from routine measurements of 15,616 samples performed at The Hormone Laboratory of Aker University Hospital. Our results indicate that season of diagnosis is of prognostic value for lung cancer patients, with an approximately 15% lower case fatality for young male patients diagnosed during autumn versus winter (RR=0.85; 95% CI, 0.73 to 0.99; p=0.04). Residing in a high-UV region resulted in a further lowering of the death risk compared with residing in a low-UV region. We propose, in agreement with earlier findings for prostate, breast, and colon cancer and Hodgkin's lymphoma, that a high level of sun-induced 25-hydroxyvitamin D can be a prognostic advantage for certain groups of lung cancer patients, notably for young men. Lung cancer has for several decades been the leading cause of cancer-related mortality in men in Norway and, during the last two decades, became the second most common cause of cancer-related death in women. There are two main types of lung cancer: small cell lung cancer, for which chemotherapy is the primary treatment, and non-small cell lung cancer, which in its early stages is treated primarily with surgery. Gender-related differences have been described in the literature with respect to survival after therapy, male gender being a significant independent negative prognostic factor. In Norway, the 5-year relative survival for localized tumours is about 30% for females and 20% for males. Calcitriol, which is the most active form of Vitamin D, is involved in key regulatory processes such as proliferation, differentiation and apoptosis in a wide variety of cells. Mechanisms for these actions have been proposed to be the interaction of active Vitamin D derivatives with a specific nuclear receptor (the VDR) and/or with membrane targets. In vitro studies, performed with lung cancer cell lines, have shown an inhibitive effect of Vitamin D derivatives on cell growth and proliferation. Furthermore, animal studies have demonstrated the capability of these compounds to suppress invasion, metastasis and angiogenesis in vivo, suggesting that administration of Vitamin D derivatives may be used as adjuvant therapy for lung cancer. Humans get optimal Vitamin D levels by exposure to sun or artificial ultraviolet B (UVB, 280-320 nm) sources, and possibly also by consumption of food rich in this nutrient (fatty fish, eggs, margarine, etc.) or of vitamin supplements. Among these sources, solar radiation appears to be the most important one.
Thus, the Vitamin D status (assessed by the serum levels of 25-hydroxyvitamin D, calcidiol) exhibits a strong seasonal variation that parallels the seasonal change in the fluence of solar UVB that reaches the ground. During winter, the UVB fluence rate in the Nordic countries (50-71 degrees N) is below the level required for Vitamin D synthesis in skin. The maximal level of calcidiol is reached between the months of July and September, and is 20-120% higher than the corresponding winter level. Recently we hypothesised that the seasonal variation of calcidiol might be of prognostic significance for colon, breast, and prostate cancer as well as for Hodgkin's lymphoma in Norway. Patients diagnosed during summer and autumn have a better survival after standard treatment than patients diagnosed during the winter season. This might be a consequence of a higher Vitamin D level. An American study investigated the effect of season of surgery and recent Vitamin D intake on the survival of non-small cell lung cancer patients. The authors reported a significant beneficial joint effect of summer season and high Vitamin D intake compared with winter season and low Vitamin D intake, while Vitamin D intake alone did not affect prognosis. Similar results were recently reported from a large study in the United Kingdom involving over a million cancer patients, including over 190,000 patients diagnosed with lung cancer. Norway (58-71 degrees N) has a significant north-south variation in UV fluence. This makes the country suitable for studies relating cancer epidemiology to UV levels. We investigated whether variations in UV, and, consequently, in Vitamin D level, influence the prognosis of lung cancer, using season of diagnosis and residential region as variables. Survival data obtained for patients diagnosed over a 40-year period were compared with variations in serum Vitamin D levels obtained from routine measurements performed in The Hormone Laboratory of Aker University Hospital during the period 1996-2001. Seasonal and gender variations in Vitamin D level have been estimated from the analyses.

(15) Anticancer Res. 2006 Jul-Aug;26(4A):2567-72.
Vitamin D metabolism in human prostate cells: implications for prostate cancer chemoprevention by vitamin D.
Flanagan JN, Young MV, Persons KS, Wang L, Mathieu JS, Whitlatch LW, Holick MF, Chen TC.
Department of Medicine, Section of Endocrinology, Diabetes and Nutrition, Boston University School of Medicine, Boston, MA 02118, USA.

BACKGROUND: Prostate cells can produce 1alpha,25-dihydroxyvitamin D3 (1alpha,25(OH)2D3) from 25-hydroxyvitamin D3 (25(OH)D3) to regulate their own growth. Here, the questions of whether prostate cells express vitamin D-25-hydroxylase (25-OHase) and can convert vitamin D3 to 1alpha,25(OH)2D3 were investigated. MATERIALS AND METHODS: Protein and receptor binding assays were used to determine 25(OH)D3 and 1alpha,25(OH)2D3, respectively. Measurements of proliferation by 3H-thymidine incorporation, and 1alpha,25(OH)2D-responsive gene expression by real-time qPCR and by Western blot were used as functional assays for the presence of 25-OHase activity. RESULTS: Prostate cells metabolized vitamin D3 to 1alpha,25(OH)2D3. Vitamin D3 up-regulated 25(OH)D-24R-hydroxylase and IGFBP3, two 1alpha,25(OH)2D-responsive genes, in prostate cells. CYP2R1 was the major form of 25-OHase expressed in normal and cancerous prostate cells as determined by qPCR. CONCLUSION: The autocrine synthesis of 1alpha,25(OH)2D3 from vitamin D3 suggests that maintaining adequate levels of serum vitamin D could be a safe and effective chemo-preventive measure to decrease the risk of prostate cancer.

(16) Am J Clin Nutr. 2008 Jun;87(6):1952-8.
Vitamin D intake to attain a desired serum 25-hydroxyvitamin D concentration.
Aloia JF, Patel M, Dimaano R, Li-Ng M, Talwar SA, Mikhail M, Pollack S, Yeh JK.
Bone Mineral Research Center, Winthrop University Hospital, Mineola, NY, USA.

BACKGROUND: Indirect evidence suggests that optimal vitamin D status is achieved with a serum 25-hydroxyvitamin D [25(OH)D] concentration >75 nmol/L. OBJECTIVE: We aimed to determine the intake of vitamin D(3) needed to raise serum 25(OH)D to >75 nmol/L. DESIGN: The design was a 6-mo, prospective, randomized, double-blinded, double-dummy, placebo-controlled study of vitamin D(3) supplementation. Serum 25(OH)D was measured by radioimmunoassay. Vitamin D(3) intake was adjusted every 2 mo by use of an algorithm based on serum 25(OH)D concentration. RESULTS: A total of 138 subjects entered the study. After 2 dose adjustments, almost all active subjects attained concentrations of 25(OH)D >75 nmol/L, and no subjects exceeded 220 nmol/L. The mean (+/-SD) slope at 9 wk [defined as 25(OH)D change/baseline dose] was 0.66 +/- 0.35 (nmol/L)/(microg/d) and did not differ statistically between blacks and whites. The mean daily dose was 86 microg (3440 IU). The use of computer simulations to obtain the most participants within the range of 75-220 nmol/L predicted an optimal daily dose of 115 microg/d (4600 IU). No hypercalcemia or hypercalciuria was observed. CONCLUSIONS: Determination of the intake required to attain serum 25(OH)D concentrations >75 nmol/L must consider the wide variability in the dose-response curve and basal 25(OH)D concentrations. Projection of the dose-response curves observed in this convenience sample onto the population of the third National Health and Nutrition Examination Survey suggests a dose of 95 microg/d (3800 IU) for those above a 25(OH)D threshold of 55 nmol/L and a dose of 125 microg/d (5000 IU) for those below that threshold.
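
To see what that dose-response slope means in practice, here is a short Python sketch of my own; it is not the study's actual adjustment algorithm. It uses the reported mean slope of 0.66 (nmol/L) per (microg/day) to estimate the extra daily vitamin D3 needed to lift an assumed baseline 25(OH)D to the 75 nmol/L floor of the study's target window.

# Illustration of the reported dose-response; not the study's algorithm.
SLOPE = 0.66    # (nmol/L) per (microg/day), mean slope reported at 9 wk
TARGET = 75.0   # nmol/L, lower bound of the 75-220 nmol/L target window

def extra_dose_needed(baseline_nmol_l):
    """Estimate additional vitamin D3 (microg/day) to reach the target.

    Assumes a linear response at the mean slope; the paper stresses that
    individual slopes vary widely (SD 0.35), so treat this as a central
    estimate, not a prescription.
    """
    gap = max(0.0, TARGET - baseline_nmol_l)
    return gap / SLOPE

for baseline in (30, 50, 70):    # assumed baseline concentrations, nmol/L
    dose = extra_dose_needed(baseline)
    print("baseline %d nmol/L -> ~%.0f microg/day (~%.0f IU/day)"
          % (baseline, dose, dose * 40))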

(17) J Steroid Biochem Mol Biol. 2004 May;89-90(1-5):575-9.
Why the optimal requirement for Vitamin D3 is probably much higher than what is officially recommended for adults.
Vieth R.
Department of Laboratory Medicine and Pathobiology, University of Toronto, and Pathology and Laboratory Medicine, Mount Sinai Hospital, Toronto, Canada M5G 1X5. rvieth@mtsinai.on.ca

The physiologic range for circulating 25-hydroxyvitamin D3 [25(OH)D; the measure of Vitamin D nutrient status] concentration in humans and other primates extends to beyond 200 nmol/L (>80 ng/mL). This biologic "normal" value is greater than current population norms for 25(OH)D. Concentrations of 25(OH)D that correlate with desirable effects extend to at least 70 nmol/L, with no obvious threshold. Randomized clinical trials using 20 mcg (800 IU) per day of Vitamin D show that this suppresses parathyroid hormone, preserves bone mineral density, prevents fractures, lowers blood pressure and improves balance. Calcium absorption from diet correlates with 25(OH)D in the normal range. Health effects of Vitamin D beyond osteoporosis are mostly supported by the circumstantial evidence of epidemiologic studies and laboratory research. These include prevention of cancer and the autoimmune diseases, insulin-dependent diabetes and multiple sclerosis. One mcg per day of Vitamin D(3) (cholecalciferol) increases circulating 25(OH)D by about 1 nmol/L (0.4 ng/mL). A recommended dietary allowance (RDA) is the long-term daily intake level that meets the total requirements for the nutrient by nearly all healthy individuals (it would presume no sunshine). If 70 nmol/L is regarded as a minimum desirable target 25(OH)D concentration, then current recommendations of 15 mcg per day do not meet the criterion of an RDA.
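
Vieth's rule of thumb makes it easy to check his conclusion yourself. The worked example below assumes a sun-deprived winter baseline of 40 nmol/L (my assumption, not from the abstract) and his stated increment of about 1 nmol/L per microg/day:

# Worked check of Vieth's argument, using his ~1 (nmol/L)/(microg/day) rule.
INCREMENT = 1.0   # nmol/L rise per microg/day of vitamin D3 (from the abstract)
RDA_UG = 15.0     # microg/day, the current recommendation he questions
TARGET = 70.0     # nmol/L, the minimum desirable 25(OH)D he cites
baseline = 40.0   # nmol/L -- an assumed sun-deprived winter baseline

achieved = baseline + RDA_UG * INCREMENT
verdict = "meets" if achieved >= TARGET else "falls short of"
print("15 microg/day from %.0f nmol/L -> ~%.0f nmol/L, which %s the %.0f nmol/L target"
      % (baseline, achieved, verdict, TARGET))
print("dose needed to reach the target: ~%.0f microg/day"
      % ((TARGET - baseline) / INCREMENT))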

(18) J Bone Miner Res. 2007 Dec;22 Suppl 2:V64-8.
Vitamin D toxicity, policy, and science.
Vieth R.
Departments of Nutritional Sciences, and Laboratory Medicine and Pathobiology, University of Toronto, Toronto, Canada.

The serum 25-hydroxyvitamin D [25(OH)D] concentration that is the threshold for vitamin D toxicity has not been established. Hypercalcemia is the hazard criterion for vitamin D. Past policy of the Institute of Medicine has set the tolerable upper intake level (UL) for vitamin D at 50 microg (2000 IU)/d, defining this as "the highest level of daily nutrient intake that is likely to pose no risks of adverse health effects to almost all individuals in the general population." However, because sunshine can provide an adult with vitamin D in an amount equivalent to daily oral consumption of 250 microg (10,000 IU)/d, this is intuitively a safe dose. The incremental consumption of 1 microg (40 IU)/day of vitamin D(3) raises serum 25(OH)D by approximately 1 nM (0.4 ng/ml). Therefore, if sun-deprived adults are to maintain serum 25(OH)D concentrations >75 nM (30 ng/ml), they will require an intake of more than the UL for vitamin D. The mechanisms that limit vitamin D safety are the capacity of circulating vitamin D-binding protein and the ability to suppress 25(OH)D-1-alpha-hydroxylase. Vitamin D causes hypercalcemia when the "free" concentration of 1,25-dihydroxyvitamin D is inappropriately high. This displacement of 1,25(OH)(2)D becomes excessive as plasma 25(OH)D concentrations become higher than at least 600 nM (240 ng/ml). Plasma concentrations of unmetabolized vitamin D during the first days after an acute, large dose of vitamin D can reach the micromolar range and cause acute symptoms. The clinical trial evidence shows that a prolonged intake of 250 microg (10,000 IU)/d of vitamin D(3) is likely to pose no risk of adverse effects in almost all individuals in the general population; this meets the criteria for a tolerable upper intake level.
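
One stumbling block in all of these abstracts is the constant switching between microg and IU for intake, and between nmol/L and ng/mL for serum 25(OH)D. The conversion factors themselves are standard (1 microg = 40 IU; 1 ng/mL = 2.5 nmol/L); the Python helpers below just encode them and restate the doses from this abstract in both unit systems.

# Standard unit conversions used throughout these abstracts.
def ug_to_iu(ug):
    """Vitamin D dose: 1 microgram = 40 IU."""
    return ug * 40.0

def ng_ml_to_nmol_l(ng_ml):
    """Serum 25(OH)D: 1 ng/mL = 2.5 nmol/L."""
    return ng_ml * 2.5

for label, ug in [("IOM tolerable upper intake level (UL)", 50),
                  ("sun-equivalent dose Vieth argues is safe", 250)]:
    print("%s: %d microg/day = %.0f IU/day" % (label, ug, ug_to_iu(ug)))

print("30 ng/mL = %.0f nmol/L (the >75 nmol/L threshold above)"
      % ng_ml_to_nmol_l(30))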

(19) Ann Clin Biochem. 2008 Jan;45(Pt 1):106-10.
Self-prescribed high-dose vitamin D3: effects on biochemical parameters in two men.
Kimball S, Vieth R.
Department of Nutritional Sciences, University of Toronto, Toronto, Canada. samantha.kimball@utoronto.ca

The lowest observed adverse effect level for vitamin D, said to cause hypercalcaemia in normal adults, is officially 95 microg/day. Serum 25-hydroxyvitamin D (25[OH]D) concentrations associated with hypervitaminosis D remain undefined. Reported 25(OH)D concentrations resulting from prolonged excessive vitamin D3 intakes have exceeded 700 nmol/L. We report self-prescribed high-dose vitamin D3 taken over 5-6 years by two men. Subject 1 had been taking 100 microg/day for 3 years, followed by 3 years of 200 microg/day. Serum 25(OH)D concentrations averaged 130 nmol/L while he was taking 100 microg/day of vitamin D3. While he was taking 200 microg/day, mean serum 25(OH)D concentrations were 260 nmol/L, with no hypercalcaemia or hypercalciuria over the 6 years of vitamin D3 intake. Subject 2 was a 39-year-old man diagnosed with multiple sclerosis. He initiated his own dose-escalation schedule: his vitamin D3 intake increased from 200 to 2200 microg/day over 4 years. The first evidence of a potential adverse effect was an increasing trend in urinary calcium:creatinine ratios, which preceded serum calcium concentrations above the reference range (2.2-2.6 mmol/L). His serum 25(OH)D concentration was 1126 nmol/L when total serum calcium reached 2.63 mmol/L. He stopped vitamin D3 supplementation at that point. Two months later, all biochemistry values were within reference ranges; serum 25(OH)D concentrations had fallen by about one half, to 656 nmol/L. These results help to clarify the human response to higher intakes of vitamin D3. Close monitoring of biochemical responses confirmed that an increase in the urinary calcium:creatinine ratio precedes hypercalcaemia as serum 25(OH)D concentrations rise.
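
The practical lesson of this case report, that a rising urinary calcium:creatinine ratio shows up before frank hypercalcaemia, can be phrased as a simple monitoring rule. The Python sketch below is my own illustration of that idea, not the authors' protocol: the serum calcium reference limit (2.6 mmol/L) comes from the abstract, but the "three consecutive rises" trend test is an assumption.

# Illustrative monitoring rule; not the authors' protocol.
CALCIUM_UPPER = 2.6    # mmol/L, upper reference limit quoted in the abstract

def review_labs(ca_cr_ratios, serum_calcium):
    """Flag warning signs from serial lab values.

    ca_cr_ratios: chronological urinary calcium:creatinine ratios.
    serum_calcium: most recent total serum calcium, mmol/L.
    """
    if serum_calcium > CALCIUM_UPPER:
        return "hypercalcaemia: stop supplementation and seek medical review"
    # Assumed rule: three consecutive rises in Ca:Cr is an early warning.
    recent = ca_cr_ratios[-4:]
    if len(recent) == 4 and all(b > a for a, b in zip(recent, recent[1:])):
        return "urinary Ca:Cr trending upward: early warning, re-test soon"
    return "no warning signs in these values"

print(review_labs([0.30, 0.34, 0.39, 0.45], 2.55))  # rising trend, Ca in range
print(review_labs([0.30, 0.31, 0.30, 0.32], 2.63))  # Ca above reference range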


 
