id | question | answer |
---|---|---|
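The rows below are pipe-delimited id/question/answer records, with the literal string `null` standing in for an absent query text. As a rough illustration (the delimiter handling and the `null` sentinel are assumptions read off the rows themselves, not a documented format), one such row could be parsed like this:

```python
# Minimal sketch: read one pipe-delimited "id | question | answer" row
# back into a record. Assumptions (taken from the visible rows, not a
# spec): an optional trailing "|", and "null" marking an empty question.
def parse_row(line: str) -> dict:
    """Split one table row into an {id, question, answer} record."""
    # maxsplit=2 keeps any "|" inside the answer text in one piece
    cells = [c.strip() for c in line.strip().strip("|").split("|", 2)]
    doc_id, question, answer = cells
    return {
        "id": doc_id,
        "question": None if question == "null" else question,
        "answer": answer,
    }

record = parse_row("nfcorpus-queries-PLAIN-2061 | null | seafood |")
# record["question"] is None; the "null" sentinel is mapped to Python None
```

The `maxsplit=2` guard matters only if an answer cell ever contains a literal pipe; for the abstracts shown here a plain split would also work.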
nfcorpus-corpus-MED-3450 | null | Exercise-induced oxidative stress: myths, realities and physiological relevance.
Although assays for the most popular markers of exercise-induced oxidative stress may experience methodological flaws, there is sufficient credible evidence to suggest that exercise is accompanied by an increased generation of free radicals, resulting in a measurable degree of oxidative modifications to various molecules. However, the mechanisms responsible are unclear. A common assumption that increased mitochondrial oxygen consumption leads per se to increased reactive oxygen species (ROS) production is not supported by in vitro and in vivo data. The specific contributions of other systems (xanthine oxidase, inflammation, haem protein auto-oxidation) are poorly characterised. It has been demonstrated that ROS have the capacity to contribute to the development of muscle fatigue in situ, but there is still a lack of convincing direct evidence that ROS impair exercise performance in vivo in humans. It remains unclear whether exercise-induced oxidative modifications have little significance, induce harmful oxidative damage, or are an integral part of redox regulation. It is clear that ROS play important roles in numerous physiological processes at rest; however, the detailed physiological functions of ROS in exercise remain to be elucidated. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3451 | null | Could a vegetarian diet reduce exercise-induced oxidative stress? A review of the literature.
Oxidative stress is a natural physiological process that describes an imbalance between free radical production and the ability of the antioxidant defence system of the body to neutralize free radicals. Free radicals can be beneficial as they may promote wound healing and contribute to a healthy immune response. However, free radicals can have a detrimental impact when they interfere with the regulation of apoptosis and thus play a role in the promotion of some cancers and conditions such as cardiovascular disease. Antioxidants are molecules that reduce the damage associated with oxidative stress by counteracting free radicals. Regular exercise is a vital component of a healthy lifestyle, although it can increase oxidative stress. As a typical vegetarian diet comprises a wide range of antioxidant-rich foods, it is plausible that the consumption of these foods will result in an enhanced antioxidant system capable of reducing exercise-induced oxidative stress. In addition, a relationship between a vegetarian diet and lower risks of cardiovascular disease and some cancers has been established. This review explores the current available evidence linking exercise, vegetarians, antioxidants, and oxidative stress. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3452 | null | Fostering antioxidant defences: up-regulation of antioxidant genes or antioxidant supplementation?
Vitamins have traditionally been considered as food components that are required in the normal diet to prevent deficiencies. However, a newer concept of the function of vitamins in nutrition has taken them beyond simply prevention of deficiency symptoms. This concept considers that many vitamins, when taken in relatively large doses, have important functions beyond preventing deficiencies. Linus Pauling was instrumental in putting forward this concept, particularly for vitamin C. Thus, relatively high intakes of vitamins, and in particular vitamins C and E which are antioxidants, are considered to be healthy for the human population. This may be true in some special situations such as, for instance, the prevention of Alzheimer's disease progression. However, recent epidemiological evidence has not supported the claim that antioxidant vitamins increase well-being and prolong life span. In fact, vitamin supplementation may even be detrimental and reduce life span. A new concept that we would like to put forward is that nutrients up-regulate the endogenous antioxidant defences. This is particularly true in the case of phytoestrogens, for example, which bind to oestrogen receptors and eventually up-regulate the expression of antioxidant genes. In this review we discuss the pros and cons of antioxidant vitamin supplementation and also the possibility that the ingestion of some nutrients may be very effective in increasing antioxidant defences by up-regulating the activity of antioxidant enzymes which are normally present in the cell. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3453 | null | Supplementation with vitamin C and N-acetyl-cysteine increases oxidative stress in humans after an acute muscle injury induced by eccentric exercise.
There has been no investigation to determine if the widely used over-the-counter, water-soluble antioxidants vitamin C and N-acetyl-cysteine (NAC) could act as pro-oxidants in humans during inflammatory conditions. We induced an acute-phase inflammatory response by an eccentric arm muscle injury. The inflammation was characterized by edema, swelling, pain, and increases in plasma inflammatory indicators, myeloperoxidase and interleukin-6. Immediately following the injury, subjects consumed a placebo or vitamin C (12.5 mg/kg body weight) and NAC (10 mg/kg body weight) for 7 d. The resulting muscle injury caused increased levels of serum bleomycin-detectable iron and the amount of iron was higher in the vitamin C and NAC group. The concentrations of lactate dehydrogenase (LDH), creatine kinase (CK), and myoglobin were significantly elevated 2, 3, and 4 d postinjury and returned to baseline levels by day 7. In addition, LDH and CK activities were elevated to a greater extent in the vitamin C and NAC group. Levels of markers for oxidative stress (lipid hydroperoxides and 8-iso prostaglandin F2alpha; 8-Iso-PGF2alpha) and antioxidant enzyme activities were also elevated post-injury. The subjects receiving vitamin C and NAC had higher levels of lipid hydroperoxides and 8-Iso-PGF2alpha 2 d after the exercise. This acute human inflammatory model strongly suggests that vitamin C and NAC supplementation immediately post-injury, transiently increases tissue damage and oxidative stress. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3454 | null | Endurance exercise results in DNA damage as detected by the comet assay.
To determine if 6 weeks of supplementation with antioxidants could alleviate exercise-induced DNA damage, we studied 21 runners during a 50 km ultramarathon. Subjects were randomly assigned to one of two groups: (1) placebos (PL) or (2) antioxidants (AO) (1000 mg vitamin C and 400 IU RRR-alpha-tocopheryl acetate). The comet assay was used to assess DNA damage in circulating leukocytes at selected time points: pre-, mid-, and 2 h postrace and daily for 6 days postrace. All subjects completed the race: run time 7.1 +/- 0.1 h, energy expenditure 5008 +/- 80 kcal for women (n = 10) and 6932 +/- 206 kcal for men (n = 11). Overall, the percentage DNA damage increased at midrace (p <.02), but returned to baseline by 2 h postrace, indicating that the exercise bout induced nonpersistent DNA damage. There was a gender x treatment x time interaction (p <.01). One day postrace, women taking AO had 62% less DNA damage than women taking PL (p <.0008). In contrast, there were no statistically significant differences between the two treatment groups of men at any time point. Thus, endurance exercise resulted in DNA damage as shown by the comet assay and AO seemed to enhance recovery in women but not in men. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3455 | null | Exercise-induced lipid peroxidation: Implications for deoxyribonucleic acid damage and systemic free radical generation.
Exercise-induced deoxyribonucleic acid (DNA) damage is often associated with an increase in free radicals; however, there is a lack of evidence examining the two in parallel. This study tested the hypothesis that high-intensity exercise has the ability to produce free radicals that may be capable of causing DNA damage. Twelve apparently healthy male subjects (age: 23 ± 4 years; stature: 181 ± 8 cm; body mass: 80 ± 9 kg; and VO(2max): 49 ± 5 ml/kg/min) performed three 5 min consecutive and incremental stages (40, 70, and 100% of VO(2max)) of aerobic exercise with a 15-min period separating each stage. Blood was drawn after each bout of exercise for the determination of ex vivo free radicals, DNA damage, protein carbonyls, lipid hydroperoxide (LOOH) concentration, and a range of lipid-soluble antioxidants. Lipid-derived oxygen-centered free radicals (hyperfine coupling constants a(Nitrogen) = 13.7 Gauss (G) and aβ(Hydrogen) = 1.8 G) increased as a result of acute moderate and high-intensity exercise (P < 0.05), while DNA damage was also increased (P < 0.05). Systemic changes were observed in LOOH and for lipid-soluble antioxidants throughout exercise (P < 0.05); however, there was no observed change in protein carbonyl concentration (P > 0.05). These findings identify lipid-derived free radical species as possible contributors to peripheral mononuclear cell DNA damage in the human exercising model. This damage occurs in the presence of lipid oxidation but in the absence of any change to protein carbonyl concentration. The significance of these findings may have relevance in terms of immune function, the aging process, and the pathology of carcinogenesis. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3456 | null | Acute and chronic watercress supplementation attenuates exercise-induced peripheral mononuclear cell DNA damage and lipid peroxidation.
Pharmacological antioxidant vitamins have previously been investigated for a prophylactic effect against exercise-induced oxidative stress. However, large doses are often required and may lead to a state of pro-oxidation and oxidative damage. Watercress contains an array of nutritional compounds such as β-carotene and α-tocopherol which may increase protection against exercise-induced oxidative stress. The present randomised controlled investigation was designed to test the hypothesis that acute (consumption 2 h before exercise) and chronic (8 weeks consumption) watercress supplementation can attenuate exercise-induced oxidative stress. A total of ten apparently healthy male subjects (age 23 (SD 4) years, stature 179 (SD 10) cm and body mass 74 (SD 15) kg) were recruited to complete the 8-week chronic watercress intervention period (and then 8 weeks of control, with no ingestion) of the experiment before crossing over in order to complete the single-dose acute phase (with control, no ingestion). Blood samples were taken at baseline (pre-supplementation), at rest (pre-exercise) and following exercise. Each subject completed an incremental exercise test to volitional exhaustion following chronic and acute watercress supplementation or control. The main findings show an exercise-induced increase in DNA damage and lipid peroxidation over both acute and chronic control supplementation phases (P< 0.05 v. supplementation), while acute and chronic watercress attenuated DNA damage and lipid peroxidation and decreased H₂O₂ accumulation following exhaustive exercise (P< 0.05 v. control). A marked increase in the main lipid-soluble antioxidants (α-tocopherol, γ-tocopherol and xanthophyll) was observed following watercress supplementation (P< 0.05 v. control) in both experimental phases. These findings suggest that short- and long-term watercress ingestion has potential antioxidant effects against exercise-induced DNA damage and lipid peroxidation. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3457 | null | Oxidative DNA damage in human peripheral leukocytes induced by massive aerobic exercise.
Reactive oxygen species produced during vigorous exercise may permeate into cell nuclei and induce oxidative DNA damage, but the supporting evidence is still lacking. By using a 42 km marathon race as a model of massive aerobic exercise, we demonstrated a significant degree of unrepaired DNA base oxidation in peripheral immunocompetent cells, despite a concurrent increase in the urinary excretion of 8-hydroxy-2'-deoxyguanosine. Single cell gel electrophoresis with the incorporation of lesion-specific endonucleases further revealed that oxidized pyrimidines (endonuclease III-sensitive sites) contributed to most of the postexercise nucleotide oxidation. The oxidative DNA damage correlated significantly with plasma levels of creatine kinase and lipid peroxidation metabolites, and lasted for more than 1 week following the race. This phenomenon may be one of the mechanisms behind the immune dysfunctions after exhaustive exercise. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3458 | null | Does physical activity induce DNA damage?
The single cell gel electrophoresis (SCG) assay (comet assay) is a sensitive technique for detecting the presence of DNA strand-breaks and alkali-labile damage in individual cells. This technique was used to study peripheral blood cells from three volunteers after physical activity. The test subjects had to run on a treadmill and were checked for blood pressure and ECG, lactate concentration and creatine kinase activity. Blood was taken before and several times during and after the run. In a first multiple step test, the volunteers ran as long as possible with increasing speed. In a second test they had to run for 45 min with a fixed individual speed which was defined to ensure an aerobic metabolism. In the first test, the white blood cells of all subjects showed increased DNA migration in the SCG assay. The effect was seen 6 h after the end of the exercise and reached its maximum 24 h later. After 72 h, DNA migration decreased to about control level. The distribution of DNA migration among cells clearly demonstrated that the majority of white blood cells exhibited increased DNA migration and that the effect was not only due to a small fraction of damaged cells. From the same blood samples, blood cultures were set up to study a possible effect on the frequency of sister chromatid exchanges (SCE), another indicator for genotoxic effects. However, there was no significant increase in SCE in any of the cultures. In the second exercise, during aerobic metabolism, the effect on DNA migration was not seen. (ABSTRACT TRUNCATED AT 250 WORDS) |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3585 | null | The spermicidal potency of Coca-Cola and Pepsi-Cola.
The inhibitory effect of Old Coke, caffeine-free New Coke, New Coke, Diet Coke and Pepsi-Cola on human sperm motility was studied with a trans-membrane migration method. None of them could decrease sperm motility to less than 70% of control within one hour. A previous study which claimed a marked variation of spermicidal potencies among different formulations of Coca-Cola could not be confirmed. Even if cola has a spermicidal effect, its potency is relatively weak as compared with other well-known spermicidal agents. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3586 | null | Dietary fat and semen quality among men attending a fertility clinic
BACKGROUND The objective of this study was to examine the relation between dietary fats and semen quality parameters. METHODS Data from 99 men with complete dietary and semen quality data were analyzed. Fatty acid levels in sperm and seminal plasma were measured using gas chromatography in a subgroup of men (n = 23). Linear regression was used to determine associations while adjusting for potential confounders. RESULTS Men were primarily Caucasian (89%) with a mean (SD) age of 36.4 (5.3) years; 71% were overweight or obese; and 67% were never smokers. Higher total fat intake was negatively related to total sperm count and concentration. Men in the highest third of total fat intake had 43% (95% confidence interval (CI): 62–14%) lower total sperm count and 38% (95% CI: 58–10%) lower sperm concentration than men in the lowest third (Ptrend = 0.01). This association was driven by intake of saturated fats. Levels of saturated fatty acids in sperm were also negatively related to sperm concentration (r= −0.53), but saturated fat intake was unrelated to sperm levels (r = 0.09). Higher intake of omega-3 polyunsaturated fats was related to a more favorable sperm morphology. Men in the highest third of omega-3 fatty acids had 1.9% (0.4–3.5%) higher normal morphology than men in the lowest third (Ptrend = 0.02). CONCLUSIONS In this preliminary cross-sectional study, high intake of saturated fats was negatively related to sperm concentration whereas higher intake of omega-3 fats was positively related to sperm morphology. Further studies with larger samples are now required to confirm these findings. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3587 | null | The question of declining sperm density revisited: an analysis of 101 studies published 1934-1996.
In 1992 Carlsen et al. reported a significant global decline in sperm density between 1938 and 1990 [Evidence for Decreasing Quality of Semen during Last 50 Years. Br Med J 305:609-613 (1992)]. We subsequently published a reanalysis of the studies included by Carlsen et al. [Swan et al. Have Sperm Densities Declined? A Reanalysis of Global Trend Data. Environ Health Perspect 105:1228-1232 (1997)]. In that analysis we found significant declines in sperm density in the United States and Europe/Australia after controlling for abstinence time, age, percent of men with proven fertility, and specimen collection method. The declines in sperm density in the United States (approximately 1.5%/year) and Europe/Australia (approximately 3%/year) were somewhat greater than the average decline reported by Carlsen et al. (approximately 1%/year). However, we found no decline in sperm density in non-Western countries, for which data were very limited. In the current study, we used similar methods to analyze an expanded set of studies. We added 47 English language studies published in 1934-1996 to those we had analyzed previously. The average decline in sperm count was virtually unchanged from that reported previously by Carlsen et al. (slope = -0.94 vs. -0.93). The slopes in the three geographic groupings were also similar to those we reported earlier. In North America, the slope was somewhat less than the slope we had found for the United States (slope = -0.80; 95% confidence interval (CI), -1.37--0.24). Similarly, the decline in Europe (slope = -2.35; CI, -3.66--1.05) was somewhat less than reported previously. As before, studies from other countries showed no trend (slope = -0.21; CI, -2.30-1.88). These results are consistent with those of Carlsen et al. and our previous results, suggesting that the reported trends are not dependent on the particular studies included by Carlsen et al. and that the observed trends previously reported for 1938-1990 are also seen in data from 1934-1996. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3588 | null | Caffeinated and alcoholic beverage intake in relation to ovulatory disorder infertility
Background Many studies have examined whether caffeine, alcohol, or specific beverages containing these affect fertility in women. However most of these studies have retrospectively collected information on alcohol and caffeine intake, making the results susceptible to biases. Methods We followed 18,555 married women without a history of infertility for 8 years as they attempted to become (or became) pregnant. Diet was measured twice during this period and prospectively related to the incidence of ovulatory disorder infertility. Results There were 438 incident reports of ovulatory disorder infertility during follow-up. Intakes of alcohol and caffeine were unrelated to the risk of ovulatory disorder infertility. The multivariate-adjusted relative risk (RR), 95% confidence interval (CI), and P for trend comparing the highest to lowest categories of intake were 1.11 (0.76–1.64; 0.78) for alcohol and 0.86 (0.61–1.20; 0.44) for total caffeine. However, intake of caffeinated soft drinks was positively related to ovulatory disorder infertility. The multivariate-adjusted RR, 95% CI, and P for trend comparing the highest to lowest categories of caffeinated soft drink consumption were 1.47 (1.09–1.98; 0.01). Similar associations were observed for noncaffeinated, sugared, diet and total soft drinks. Conclusions Our findings do not support the hypothesis that alcohol and caffeine impair ovulation to the point of decreasing fertility. The association between soft drinks and ovulatory disorder infertility appears not to be attributable to their caffeine or sugar content, and deserves further investigation. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3589 | null | Food intake and its relationship with semen quality: a case-control study.
OBJECTIVE: To compare dietary habits in normospermic and oligoasthenoteratospermic patients attending a reproductive assisted clinic. DESIGN: An observational, analytical case-control study. SETTING: Private fertility clinics. PATIENT(S): Thirty men with poor semen quality (cases) and 31 normospermic control couples attending our fertility clinics. INTERVENTION(S): We recorded dietary habits and food consumption using a food frequency questionnaire adapted to meet specific study objectives. Analysis of semen parameters, hormone levels, Y microdeletions, and karyotypes were also carried out. MAIN OUTCOME MEASURE(S): Frequency of intake of food items was registered on a scale with nine categories ranging from no consumption to repeated daily consumption. RESULT(S): Controls had a higher intake of skimmed milk, shellfish, tomatoes, and lettuce, and cases consumed more yogurt, meat products, and potatoes. In the logistic regression model cases had lower intake of lettuce and tomatoes, fruits (apricots and peaches), and significantly higher intake of dairy and meat processed products. CONCLUSION(S): Frequent intake of lipophilic foods like meat products or milk may negatively affect semen quality in humans, whereas some fruits or vegetables may maintain or improve semen quality. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3590 | null | Male reproductive organs are at risk from environmental hazards
Male reproductive disorders that are of interest from an environmental point of view include sexual dysfunction, infertility, cryptorchidism, hypospadias and testicular cancer. Several reports suggest declining sperm counts and increases in these reproductive disorders in some areas during some time periods over the past 50 years. Except for testicular cancer this evidence is circumstantial and needs cautious interpretation. However, the male germ line is one of the most sensitive tissues to the damaging effects of ionizing radiation, radiant heat and a number of known toxicants. So far occupational hazards are the best documented risk factors for impaired male reproductive function and include physical exposures (radiant heat, ionizing radiation, high frequency electromagnetic radiation), chemical exposures (some solvents such as carbon disulfide and ethylene glycol ethers, some pesticides such as dibromochloropropane, ethylendibromide and DDT/DDE, some heavy metals such as inorganic lead and mercury) and work processes such as metal welding. Improved working conditions in affluent countries have dramatically decreased known hazardous workplace exposures, but millions of workers in less affluent countries are at risk from reproductive toxicants. New data show that environmental low-level exposure to biopersistent pollutants in the diet may pose a risk to people in all parts of the world. For other toxicants the evidence is only suggestive and further evaluation is needed before conclusions can be drawn. Whether compounds such as phthalates, bisphenol A and boron that are present in a large number of industrial and consumer products entail a risk remains to be established. The same applies to psychosocial stressors and use of mobile phones. Finally, there are data indicating a particular vulnerability of the fetal testis to toxicants, for instance maternal tobacco smoking. The time has come when male reproductive toxicity should be addressed from entirely new angles, including exposures very early in life. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3591 | null | Perinatal Exposure to Low Doses of Dioxin Can Permanently Impair Human Semen Quality
Background In recent decades, young men in some industrialized areas have reportedly experienced a decrease in semen quality. Objective We examined effects of perinatal dioxin exposure on sperm quality and reproductive hormones. Methods We investigated sperm quality and hormone concentrations in 39 sons (mean age, 22.5 years) born between 1977 and 1984 to mothers exposed to dioxin after the accident in Seveso, Italy (1976), and 58 comparisons (mean age, 24.6 years) born to mothers exposed only to background dioxin. Maternal dioxin levels at conception were extrapolated from the concentrations measured in 1976 serum samples. Results The 21 breast-fed sons whose exposed mothers had a median serum dioxin concentration as low as 19 ppt at conception had lower sperm concentration (36.3 vs. 86.3 million/mL; p = 0.002), total count (116.9 vs. 231.1; p = 0.02), progressive motility (35.8 vs. 44.2%; p = 0.03), and total motile count (38.7 vs. 98 million; p = 0.01) than did the 36 breast-fed comparisons. The 18 formula-fed exposed and the 22 formula-fed and 36 breast-fed comparisons (maternal dioxin background 10 ppt at conception) had no sperm-related differences. Follicle-stimulating hormone was higher in the breast-fed exposed group than in the breast-fed comparisons (4.1 vs. 2.63 IU/L; p = 0.03) or the formula-fed exposed (4.1 vs. 2.6 IU/L; p = 0.04), and inhibin B was lower (breast-fed exposed group, 70.2; breast-fed comparisons, 101.8 pg/mL, p = 0.01; formula-fed exposed, 99.9 pg/mL, p = 0.02). Conclusions In utero and lactational exposure of children to relatively low dioxin doses can permanently reduce sperm quality. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3592 | null | Heavy metals in commercial fish in New Jersey.
Levels of contaminants in fish are of particular interest because of the potential risk to humans who consume them. While attention has focused on self-caught fish, most of the fish eaten by the American public comes from commercial sources. We sampled 11 types of fish and shellfish obtained from supermarkets and specialty fish markets in New Jersey and analyzed them for arsenic, cadmium, chromium, lead, manganese, mercury, and selenium. We test the null hypothesis that metal levels do not vary among fish types, and we consider whether the levels of any metals could harm the fish themselves or their predators or pose a health risk for human consumers. There were significant interspecific differences for all metals, and no fish type had the highest levels of more than two metals. There were few significant correlations (Kendall tau) among metals for the three most numerous fish (yellowfin tuna, bluefish, and flounder), the correlations were generally low (below 0.40), and many correlations were negative. Only manganese and lead were positively correlated for tuna, bluefish, and flounder. The levels of most metals were below those known to cause adverse effects in the fish themselves. However, the levels of arsenic, lead, mercury, and selenium in some fish were in the range known to cause some sublethal effects in sensitive predatory birds and mammals and in some fish exceeded health-based standards. The greatest risk from different metals resided in different fish; the species of fish with the highest levels of a given metal sometimes exceeded the human health guidance or standards for that metal. Thus, the risk information given to the public (mainly about mercury) does not present a complete picture. The potential of harm from other metals suggests that people not only should eat smaller quantities of fish known to accumulate mercury but also should eat a diversity of fish to avoid consuming unhealthy quantities of other heavy metals. However, consumers should bear in mind that standards have a margin of safety. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3593 | null | Correlation of lead, cadmium and mercury levels in tissue and liver samples with age in cattle.
The aim of this study was to determine the accumulation of selected heavy metals (Pb, Cd, Hg, As) in meat and liver of cattle. The animals were divided into four age-groups which allowed the analysis of statistical-mathematical correlations between the age of the animals and contamination of meat. The research material for determination of heavy metal levels was taken from the longissimus back muscle (m. longissimus dorsi) and samples from the tail lobe of the liver. Analysis showed that contamination by Cd and Pb is clearly dependent on the age of the animal. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-4954 | null | Semen quality of fertile US males in relation to their mothers' beef consumption during pregnancy.
BACKGROUND To look at possible long-term risks from anabolic steroids and other xenobiotics in beef, we examined men's semen quality in relation to their mother's self-reported beef consumption during pregnancy. METHODS: The study was carried out in five US cities between 1999 and 2005. We used regression analyses to examine semen parameters in 387 partners of pregnant women in relation to the amount of beef their mothers reported eating while pregnant. Mothers' beef consumption was also analysed in relation to the son's history of previous subfertility. RESULTS Sperm concentration was inversely related to mothers' beef meals per week (P = 0.041). In sons of "high beef consumers" (>7 beef meals/week), sperm concentration was 24.3% lower (P = 0.014) and the proportion of men with sperm concentration below 20 x 10(6)/ml was three times higher (17.7 versus 5.7%, P = 0.002) than in men whose mothers ate less beef. A history of previous subfertility was also more frequent among sons of "high beef consumers" (P = 0.015). Sperm concentration was not significantly related to mother's consumption of other meat or to the man's consumption of any meat. CONCLUSIONS These data suggest that maternal beef consumption, and possibly xenobiotics in beef, may alter a man's testicular development in utero and adversely affect his reproductive capacity. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3595 | null | Heavy Metals and Couple Fecundity, the LIFE Study
The effect of heavy metals at environmentally relevant concentrations on couple fecundity has received limited study despite ubiquitous exposure. In 2005–2009, couples (n=501) desiring pregnancy and discontinuing contraception were recruited and asked to complete interviews and to provide blood specimens for the quantification of cadmium (μg/L), lead (μg/dL) and mercury (μg/L) using inductively coupled plasma-mass spectrometry. Couples completed daily journals on lifestyle and intercourse along with menstruation and pregnancy testing for women. Couples were followed for 12 months or until pregnant. Fecundability odds ratios (FORs) and 95% confidence intervals (CIs) were estimated adjusting for age, body mass index, cotinine, and serum lipids in relation to female then male exposures. FORs <1 denote a longer time to pregnancy. In adjusted models, reduced FORs were observed for both female cadmium (0.78; 95% CI 0.63–0.97) and male lead (0.85; 95% CI 0.73–0.98) concentrations. When jointly modeling couples’ exposures, only male lead concentration significantly reduced the FOR (0.82; 95% CI 0.68, 0.97), though the FOR remained <1 for female cadmium (0.80; 95% CI 0.64, 1.00). This prospective couple-based cohort study, with longitudinal capture of time to pregnancy, is suggestive of cadmium's and lead's reproductive toxicity at environmentally relevant concentrations. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3596 | null | Physical activity, obesity and eating habits can influence assisted reproduction outcomes.
OBJECTIVE: To determine if eating habits, physical activity and BMI can influence assisted reproduction outcomes. MATERIAL AND METHODS: This study analyzed 436 patients undergoing intracytoplasmic sperm injection cycles. Patients answered a questionnaire, and regression analysis examined the relationship of lifestyle and BMI with intracytoplasmic sperm injection cycle outcomes. RESULTS: No influence of lifestyle and obesity was observed on the number of oocytes recovered. Obesity reduced the normal fertilization rate (coefficient [Coef.]: -16.0; p = 0.01) and increased the risk of miscarriage (OR: 14.3; p = 0.03). Physical activity positively affected implantation (Coef.: 9.4; p = 0.009), increased the chance of pregnancy (OR: 1.83; p = 0.013) and tended to decrease the risk of miscarriage (OR: 0.30; p = 0.068). In addition, an inverse correlation was found between physical activity and BMI, and a direct correlation was found between soft-drink consumption and BMI. CONCLUSIONS: Eating habits, physical activity and obesity could affect clinical outcomes of assisted reproduction. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3617 | null | High dietary antioxidant intakes are associated with decreased chromosome translocation frequency in airline pilots
Background: Dietary antioxidants may protect against DNA damage induced by endogenous and exogenous sources, including ionizing radiation (IR), but data from IR-exposed human populations are limited. Objective: The objective was to examine the association between the frequency of chromosome translocations, as a biomarker of cumulative DNA damage, and intakes of vitamins C and E and carotenoids in 82 male airline pilots. Design: Dietary intakes were estimated by using a self-administered semiquantitative food-frequency questionnaire. Translocations were scored by using fluorescence in situ hybridization with whole chromosome paints. Negative binomial regression was used to estimate rate ratios and 95% CIs, adjusted for potential confounders. Results: Significant and inverse associations were observed between translocation frequency and intakes of vitamin C, β-carotene, β-cryptoxanthin, and lutein-zeaxanthin from food (P < 0.05). Translocation frequency was not associated with the intake of vitamin E, α-carotene, or lycopene from food; total vitamin C or E from food and supplements; or vitamin C or E or multivitamin supplements. The adjusted rate ratios (95% CI) for ≥median compared with <median servings per week of high–vitamin C fruit and vegetables, citrus fruit, and green leafy vegetables were 0.61 (0.43, 0.86), 0.64 (0.46, 0.89), and 0.59 (0.43, 0.81), respectively. The strongest inverse association was observed for ≥median compared with <median combined intakes of vitamins C and E, β-carotene, β-cryptoxanthin, and lutein-zeaxanthin from food: 0.27 (0.14, 0.55). Conclusion: High combined intakes of vitamins C and E, β-carotene, β-cryptoxanthin, and lutein-zeaxanthin from food, or a diet high in their food sources, may protect against cumulative DNA damage in IR-exposed persons. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3618 | null | The use of dental radiographs: update and recommendations.
BACKGROUND AND OVERVIEW: The National Council on Radiation Protection & Measurements updated its recommendations on radiation protection in dentistry in 2003, the Centers for Disease Control and Prevention published its Guidelines for Infection Control in Dental Health-Care Settings in 2003, and the U.S. Food and Drug Administration updated its selection criteria for dental radiographs in 2004. This report summarizes the recommendations presented in these documents and addresses additional topics such as patient selection criteria, film selection for conventional radiographs, collimation, beam filtration, patient protective equipment, film holders, operator protection, film exposure and processing, infection control, quality assurance, image viewing, direct digital radiography and continuing education of dental health care workers who expose radiographs. CONCLUSIONS: This report discusses implementation of proper radiographic practices. In addition to these guidelines, dentists should be aware of, and comply with, applicable federal and state regulations. CLINICAL IMPLICATIONS: Dentists should weigh the benefits of dental radiographs against the consequences of increasing a patient's exposure to radiation and implement appropriate radiation control procedures. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3620 | null | Dietary factors and cancer mortality among atomic-bomb survivors.
Dietary factors such as fruit and vegetables are thought to reduce the risk of cancer incidence and mortality. We investigated the effect of a diet rich in fruit and vegetables against the long-term effects of radiation exposure on the risk of cancer. A cohort of 36,228 atomic-bomb survivors of Hiroshima and Nagasaki, for whom radiation dose estimates were currently available, had their diet assessed in 1980. They were followed for a period of 20 years for cancer mortality. The joint effect of fruit and vegetable intake and radiation exposure on risk of cancer death was examined, in additive (sum of effects of diet alone and radiation alone) and multiplicative (product of effects of diet alone and radiation alone) models. In the additive model, a daily intake of fruit and vegetables significantly reduced the risk of cancer deaths by 13%, compared to an intake of once or less per week. Radiation exposure of 1 Sievert (Sv) significantly increased the risk of cancer death by 48-49%. The additive joint effects showed a lower risk of cancer among those exposed to 1 Sv who had a diet rich in vegetables (49%-13%=36%) or fruit (48%-13%=35%). The multiplicative model gave similar results. The cancer risk reduction by vegetables in exposed persons went from 52% (effect of radiation alone) to 32% (product of effect of vegetables and radiation), and cancer risk reduction by fruit was 52% (radiation alone) to 34% (product of effect of fruit and radiation). There was no significant evidence to reject either the additive or the multiplicative model. A daily intake of fruit and vegetables was beneficial to the persons exposed to radiation in reducing their risks of cancer death. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3621 | null | Dental X-rays and Risk of Meningioma
Context Ionizing radiation is a consistently identified and potentially modifiable risk factor for meningioma, the most frequently reported primary brain tumor in the United States. Objective To examine the association between dental x-rays, the most common artificial source of ionizing radiation, and risk of intra-cranial meningioma. Design and Setting Population-based case-control study design. Participants The study includes 1433 intra-cranial meningioma cases aged 29-79 years diagnosed among residents of the states of Connecticut, Massachusetts, North Carolina, the San Francisco Bay Area and eight Houston, Texas counties between May 1, 2006 and April 28, 2011 and 1350 controls that were frequency-matched on age, sex and geography. Main Outcome Measure The association of intra-cranial meningioma diagnosis with self-report of bitewing, full-mouth, and panorex dental x-rays. Results Over a lifetime, cases were more than twice (Odds ratio (OR) = 2.0, 95% confidence interval (CI), 1.4-2.9) as likely as controls to report having ever had a bitewing exam. Regardless of the age at which the films were received, persons who reported receiving bitewing films on a yearly or greater frequency had an elevated risk with odds ratios of 1.4 (95%CI: 1.0-1.8), 1.6 (95%CI: 1.2-2.0), 1.9 (95%CI: 1.4-2.6), and 1.5 (95%CI: 1.1-2.0) for ages <10, 10-19, 20-49, and 50+ years, respectively. Increased risk of meningioma was also associated with panorex films taken at a young age or on a yearly or greater frequency with persons reporting receiving such films under the age of 10 years at 4.9 times (95%CI: 1.8-13.2) increased risk of meningioma. No association was appreciated with location of tumor above or below the tentorium. Conclusion Exposure to some dental x-rays performed in the past, when radiation exposure was greater than in the current era, appears to be associated with increased risk of intra-cranial meningioma. 
As with all sources of artificial ionizing radiation, considered use of this modifiable risk factor may be of benefit to patients. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3615 | null | Evaluation of chromosomal aberrations, micronuclei, and sister chromatid exchanges in hospital workers chronically exposed to ionizing radiation.
Cytogenetic analysis was performed in peripheral blood lymphocytes from hospital workers chronically exposed to ionizing radiation in comparison to matched non-exposed individuals. The accumulated absorbed doses calculated for the radiation workers ranged from 9.5 to 209.4 mSv. The endpoints used were chromosomal aberrations (CA), micronuclei (MN), and sister chromatid exchanges (SCE). The frequencies of CA/100 cells observed for the exposed group were significantly (P=0.018) higher than in the control group: 3.2 and 2.6, respectively. Similarly, the mean numbers of SCE per cell were statistically higher (P=0.025) in the exposed group (6.2) in comparison with the control group (5.8). In the case of micronuclei analysis, no significant (P=0.06) difference between the two groups was found, but these data should be cautiously interpreted since an increase in the frequencies of MN was found for radiation workers (3.0 MN/100 cells), compared to the control group (2.6 MN/100 cells), and this increase occurred in parallel to CA and SCE frequencies. The difference between the results could be explained by the nature of CA and MN generation. The increased frequencies of CA and SCE in radiation workers indicate the cumulative effect of low-level chronic exposure to ionizing radiation, and the relevance of conducting cytogenetic analysis in parallel to physical dosimetry in the workplace. Copyright 2001 Wiley-Liss, Inc. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3622 | null | Dietary and clastogenic factors in children who immigrated to Israel from regions contaminated by the Chernobyl accident.
The authors evaluated the possible association between dietary history and plasma clastogenic factors in children who immigrated to Israel between 1989 and 1993 from regions contaminated by the Chernobyl accident. The authors compared questionnaire data about demographic variables, dietary histories before and after immigration occurred, and health status with clastogenic factor scores for 162 immigrants. Logistic regression analysis revealed a negative association between clastogenic factor scores and frequency of consumption of fresh vegetables and fruit among children < or = 7 yr of age during the postimmigration period. Intake of eggs and fish by boys who were < or = 7 yr of age prior to immigration was associated positively with clastogenic factor scores. Consumption of fresh vegetables and fruits afforded protection to the immune systems of children who were < or = 7 yr of age. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3628 | null | 210Po in the marine environment with emphasis on its behaviour within the biosphere.
The distribution and behaviour of the natural-series alpha-emitter polonium-210 in the marine environment has been under study for many years primarily due to its enhanced bioaccumulation, its strong affinity for binding with certain internal tissues, and its importance as a contributor to the natural radiation dose received by marine biota as well as humans consuming seafoods. Results from studies spanning nearly 5 decades show that (210)Po concentrations in organisms vary widely among the different phylogenic groups as well as between the different tissues of a given species. Such variation results in (210)Po concentration factors ranging from approximately 10(3) to over 10(6) depending upon the organism or tissue considered. (210)Po/(210)Pb ratios in marine species are generally greater than unity and tend to increase up the food chain indicating that (210)Po is preferentially taken up by organisms compared to its progenitor (210)Pb. The effective transfer of (210)Po up the food chain is primarily due to the high degree of assimilation of the radionuclide from ingested food and its subsequent strong retention in the organisms. In some cases this mechanism may lead to an apparent biomagnification of (210)Po at the higher trophic level. Various pelagic species release (210)Po and (210)Pb packaged in organic biodetrital particles that sink and remove these radionuclides from the upper water column, a biogeochemical process which, coupled with scavenging rates of this radionuclide pair, is being examined as a possible proxy for estimating downward organic carbon fluxes in the sea. Data related to preferential bioaccumulation in various organisms, their tissues, resultant radiation doses to these species, and the processes by which (210)Po is transferred and recycled through the food web are discussed. 
In addition, the main gaps in our present knowledge and proposed areas for future studies on the biogeochemical behaviour of (210)Po and its use as a tracer of oceanographic processes are highlighted in this review. Copyright © 2010 Elsevier Ltd. All rights reserved. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3629 | null | Pacific bluefin tuna transport Fukushima-derived radionuclides from Japan to California
The Fukushima Dai-ichi release of radionuclides into ocean waters caused significant local and global concern regarding the spread of radioactive material. We report unequivocal evidence that Pacific bluefin tuna, Thunnus orientalis, transported Fukushima-derived radionuclides across the entire North Pacific Ocean. We measured γ-emitting radionuclides in California-caught tunas and found 134Cs (4.0 ± 1.4 Bq kg−1) and elevated 137Cs (6.3 ± 1.5 Bq kg−1) in 15 Pacific bluefin tuna sampled in August 2011. We found no 134Cs and background concentrations (∼1 Bq kg−1) of 137Cs in pre-Fukushima bluefin and post-Fukushima yellowfin tunas, ruling out elevated radiocesium uptake before 2011 or in California waters post-Fukushima. These findings indicate that Pacific bluefin tuna can rapidly transport radionuclides from a point source in Japan to distant ecoregions and demonstrate the importance of migratory animals as transport vectors of radionuclides. Other large, highly migratory marine animals make extensive use of waters around Japan, and these animals may also be transport vectors of Fukushima-derived radionuclides to distant regions of the North and South Pacific Oceans. These results reveal tools to trace migration origin (using the presence of 134Cs) and potentially migration timing (using 134Cs:137Cs ratios) in highly migratory marine species in the Pacific Ocean. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3630 | null | Radioactive fallout in the United States due to the Fukushima nuclear plant accident.
The release of radioactivity into the atmosphere from the damaged Fukushima Daiichi nuclear power plant started on March 12th, 2011. Among the various radionuclides released, iodine-131 ((131)I) and cesium isotopes ((137)Cs and (134)Cs) were transported across the Pacific Ocean and reached the United States on 17-18 March 2011. Consequently, elevated levels of the fission products (131)I, (132)I, (132)Te, (134)Cs and (137)Cs were detected in air, water, and milk samples collected across the United States between March 17 and April 4, 2011. The continuous monitoring of activities over a period of 25 days and spatial variations across more than 100 sampling locations in the United States made it possible to characterize the contaminated air masses. For the entire period, the highest detected activity values ranged from less than 1 mBq m(-3) to 31 mBq m(-3) for the particulate (131)I, and up to 96 mBq m(-3) for the gaseous (131)I fraction. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3631 | null | Increase of 210Po levels in human semen fluid after mussel ingestion.
Polonium-210 ((210)Po) radioactive concentrations were determined in human semen fluid of vasectomized non-smoker volunteers. The (210)Po levels ranged from 0.10 to 0.39 mBq g(-1) (mean: 0.23 ± 0.08 mBq g(-1)). This value decreased to 0.10 ± 0.02 mBq g(-1) (range from 0.07 to 0.13 mBq g(-1)) after two weeks of a controlled diet, excluding fish and seafood. Then, volunteers ate during a single meal 200 g of the cooked mussel Perna perna L., and (210)Po levels were determined again, during ten days, in semen fluid samples collected every morning. Volunteers continued with the controlled diet and maintained sexual abstinence through the period of the experiment. A 300% increase of (210)Po level was observed the day following mussel consumption, with a later reduction, such that the level returned to near baseline by day 4. Copyright © 2011 Elsevier Ltd. All rights reserved. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3632 | null | An unexpected mortality increase in the United States follows arrival of the radioactive plume from Fukushima: is there a correlation?
The multiple nuclear meltdowns at the Fukushima plants beginning on March 11, 2011, are releasing large amounts of airborne radioactivity that has spread throughout Japan and to other nations; thus, studies of contamination and health hazards are merited. In the United States, Fukushima fallout arrived just six days after the earthquake, tsunami, and meltdowns. Some samples of radioactivity in precipitation, air, water, and milk, taken by the U.S. government, showed levels hundreds of times above normal; however, the small number of samples prohibits any credible analysis of temporal trends and spatial comparisons. U.S. health officials report weekly deaths by age in 122 cities, about 25 to 35 percent of the national total. Deaths rose 4.46 percent from 2010 to 2011 in the 14 weeks after the arrival of Japanese fallout, compared with a 2.34 percent increase in the prior 14 weeks. The number of infant deaths after Fukushima rose 1.80 percent, compared with a previous 8.37 percent decrease. Projecting these figures for the entire United States yields 13,983 total deaths and 822 infant deaths in excess of the expected. These preliminary data need to be followed up, especially in the light of similar preliminary U.S. mortality findings for the four months after Chernobyl fallout arrived in 1986, which approximated final figures. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3633 | null | Semen quality of male idiopathic infertile smokers and nonsmokers: an ultrastructural study.
This retrospective study was aimed at evaluating the effects of cigarette consumption on semen parameters in a group of men with idiopathic infertility. The semen quality of 2 groups of men with idiopathic infertility, smokers (n = 118) and nonsmokers (n = 153), were compared. Conventional semen analysis was performed and sperm morphology was assessed by transmission electron microscopy (TEM). TEM data were elaborated by means of a mathematical formula based on a Bayesian technique able to furnish a fertility index (FI), and the percentages of sperm apoptosis, necrosis, and immaturity. Values of normality recommended by World Health Organization guidelines were used as a control for conventional semen analysis, and values from sperm of 25 men of proven fertility were used for TEM indices. Infertile smoker and nonsmoker patients showed similar sperm parameters, although sperm motility and TEM analysis values in both groups were significantly impaired compared with controls. Smoker patients were then classified as mild (>or=1 and <or=10 cigarettes/d), moderate (>10 and <20 cigarettes/day), or heavy smokers (>or=20 cigarettes/d). Sperm concentration and FI were significantly (P < .05) different among the 3 considered smoker classes. Comparing the pairs of smoker classes, sperm concentration and FI in heavy smokers were significantly lower (P < .05) than that observed in mild smoker and nonsmoker groups. Although semen quality in males with idiopathic infertility seems not to be dramatically affected by cigarette consumption, heavy smokers show significantly lower sperm concentration and FI: another strong reason to stop smoking. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3634 | null | Cigarette smoke radioactivity and lung cancer risk.
INTRODUCTION: To determine the tobacco industry's policy and action with respect to radioactive polonium-210 ((210)Po) in cigarette smoke and to assess the long-term risk of lung cancer caused by alpha particle deposits in the lungs of regular smokers. METHODS: Analysis of major tobacco industries' internal secret documents on cigarette radioactivity made available online by the Master Settlement Agreement in 1998. RESULTS: The documents show that the industry was well aware of the presence of a radioactive substance in tobacco as early as 1959. Furthermore, the industry was not only cognizant of the potential "cancerous growth" in the lungs of regular smokers but also did quantitative radiobiological calculations to estimate the long-term (25 years) lung radiation absorption dose (rad) of ionizing alpha particles emitted from the cigarette smoke. Our own calculations of lung rad of alpha particles match closely the rad estimated by the industry. According to the Environmental Protection Agency, the industry's and our estimate of long-term lung rad of alpha particles causes 120-138 lung cancer deaths per year per 1,000 regular smokers. Acid wash was discovered in 1980 to be highly effective in removing (210)Po from the tobacco leaves; however, the industry avoided its use for concerns that acid media would ionize nicotine, converting it into a form poorly absorbed into the brain of smokers, thus depriving them of the much sought-after instant "nicotine kick" sensation. CONCLUSIONS: The evidence of lung cancer risk caused by cigarette smoke radioactivity is compelling enough to warrant its removal. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3635 | null | Wet deposition of fission-product isotopes to North America from the Fukushima Dai-ichi incident, March 2011.
Using the infrastructure of the National Atmospheric Deposition Program (NADP), numerous measurements of radionuclide wet deposition over North America were made for 167 NADP sites before and after the Fukushima Dai-ichi Nuclear Power Station incident of March 12, 2011. For the period from March 8 through April 5, 2011, wet-only precipitation samples were collected by NADP and analyzed for fission-product isotopes within whole-water and filterable solid samples by the United States Geological Survey using gamma spectrometry. Variable amounts of (131)I, (134)Cs, or (137)Cs were measured at approximately 21% of sampled NADP sites distributed widely across the contiguous United States and Alaska. Calculated 1- to 2-week individual radionuclide deposition fluxes ranged from 0.47 to 5100 Becquerels per square meter during the sampling period. Wet deposition activity was small compared to measured activity already present in U.S. soil. NADP networks responded to this complex disaster, and provided scientifically valid measurements that are comparable and complementary to other networks in North America and Europe. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3636 | null | Death by polonium-210: lessons learned from the murder of former Soviet spy Alexander Litvinenko.
The medical response to radiation--whether the result of radiological warfare, terrorist deployment of improvised radiation dispersal weapons, political assassination, occupational or industrial accidents or the medically radiated patient--remains one of the least taught among all disciplines within medical education. In the aftermath of 9/11, among medical vulnerabilities to toxicant threats, of all the categories of weapons of mass destruction (WMD)--whether using the CBRNE (chemical, biological, radiological, nuclear, explosive) or NBC (nuclear, biological, chemical) acronym--radiation is the least taught in professional schools, responder cultures or civil preparedness organizations. To date, few health care professionals (HCP) possess the fundamental knowledge or skills to identify and diagnose, let alone treat, a radiation victim; this vulnerability was made even more obvious in the aftermath of the high-profile assassination of former Russian agent Alexander Litvinenko. He was poisoned with polonium-210. Radioactive substances are ubiquitous, with radiation sources being in or transported through virtually every region nationwide. It is essential to increase preparedness among community and rural health care facilities as well as urban and university hospitals. Managing radiation injuries effectively requires access to specialized equipment and expertise. Radiation sickness is progressive and may require acute, critical and long-term care throughout the course of illness. Regardless of the source, preparedness rests upon acknowledging a threat exists and dedicating the resources to address the risks, including the enhancement of training and equipment. Mass or individual exposures to radiation present unique challenges to the entire response continuum, from law enforcement and first responders to emergency medical care. Increased education about and practice in responding to radiological threats is essential to enhance preparedness. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3782 | null | Egg, red meat, and poultry intake and risk of lethal prostate cancer in the prostate specific antigen-era: incidence and survival
Red and processed meat may increase risk of advanced prostate cancer. Data on post-diagnostic diet and prostate cancer are sparse, but post-diagnostic intake of poultry with skin and eggs may increase risk of disease progression. Therefore, we prospectively examined total, unprocessed, and processed red meat, poultry, and eggs in relation to risk of lethal prostate cancer (e.g. men without cancer at baseline who developed distant organ metastases or died from prostate cancer during follow-up) among 27,607 men followed from 1994–2008. We also performed a case-only survival analysis to examine post-diagnostic consumption of these foods and risk of lethal prostate cancer among the 3,127 men initially diagnosed with non-metastatic prostate cancer during follow-up. In the incidence analysis, we observed 199 events during 306,715 person-years. Men who consumed 2.5 or more eggs per week had an 81% increased risk of lethal prostate cancer compared to men who consumed less than 0.5 eggs per week (HR: 1.81; 95% confidence interval (CI): 1.13, 2.89; p-trend: 0.01). In the case-only survival analysis, we observed 123 events during 19,354 person-years. There were suggestive, but not statistically significant, positive associations between post-diagnostic poultry (HR ≥3.5 vs. <1.5 servings per week: 1.69; 95%CI: 0.96, 2.99; p-trend: 0.07) and post-diagnostic processed red meat (HR ≥3 vs. <0.5 servings per week: 1.45; 95%CI: 0.73, 2.87; p-trend: 0.08) and risk of progression of localized prostate cancer to lethal disease. In conclusion, consumption of eggs may increase risk of developing a lethal-form of prostate cancer among healthy men. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3787 | null | 11C-choline vs. 18F-FDG PET/CT in assessing bone involvement in patients with multiple myeloma
Background Multiple Myeloma (MM) is a B cell neoplasm causing lytic or osteopenic bone abnormalities. Whole body skeletal survey (WBSS), Magnetic resonance (MR) and 18F-FDG PET/CT are imaging techniques routinely used for the evaluation of bone involvement in MM patients. Aim As MM bone lesions may present low 18F-FDG uptake, the aim of this study was to assess the possible added value and limitations of 11C-Choline to that of 18F-FDG PET/CT in patients affected with MM. Methods Ten patients affected with MM underwent a standard 11C-Choline PET/CT and an 18F-FDG PET/CT within one week. The results of the two scans were compared in terms of number, sites and SUVmax of lesions. Results Four patients (40%) had negative concordant 11C-Choline and 18F-FDG PET/CT scans. Two patients (20%) had positive 11C-Choline and 18F-FDG PET/CT scans that identified the same number and sites of bone lesions. The remaining four patients (40%) had a positive 11C-Choline and 18F-FDG PET/CT scan, but the two exams identified different numbers of lesions. Choline showed a mean SUVmax of 5 while FDG showed a mean SUVmax of 3.8 (P = 0.042). Overall, 11C-Choline PET/CT scans detected 37 bone lesions and 18F-FDG PET/CT scans detected 22 bone lesions, but the difference was not significant (P = 0.8). Conclusion According to these preliminary data, 11C-Choline PET/CT appears to be more sensitive than 18F-FDG PET/CT for the detection of bony myelomatous lesions. If these data are confirmed in larger series of patients, 11C-Choline may be considered a more appropriate functional imaging modality in association with MRI for MM bone staging. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3788 | null | Intestinal microbiota metabolism of L-carnitine, a nutrient in red meat, promotes atherosclerosis
Intestinal microbiota metabolism of choline/phosphatidylcholine produces trimethylamine (TMA), which is further metabolized to a proatherogenic species, trimethylamine-N-oxide (TMAO). Herein we demonstrate that intestinal microbiota metabolism of dietary L-carnitine, a trimethylamine abundant in red meat, also produces TMAO and accelerates atherosclerosis. Omnivorous subjects are shown to produce significantly more TMAO than vegans/vegetarians following ingestion of L-carnitine through a microbiota-dependent mechanism. Specific bacterial taxa in human feces are shown to associate with both plasma TMAO and dietary status. Plasma L-carnitine levels in subjects undergoing cardiac evaluation (n = 2,595) predict increased risks for both prevalent cardiovascular disease (CVD) and incident major adverse cardiac events (MI, stroke or death), but only among subjects with concurrently high TMAO levels. Chronic dietary L-carnitine supplementation in mice significantly altered cecal microbial composition, markedly enhanced synthesis of TMA/TMAO, and increased atherosclerosis, but not following suppression of intestinal microbiota. Dietary supplementation of TMAO, or either carnitine or choline in mice with intact intestinal microbiota, significantly reduced reverse cholesterol transport in vivo. Intestinal microbiota may thus participate in the well-established link between increased red meat consumption and CVD risk. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3790 | null | Intakes of meat, fish, poultry, and eggs and risk of prostate cancer progression
Background: Processed meat and fish have been shown to be associated with the risk of advanced prostate cancer, but few studies have examined diet after prostate cancer diagnosis and risk of its progression. Objective: We examined the association between postdiagnostic consumption of processed and unprocessed red meat, fish, poultry, and eggs and the risk of prostate cancer recurrence or progression. Design: We conducted a prospective study in 1294 men with prostate cancer, without recurrence or progression as of 2004–2005, who were participating in the Cancer of the Prostate Strategic Urologic Research Endeavor and who were followed for an average of 2 y. Results: We observed 127 events (prostate cancer death or metastases, elevated prostate-specific antigen concentration, or secondary treatment) during 2610 person-years. Intakes of processed and unprocessed red meat, fish, total poultry, and skinless poultry were not associated with prostate cancer recurrence or progression. Greater consumption of eggs and poultry with skin was associated with 2-fold increases in risk in a comparison of extreme quantiles: eggs [hazard ratio (HR): 2.02; 95% CI: 1.10, 3.72; P for trend = 0.05] and poultry with skin (HR: 2.26; 95% CI: 1.36, 3.76; P for trend = 0.003). An interaction was observed between prognostic risk at diagnosis and poultry. Men with high prognostic risk and a high poultry intake had a 4-fold increased risk of recurrence or progression compared with men with low/intermediate prognostic risk and a low poultry intake (P for interaction = 0.003). Conclusions: Our results suggest that the postdiagnostic consumption of processed or unprocessed red meat, fish, or skinless poultry is not associated with prostate cancer recurrence or progression, whereas consumption of eggs and poultry with skin may increase the risk. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3880 | null | Estimating changes in public health following implementation of hazard analysis and critical control point in the United States broiler slaughter i...
A common approach to reducing microbial contamination has been the implementation of a Hazard Analysis and Critical Control Point (HACCP) program to prevent or reduce contamination during production. One example is the Pathogen Reduction HACCP program implemented by the U.S. Department of Agriculture's Food Safety and Inspection Service (FSIS). This program consisted of a staged implementation between 1996 and 2000 to reduce microbial contamination on meat and poultry products. Of the commodities regulated by FSIS, one of the largest observed reductions was for Salmonella contamination on broiler chicken carcasses. Nevertheless, how this reduction might have influenced the total number of salmonellosis cases in the United States has not been assessed. This study incorporates information from public health surveillance and surveys of the poultry slaughter industry into a model that estimates the number of broiler-related salmonellosis cases through time. The model estimates that, following the 56% reduction in the proportion of contaminated broiler carcasses observed between 1995 and 2000, approximately 190,000 fewer annual salmonellosis cases (attributed to broilers) occurred in 2000 compared with 1995. The uncertainty bounds for this estimate range from approximately 37,000 to 500,000 illnesses. Estimated illnesses prevented, due to the more modest reduction in contamination of 13% between 2000 and 2007, were not statistically significant. An analysis relating the necessary magnitude of change in contamination required for detection via human surveillance also is provided. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-4132 | null | Ranking the disease burden of 14 pathogens in food sources in the United States using attribution data from outbreak investigations and expert elic...
Understanding the relative public health impact of major microbiological hazards across the food supply is critical for a risk-based national food safety system. This study was conducted to estimate the U.S. health burden of 14 major pathogens in 12 broad categories of food and to then rank the resulting 168 pathogen-food combinations. The pathogens examined were Campylobacter, Clostridium perfringens, Escherichia coli O157:H7, Listeria monocytogenes, norovirus, Salmonella enterica, Toxoplasma gondii, and all other FoodNet pathogens. The health burden associated with each pathogen was measured using new estimates of the cost of illness and loss of quality-adjusted life years (QALYs) from acute and chronic illness and mortality. A new method for attributing illness to foods was developed that relies on both outbreak data and expert elicitation. This method assumes that empirical data are generally preferable to expert judgment; thus, outbreak data were used for attribution except where evidence suggests that these data are not representative of food attribution. Based on evaluation of outbreak data, expert elicitation, and published scientific literature, outbreak-based attribution estimates for Campylobacter, Toxoplasma, Cryptosporidium, and Yersinia were determined not representative; therefore, expert-based attribution estimates were included for these four pathogens. Sensitivity analyses were conducted to assess the effect of attribution data assumptions on rankings. Disease burden was concentrated among a relatively small number of pathogen-food combinations. The top 10 pairs were responsible for losses of over $8 billion and 36,000 QALYs, or more than 50% of the total across all pairs. Across all 14 pathogens, poultry, pork, produce, and complex foods were responsible for nearly 60% of the total cost of illness and loss of QALYs. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3882 | null | Characterization of extended-spectrum cephalosporin-resistant Salmonella enterica serovar Heidelberg isolated from food animals, retail meat, and h...
Salmonella enterica is one of the most common causes of foodborne illness in the United States. Although salmonellosis is usually self-limiting, severe infections typically require antimicrobial treatment, and ceftriaxone, an extended-spectrum cephalosporin (ESC), is commonly used in both adults and children. Surveillance conducted by the National Antimicrobial Resistance Monitoring System (NARMS) has shown a recent increase in ESC resistance among Salmonella Heidelberg isolated from food animals at slaughter, retail meat, and humans. ESC resistance among Salmonella in the United States is usually mediated by a plasmid-encoded bla(CMY) β-lactamase. In 2009, we identified 47 ESC-resistant bla(CMY)-positive Heidelberg isolates from humans (n=18), food animals at slaughter (n=16), and retail meats (n=13) associated with a spike in the prevalence of this serovar. Almost 90% (26/29) of the animal and meat isolates were isolated from chicken carcasses or retail chicken meat. We screened NARMS isolates for the presence of bla(CMY), determined whether the gene was plasmid-encoded, examined pulsed-field gel electrophoresis patterns to assess the genetic diversities of the isolates, and categorized the bla(CMY) plasmids by plasmid incompatibility groups and plasmid multi-locus sequence typing (pMLST). All 47 bla(CMY) genes were found to be plasmid encoded. Incompatibility/replicon typing demonstrated that 41 were IncI1 plasmids, 40 of which only conferred bla(CMY)-associated resistance. Six were IncA/C plasmids that carried additional resistance genes. pMLST of the IncI1-bla(CMY) plasmids showed that 27 (65.8%) were sequence type (ST) 12, the most common ST among bla(CMY)-IncI1 plasmids from Heidelberg isolated from humans. Ten plasmids had a new ST profile, ST66, a type very similar to ST12. This work showed that the 2009 increase in ESC resistance among Salmonella Heidelberg was caused mainly by the dissemination of bla(CMY) on IncI1 and IncA/C plasmids in a variety of genetic backgrounds, and is likely not the result of clonal expansion. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-4136 | null | Vital signs: incidence and trends of infection with pathogens transmitted commonly through food--foodborne diseases active surveillance network, 10...
BACKGROUND: In the United States, contaminated food causes approximately 1,000 reported disease outbreaks and an estimated 48 million illnesses, 128,000 hospitalizations, and 3,000 deaths annually. This report summarizes 2010 surveillance data and describes trends since 1996. METHODS: The Foodborne Diseases Active Surveillance Network (FoodNet) conducts surveillance among 15% of the U.S. population for laboratory-confirmed infections with nine pathogens transmitted commonly through food. Overall and pathogen-specific changes in incidence were estimated from 1996-1998 to 2010 and from 2006-2008 to 2010. RESULTS: A total of 19,089 infections, 4,247 hospitalizations, and 68 deaths were reported from FoodNet sites in 2010. Salmonella infection was the most common infection reported (17.6 illnesses per 100,000 persons) and was associated with the largest number of hospitalizations (2,290) and deaths (29); no significant change in incidence of Salmonella infection has occurred since the start of surveillance during 1996-1998. Shiga toxin-producing Escherichia coli (STEC) O157 infection caused 0.9 illnesses per 100,000. Compared with 1996-1998, overall incidence of infection with six key pathogens in 2010 was 23% lower, and pathogen-specific incidence was lower for Campylobacter, Listeria, STEC O157, Shigella, and Yersinia infection but higher for Vibrio infection. Compared with a more recent period, 2006-2008, incidence in 2010 was lower for STEC O157 and Shigella infection but higher for Vibrio infection. CONCLUSIONS: The incidence of STEC O157 infection has declined to reach the 2010 national health objective target of ≥1 case per 100,000. This success, as well as marked declines since 1996-1998 in overall incidence of six key foodborne infections, demonstrates the feasibility of preventing foodborne illnesses. IMPLICATIONS FOR PUBLIC HEALTH PRACTICE: Salmonella infection should be targeted because it has not declined significantly in more than a decade, and other data indicate that it is one of the most common foodborne infections, resulting in an estimated $365 million in direct medical costs annually. The prevention measures that reduced STEC O157 infection need to be applied more broadly to reduce Salmonella and other infections. Effective measures from farm to table include preventing contamination of meat during slaughter and of all foods, including produce, during processing and preparation; cooking meat thoroughly; vigorously detecting and investigating outbreaks; and recalling contaminated food. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3884 | null | The 2009 Garrod lecture: the evolution of antimicrobial resistance: a Darwinian perspective.
Microbes have evolved over 3.5 billion years and are arguably the most adaptable organisms on earth. Restricted genetically by their inability to reproduce sexually, bacteria have acquired several additional mechanisms by which to exchange genetic material horizontally. Such mechanisms have allowed bacteria to inhabit some of the most inhospitable environments on earth. It is thus hardly surprising that when faced with a barrage of inimical chemicals (antibiotics) they have responded with an equal and opposite force. This article compares and contrasts the evolution of antimicrobial resistance to beta-lactam antibiotics over the last 70 years in two bacterial species, namely Staphylococcus aureus, a highly evolved human pathogen, and Pseudomonas aeruginosa, an opportunistic nosocomial pathogen. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3885 | null | Mechanisms of antimicrobial resistance in bacteria.
The treatment of bacterial infections is increasingly complicated by the ability of bacteria to develop resistance to antimicrobial agents. Antimicrobial agents are often categorized according to their principal mechanism of action. Mechanisms include interference with cell wall synthesis (e.g., beta-lactams and glycopeptide agents), inhibition of protein synthesis (macrolides and tetracyclines), interference with nucleic acid synthesis (fluoroquinolones and rifampin), inhibition of a metabolic pathway (trimethoprim-sulfamethoxazole), and disruption of bacterial membrane structure (polymyxins and daptomycin). Bacteria may be intrinsically resistant to > or =1 class of antimicrobial agents, or may acquire resistance by de novo mutation or via the acquisition of resistance genes from other organisms. Acquired resistance genes may enable a bacterium to produce enzymes that destroy the antibacterial drug, to express efflux systems that prevent the drug from reaching its intracellular target, to modify the drug's target site, or to produce an alternative metabolic pathway that bypasses the action of the drug. Acquisition of new genetic material by antimicrobial-susceptible bacteria from resistant strains of bacteria may occur through conjugation, transformation, or transduction, with transposons often facilitating the incorporation of the multiple resistance genes into the host's genome or plasmids. Use of antibacterial agents creates selective pressure for the emergence of resistant strains. Herein, 3 case histories (one involving Escherichia coli resistance to third-generation cephalosporins, another focusing on the emergence of vancomycin-resistant Staphylococcus aureus, and a third detailing multidrug resistance in Pseudomonas aeruginosa) are reviewed to illustrate the varied ways in which resistant bacteria develop. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3886 | null | Mechanisms of antimicrobial resistance in bacteria.
The treatment of bacterial infections is increasingly complicated by the ability of bacteria to develop resistance to antimicrobial agents. Antimicrobial agents are often categorized according to their principal mechanism of action. Mechanisms include interference with cell wall synthesis (e.g., beta-lactams and glycopeptide agents), inhibition of protein synthesis (macrolides and tetracyclines), interference with nucleic acid synthesis (fluoroquinolones and rifampin), inhibition of a metabolic pathway (trimethoprim-sulfamethoxazole), and disruption of bacterial membrane structure (polymyxins and daptomycin). Bacteria may be intrinsically resistant to > or =1 class of antimicrobial agents, or may acquire resistance by de novo mutation or via the acquisition of resistance genes from other organisms. Acquired resistance genes may enable a bacterium to produce enzymes that destroy the antibacterial drug, to express efflux systems that prevent the drug from reaching its intracellular target, to modify the drug's target site, or to produce an alternative metabolic pathway that bypasses the action of the drug. Acquisition of new genetic material by antimicrobial-susceptible bacteria from resistant strains of bacteria may occur through conjugation, transformation, or transduction, with transposons often facilitating the incorporation of the multiple resistance genes into the host's genome or plasmids. Use of antibacterial agents creates selective pressure for the emergence of resistant strains. Herein, 3 case histories (one involving Escherichia coli resistance to third-generation cephalosporins, another focusing on the emergence of vancomycin-resistant Staphylococcus aureus, and a third detailing multidrug resistance in Pseudomonas aeruginosa) are reviewed to illustrate the varied ways in which resistant bacteria develop. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3887 | null | Food Animals and Antimicrobials: Impacts on Human Health
Summary: Antimicrobials are valuable therapeutics whose efficacy is seriously compromised by the emergence and spread of antimicrobial resistance. The provision of antibiotics to food animals encompasses a wide variety of nontherapeutic purposes that include growth promotion. The concern over resistance emergence and spread to people by nontherapeutic use of antimicrobials has led to conflicted practices and opinions. Considerable evidence supported the removal of nontherapeutic antimicrobials (NTAs) in Europe, based on the “precautionary principle.” Still, concrete scientific evidence of the favorable versus unfavorable consequences of NTAs is not clear to all stakeholders. Substantial data show elevated antibiotic resistance in bacteria associated with animals fed NTAs and their food products. This resistance spreads to other animals and humans—directly by contact and indirectly via the food chain, water, air, and manured and sludge-fertilized soils. Modern genetic techniques are making advances in deciphering the ecological impact of NTAs, but modeling efforts are thwarted by deficits in key knowledge of microbial and antibiotic loads at each stage of the transmission chain. Still, the substantial and expanding volume of evidence reporting animal-to-human spread of resistant bacteria, including that arising from use of NTAs, supports eliminating NTA use in order to reduce the growing environmental load of resistance genes. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3888 | null | Salmonella enterica serotype Enteritidis: increasing incidence of domestically acquired infections.
BACKGROUND: Salmonella enterica causes an estimated 1 million cases of domestically acquired foodborne illness in humans annually in the United States; Enteritidis (SE) is the most common serotype. Public health authorities, regulatory agencies, food producers, and food processors need accurate information about rates and changes in SE infection to implement and evaluate evidence-based control policies and practices. METHODS: We analyzed the incidence of human SE infection during 1996-2009 in the Foodborne Diseases Active Surveillance Network (FoodNet), an active, population-based surveillance system for laboratory-confirmed infections. We compared FoodNet incidence with passively collected data from complementary surveillance systems and with rates of SE isolation from processed chickens and egg products; shell eggs are not routinely tested. We also compared molecular subtyping patterns of SE isolated from humans and chickens. RESULTS: Since the period 1996-1999, the incidence of human SE infection in FoodNet has increased by 44%. This change is mirrored in passive national surveillance data. The greatest relative increases were in young children, older adults, and FoodNet sites in the southern United States. The proportion of patients with SE infection who reported recent international travel has decreased in recent years, whereas the proportion of chickens from which SE was isolated has increased. Similar molecular subtypes of SE are commonly isolated from humans and chickens. CONCLUSIONS: Most SE infections in the United States are acquired from domestic sources, and the problem is growing. Chicken and eggs are likely major sources of SE. Continued close attention to surveillance data is needed to monitor the impact of recent regulatory control measures. |
nfcorpus-queries-PLAIN-2061 | null | seafood |
nfcorpus-corpus-MED-3889 | null | Comparison of ESBL contamination in organic and conventional retail chicken meat.
Contamination of retail chicken meat by Extended Spectrum Beta-Lactamase (ESBL) producing bacteria likely contributes to the increasing incidence of infections with these bacteria in humans. This study aimed to compare the prevalence and load of ESBL positive isolates between organic and conventional retail chicken meat samples, and to compare the distribution of ESBL genes, strain genotypes and co-resistance. In 2010, 98 raw chicken breasts (n=60 conventional; n=38 organic) were collected from 12 local stores in the Netherlands. Prevalence of ESBL producing micro-organisms was 100% on conventional and 84% on organic samples (p<0.001). Median loads of ESBL producing micro-organisms were 80 (range <20-1360) in conventional, and <20 (range 0-260) CFU/25 g in organic samples (p=0.001). The distribution of ESBL genes in conventional samples and organic samples was 42% versus 56%, respectively (N.S.), for CTX-M-1, 20% versus 42% (N.S.) for TEM-52, and 23% versus 3% (p<0.001) for SHV-12. CTX-M-2 (7%), SHV-2 (5%) and TEM-20 (3%) were exclusively found in conventional samples. Co-resistance rates of ESBL positive isolates were not different between conventional and organic samples (co-trimoxazole 56%, ciprofloxacin 14%, and tobramycin 2%), except for tetracycline (73% and 46%, respectively; p<0.001). Six of 14 conventional meat samples harbored 4 MLST types also reported in humans and 5 of 10 organic samples harbored 3 MLST types also reported in humans (2 ST10, 2 ST23, ST354). In conclusion, the majority of organic chicken meat samples were also contaminated with ESBL producing E. coli, and the ESBL genes and strain types were largely the same as in conventional meat samples. |
nfcorpus-queries-PLAIN-2061 | null | seafood |