Cohort study indicates that selenium may be protective against advanced prostate cancer
by American Association for Cancer Research
A greater level of toenail selenium was associated with a significant decrease in the risk for advanced prostate cancer, according to data presented at the AACR Annual Meeting 2013, held in Washington, D.C., April 6-10.
"This could mean, based on our data and based on data from other studies, that selenium is a modifiable risk factor of advanced, clinically relevant prostate cancer," said Milan S. Geybels, M.Sc., a doctoral candidate in cancer epidemiology at Maastricht University, in Maastricht, the Netherlands.
The Netherlands Cohort Study on diet and cancer is a prospective cohort study that includes 58,279 men who were aged 55 to 69 years at entry in September 1986. Geybels and colleagues analyzed data from 898 men who were diagnosed with advanced prostate cancer during 17.3 years of follow-up of the cohort.
According to Geybels, previous studies investigating the association between selenium levels and prostate cancer have yielded varying results. One large clinical trial showed that selenium supplementation had no protective effect, while several prospective, observational studies indicated that higher levels of selenium were associated with a reduced prostate cancer risk, especially for advanced prostate cancer.
"Our study is interesting because we specifically investigated men with advanced prostate cancer, a type of prostate cancer associated with a poorer prognosis," Geybels said. "Also, while most of the prior research, including the large clinical trial, involved men with moderate-to-high selenium levels, men in The Netherlands Cohort Study have selenium levels that range from low to moderate. This is important because low selenium is expected to be related to a higher disease risk."
He and his colleagues chose toenail selenium as the study biomarker because it reflects long-term exposure, as opposed to blood, which is best for monitoring recent selenium exposures.
The data revealed that greater levels of toenail selenium were associated with a substantially reduced risk for advanced prostate cancer. Men with the highest toenail selenium levels had a more than 60 percent lower risk for advanced prostate cancer compared with men with the lowest toenail selenium levels.
"Our findings need to be replicated in further prospective studies, with an extended follow-up for the assessment of incident advanced prostate cancer, and with a wide range of toenail selenium that includes low selenium levels," Geybels said. "If our results can be confirmed, a prevention trial of selenium and prostate cancer in a low-selenium population may be justified."
Taking additional selenium will not reduce cancer risk
by Wiley
Although some people believe that taking selenium can reduce a person's risk of cancer, a Cochrane Systematic Review of randomised controlled clinical trials found no protective effect against non-melanoma skin cancer or prostate cancer. In addition, there is some indication that taking selenium over a long period of time could have toxic effects.
These conclusions were reached after researchers scanned the medical literature, looking for trials that studied the effects of taking selenium supplements and observational studies on selenium intake. The researchers located 49 prospective observational studies and six randomised controlled trials.
Looking at the data from observational studies gave some indication that people may be marginally protected from cancer if they had a higher selenium intake than those with a lower intake, and that the effect was slightly greater for men than women. "These conclusions have limitations because the data came from a wide variety of trials, and so it is difficult to summarise their findings," says lead researcher Dr Gabriele Dennert of the Institute for Transdisciplinary Health Research, Berlin, Germany, who coordinated the work of the international team of experts.
When the team of researchers looked at the more carefully conducted randomised controlled trials, any sign of benefit disappeared. "In fact, the results of the Nutritional Prevention of Cancer Trial and the Selenium and Vitamin E Cancer Prevention Trial raised concerns about possible harmful effects from long-term use of selenium supplements," says Dennert.
The researchers believe that there is a need for more research looking at selenium's effect on liver cancer and think that it would be worth investigating the possible gender differences that appear to be present in the uncontrolled studies.
"However, we could find no evidence to recommend regular intake of selenium supplements for cancer prevention in people whether or not they already have enough selenium," says Dennert.
Does selenium prevent cancer? It may depend on which form people take
by American Chemical Society
Scientists are reporting that the controversy surrounding whether selenium can fight cancer in humans might come down to which form of the essential micronutrient people take. It turns out that not all "seleniums" are the same — the researchers found that one type of selenium supplement may produce a possible cancer-preventing substance more efficiently than another form of selenium in human cancer cells. Their study appears in the ACS' journal Biochemistry.
Hugh Harris and colleagues note that although the Nutritional Prevention of Cancer clinical trial showed that selenium reduced the risk of cancer, a later study called the Selenium and Vitamin E Cancer Prevention Trial did not show a benefit. A major difference between the trials was the form of selenium that was used. To find out whether different types of selenium have different chemopreventive properties, the researchers studied how two forms—selenomethionine (SeMet) and methylselenocysteine (MeSeCys)—are processed in human lung cancer cells.
The researchers found that MeSeCys killed more lung cancer cells than SeMet did. Also, lung cancer cells treated with MeSeCys processed the selenium differently than cells treated with SeMet. They say that these findings could explain why studies on the health benefits of selenium sometimes have conflicting results.
A study published in the December issue of Cancer Prevention Research, a journal of the American Association for Cancer Research, suggests that selenium, a trace mineral found in grains, nuts and meats, may aid in the prevention of high-risk bladder cancer.
Researchers from Dartmouth Medical School compared selenium levels in 767 individuals newly diagnosed with bladder cancer to the levels of 1,108 individuals from the general population. Findings showed an inverse association between selenium and bladder cancer among women, some smokers and those with p53 positive bladder cancer.
In the entire study population, there was no inverse association between selenium and bladder cancer, but women (34 percent), moderate smokers (39 percent) and those with p53 positive cancer (43 percent) had significant reductions in bladder cancer risk at higher selenium levels.
"There are different pathways by which bladder cancer evolves and it is thought that one of the major pathways involves alterations in the p53 gene," said corresponding author Margaret Karagas, Ph.D., professor of community and family medicine of the Norris Cotton Cancer Center at Dartmouth. "Bladder cancers stemming from these alternations are associated with more advanced disease."
While other studies have shown a similar association between selenium and bladder cancer among women, this study is one of the first to show an association between selenium and p53 positive bladder cancer.
"Ultimately, if it is true that selenium can prevent a certain subset of individuals, like women, from developing bladder cancer, or prevent certain types of tumors, such as those evolving through the p53 pathway, from developing, it gives us clues about how the tumors could be prevented in the future and potentially lead to chemopreventive efforts," Karagas said.
Karagas hopes to replicate these findings on a larger scale in order to examine the connection between selenium and bladder cancer in women and those with p53 tumors, as well as with patient prognosis.
Selenium supplements could be harmful to people who already have enough selenium in their diet: study
by Lancet
Although additional selenium might benefit people who are lacking in this essential micronutrient, for those who already have enough selenium in their diet (including a large proportion of the USA population), taking selenium supplements could be harmful, and might increase the risk of developing type-2 diabetes, concludes a new review of the evidence published Online First in The Lancet.
"The intake of selenium varies hugely worldwide. Intakes are high in Venezuela, Canada, the USA, and Japan, but lower in Europe. Selenium-containing supplements add to these intakes, especially in the USA where 50% of the population takes dietary supplements", explains Margaret Rayman from the University of Surrey, Guilford, UK, author of the study.
Selenium is a naturally occurring trace mineral found in soil and foods and is essential in small amounts for good health. Low selenium intake or status (levels in the blood) has been linked with an increased risk of death, poor immune function, and cognitive decline. Higher selenium intake or status has been shown to enhance male fertility, have antiviral effects, and provide some protection against cancers of the prostate, lung, colorectum, and bladder.
But the evidence also suggests that selenium has a narrow therapeutic range and at high levels might have harmful effects such as increasing the risk of type-2 diabetes.
Over the last 10 years, the use of selenium supplements has become widespread, largely due to the belief that selenium can reduce the risk of cancer and other diseases. Selenium supplements have been marketed for a multitude of conditions largely based on the results of observational studies. However, findings from clinical trials to confirm their efficacy have been mixed.
The Review reveals that studies in different populations with different selenium status and genetic background have produced divergent results.
According to Rayman, these conflicting results can be explained by the fact that supplementation with selenium, as for many nutrients, is only beneficial when intake is inadequate.
She notes that the greatest benefit from selenium supplementation is likely to be in people with low blood selenium levels. However, to date, the largest trials have been done in countries where selenium status is good (like the USA), and more trials are needed in populations with low selenium status.
The Review also suggests that the interaction between selenium intake or status and genetic background could be important—people could be more or less genetically receptive to the benefits of selenium-containing proteins (selenoproteins) in the body or to selenium supplements: "Since polymorphisms in selenoproteins affect both selenium status and disease risk or prognosis, future studies must genotype participants."
Rayman concludes: "The crucial factor that needs to be emphasised is that people whose blood plasma selenium is already 122 µg/L or higher—a large proportion of the US population (the average level in American men is 134 µg/L)—should not take selenium supplements. However, there are various health benefits, and no extra risk, for people of lower selenium status (plasma level less than 122 µg/L), who could benefit from raising their status to 130-150 µg/L—a level associated with low mortality."
Supplements are not nutritious
Selenium supplementation, for example in mineral tablets, might not be that beneficial for the majority of people, according to researchers writing in the open access journal Genome Biology. Although this trace element is essential in the human diet, it seems that mammals lost some of their need for selenium, which occurs in proteins and is transported in blood plasma, when their evolutionary ancestors left the oceans.
The research team including Alexey Lobanov and Vadim Gladyshev of the University of Nebraska-Lincoln and Dolph Hatfield of the National Institutes of Health conducted the genetic analysis. “Several trace elements are essential micronutrients for humans and animals but why some organisms use certain ones to a greater extent than others is not understood” comments Gladyshev. “We’ve found that the evolutionary change from fish to mammals was accompanied by a reduced use of proteins containing selenium.”
Selenium-containing proteins evolved in prehistoric times. Several human disorders have been associated with a deficiency in the trace element, among them Keshan disease, a heart disorder affecting primarily children in certain provinces of China where the soil is deficient in selenium, and myxedematous endemic cretinism, a severe form of stunted physical and mental development attributed to deficiencies in selenium and iodine found in certain areas of Africa. Selenium supplementation was thought to be necessary to prevent these and other diseases even in areas with an adequate selenium supply.
This evolved, reduced reliance on selenium raises questions about the widespread use of selenium supplements intended to maximize the amounts of proteins that depend on it, since supplements are often taken without knowing which groups of the population stand to benefit.
Interestingly, only 20% of lower organisms use selenium-based proteins; fungi and vascular plants, for example, do not. Some insects have also lost the need for selenium during the course of evolution. Aquatic environments seem to favor an increased reliance on selenium because of environmental factors. Selenoprotein-rich sea urchins, for instance, feed on algae, which themselves contain a lot of selenium.
Gladyshev concludes: “The evolved reduced utilization of selenium-containing proteins in mammals raises important questions in human and animal nutrition. Selenoprotein expression is regulated such that people don’t need to rely so heavily on dietary selenium which is often present in excess amounts in the diet. Individuals should consider their age, sex and medical needs before taking such supplements on a regular basis.”
What makes cobalt essential to life?
by John Hewitt , Phys.org
Credit: Wikipedia
Cobalt sits in the center of the corrin ring of vitamin B12 and the important cobalamins we derive from it. Perhaps surprisingly, only two of our enzymes bother to use these painfully constructed and meticulously channeled cofactors. Why do our cells go to such great lengths to get a little bit of the cobalt magic, and what catalytic properties might make it so special?
Other uncommon essential metals, like molybdenum, selenium and iodine, are similarly used only sparingly in cells, and yet we retain the ability to completely synthesize all the useful derivatives for these elements. To tame molybdenum, we construct an elaborate molybdopterin cofactor, while to harness iodine, we assemble thyroxine. To incorporate selenium into the few selenoproteins that require it, the elaborate SECIS machinery shuffles the mRNA code to attract a unique tRNA, upon which its serine cargo is transformed into selenocysteine. In each of these cases, researchers understand the special properties of the metals involved that make them indispensable.
For example, compared to sulfur, selenium is a better nucleophile that will react with reactive oxygen species faster, but its lack of π-bond character means that it can also be more readily reduced. Selenoproteins like GPX4 (glutathione peroxidase) are correspondingly more resistant to both overoxidation and irreversible inactivation. Similarly, the ineluctable requirement for molybdenum, a two-electron redox center that can shuttle between the +4/+5 and the +5/+6 redox couples, reflects several not-so-common skills. It can perform diverse and energetically challenging redox reactions; it can act as an electron sink or source at low redox potential; and (along with the much rarer tungsten) it can effectively transfer oxygen and sulfur atoms during reactions taking place at low potential.
A noteworthy attempt to divine the essential cobalt character was advanced in a recent commentary in PNAS by geochemist extraordinaire Michael Russell. Cobalt is poised between Fe and Ni in the periodic table, and Russell notes that "the element is particularly 'energy-dense' with paired electrons in the outer orbit. Its occurrence as a metal alloy in serpentinites with a variable valence extending from Co+ through to Co4+, its various spin states, and its contrasting conformations render it unique, with untold contributions to be made to electronics, catalysis and the emergence of life. Indeed, Co–Fe cooperation has just been investigated at the opposite end of the redox spectrum—the electrocatalysis of the O2 evolution reaction. Substitutions of Co are either unfeasible, as in metabolism and in some double-atom catalysis, or they lie in the somewhat remote future."
Russell's comments are in response to an earlier article by He et al., who demonstrated that hydrothermal reduction of bicarbonate into long-chain hydrocarbons (≤24 carbons) is possible through the use of iron and cobalt metals. These findings potentially explain both the abiogenic origin of petroleum and key events in life's emergence. Since remnants of the porphyrins and corrins that are critical to life can be found amidst petroleum deposits, a critical question becomes: Did life invent these molecules, or did it first use abiotic facsimiles of these molecules and only later evolve the ability to synthesize them for itself?
The response to my question from Russell was that, in his opinion, life likely invented corrin-like coordination by way of a four-amino-acid peptide with a glycine-glycine-histidine motif capable of entrapping the cobalt atom. Curiously, porphyrins, which house iron or copper in their centers, and chlorins, which do the same with magnesium, must be contracted into corrins to bind cobalt. This specificity seemingly comes despite the near identical atomic radii (around 125 pm) of the contiguous Fe, Co, Ni, Cu elemental lineup. In Russell's view, the cobalt (and other transition metals) required at life's emergence were active in deposits of the mineral green rust, also known as fougerite, at alkaline hydrothermal vents. A cobalt corrinoid joined with iron-sulfur clusters forms the heart of the primitive acetyl coenzyme-A pathways of the acetogens and the methanogens lying at the bottom of our evolutionary tree. This Co(FeS) protein mediates the attachment or detachment of a methyl group to or from carbon monoxide or another entity involved in the biosynthesis of acetyl-CoA.
The form of vitamin B12 used by our methylmalonyl-CoA mutase enzyme, located in mitochondria for fatty acid and amino acid breakdown, is known as adenosylcobalamin (AdoCbl). The other cobalamin-utilizing enzyme, methionine synthase, acts in the cytosol and uses a methylcobalamin cofactor, wherein the adenosyl group is replaced by a methyl group. Land plants and fungi neither synthesize nor require cobalamin, as they lack methylmalonyl-CoA mutase and have a different kind of methionine synthase that doesn't require B12. When these enzymes are not working properly, their precursor molecules can presumably build up to high levels, causing problems like demyelinating disease and pernicious anemia.
While cobalt's thermal stability and high energy density make it an ideal component for the cathodes of lithium batteries, its usefulness to life comes from its many other unique properties, some discovered, and some still yet to be found.
Magnetism at the root of enhanced 'green' catalysis
by MagnetoCat SL
Graphical abstract. Credit: DOI: 10.1021/acscatal.1c03135
The research group at MagnetoCat SL (Alicante, Spain) published a fundamental theoretical work on magnetism in heterogeneous catalysis in ACS Catalysis. The group, composed of Ph.D. student Chiara Biz, Dr. Mauro Fianchini and Dr. Jose Gracia, laid out a complex and comprehensive theoretical treatment linking electronic spin, magnetism and heterogeneous catalysis. This treatment concerns the behavior of correlated electrons in solids and the quantum mechanical "tricks" they use to avoid each other while balancing repulsions and attractions.
Relativity and quantum mechanics endow the electron with a quantized intrinsic angular momentum, known as spin, and an associated magnetic moment. When many of these spins cooperate in complex materials, various combinations and domains can form. The macroscopic result of these domains is what we call magnetism.
Magnetism affects the catalytic properties of materials, according to the group at MagnetoCat.
However, the footprint of magnetism in heterogeneous catalysis has been somewhat disregarded by chemists for the longest time, so why look into it now?
The answer is simple: because magnetism may help achieve "greener" and more sustainable chemistry in the years to come. The enhancement that magnetism brings to catalysis has paved the way for improved processes for hydrogen production and water splitting. Moreover, abundant magnetic metals, like iron, cobalt and nickel, may serve as an excellent replacement for heavier, rarer and more expensive metals (like platinum or gold) in catalytic structures. Environment and economy, hand in hand.
The group at MagnetoCat already made computational predictions during 2020 on the superior activity of platinum-metal alloys (where the metal is iron, cobalt or nickel) compared to pure platinum in hydrogen fuel cells, and they are confident that experiments will soon confirm the predictions and pave the way to industrial implementation. The group is now working on the computationally driven catalytic design of magnetic catalysts within the framework of the SPINCAT project.
Catalyst surface analysed at atomic resolution
by Ruhr-Universitaet-Bochum
Fig. 1: OER activity of Co2FeO4 and CoFe2O4 nanoparticles, showing linear sweep voltammetry curves, cyclic voltammograms and Tafel slopes recorded in 1.0 M KOH in the pristine state and after 100, 500 and 1000 cycles. Credit: DOI: 10.1038/s41467-021-27788-2
Researchers from the Ruhr-Universität Bochum, the University of Duisburg-Essen and the Max Planck Institute for Chemical Energy Conversion in Mülheim an der Ruhr cooperated on the project as part of the Collaborative Research Centre "Heterogeneous oxidation catalysis in the liquid phase."
At RUB, a team headed by Weikai Xiang and Professor Tong Li from Atomic-scale Characterisation worked together with the Chair of Electrochemistry and Nanoscale Materials and the Chair of Industrial Chemistry. Institutes in Shanghai, China, and Didcot, UK, were also involved. The team presents their findings in the journal Nature Communications, published online on 10 January 2022.
Particles observed during the catalysis process
The researchers studied two different types of cobalt iron oxide nanoparticles around ten nanometres in size. They analyzed the particles during catalysis of the so-called oxygen evolution reaction. This is a half reaction that occurs during water splitting for hydrogen production: water is split using electrical energy, producing hydrogen and oxygen. The bottleneck in the development of more efficient production processes is the partial reaction in which oxygen is formed, i.e. the oxygen evolution reaction. This reaction alters the catalyst surface, which becomes inactive over time. The structural and compositional changes on the surface play a decisive role in the activity and stability of the electrocatalysts.
For small nanoparticles with a size around ten nanometres, achieving detailed information about what happens on the catalyst surface during the reaction remains a challenge. Using atom probe tomography, the group successfully visualized the distribution of the different types of atoms in the cobalt iron oxide catalysts in three dimensions. By combining it with other methods, they showed how the structure and composition of the surface changed during the catalysis process—and how this change affected the catalytic performance.
"Atom probe tomography has enormous potential to provide atomic insights into the compositional changes on the surface of catalyst nanoparticles during important catalytic reactions such as oxygen evolution reaction for hydrogen production or CO2 reduction," concludes Tong Li.
Smartphone app can vibrate a single drop of blood to determine how well it clots
by University of Washington
Researchers at the University of Washington have developed a new blood-clotting test that uses only a single drop of blood and a smartphone vibration motor and camera. The system includes a plastic attachment that holds a tiny cup of blood beneath the phone's camera (shown here). Note: This photo simulates how this system works, and the "blood" shown here is not real. Credit: Mark Stone/University of Washington
Blood clots form naturally as a way to stop bleeding when someone is injured. But blood clots in patients with medical issues, such as mechanical heart valves or other heart conditions, can lead to a stroke or heart attack. That's why millions of Americans take blood-thinning medications, such as warfarin, that make it harder for their blood to clot.
Warfarin isn't perfect, however, and requires patients to be tested frequently to make sure their blood is in the correct range—blood that clots too easily could still lead to a stroke or a heart attack while blood that doesn't clot can lead to extended bleeding after an injury. To be tested, patients either have to go to a clinic laboratory or use a costly at-home testing system.
Researchers at the University of Washington have developed a new blood-clotting test that uses only a single drop of blood and a smartphone vibration motor and camera. The system includes a plastic attachment that holds a tiny cup beneath the phone's camera.
A person adds a drop of blood to the cup, which contains a small copper particle and a chemical that starts the blood-clotting process. Then the phone's vibration motor shakes the cup while the camera monitors the movement of the particle, which slows down and then stops moving as the clot forms. The researchers showed that this method falls within the accuracy range of the standard instruments of the field.
The team published these findings Feb. 11 in Nature Communications.
"Back in the day, doctors used to manually rock tubes of blood back and forth to monitor how long it took a clot to form. This, however, requires a lot of blood, making it infeasible to use in home settings," said senior author Shyam Gollakota, UW professor in the Paul G. Allen School of Computer Science & Engineering. "The creative leap we make here is that we're showing that by using the vibration motor on a smartphone, our algorithms can do the same thing, except with a single drop of blood. And we get accuracy similar to the best commercially available techniques."
Researchers at the University of Washington have developed a new blood-clotting test that uses only a single drop of blood and a smartphone vibration motor and camera. The system includes a plastic attachment that holds a tiny cup of blood beneath the phone's camera. A person adds blood to the cup, which contains a chemical that starts the blood clotting process and a small copper particle (shown here as the oblong blue shape in the top right of the red circle). Note: This photo simulates how this system works, and the "blood" shown here is not real. Credit: Mark Stone/University of Washington
Doctors can rank blood-clotting ability using two numbers:
the time it takes for the clot to form, what's known as the "prothrombin time" or PT
a ratio calculated from the PT that allows doctors to more easily compare results between different tests or laboratories, called the "international normalized ratio" or INR
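For context, the INR is conventionally derived from the PT by dividing it by a laboratory's mean normal PT and raising the ratio to the International Sensitivity Index (ISI) of the test reagent. The short sketch below illustrates only that standard relationship; the calibration values are assumptions for illustration and are not taken from the UW study.

```python
def inr_from_pt(pt_seconds, mean_normal_pt=12.0, isi=1.0):
    """Standard conversion: INR = (patient PT / mean normal PT) ** ISI.

    mean_normal_pt and isi are lab- and reagent-specific calibration values;
    the defaults here are purely illustrative."""
    return (pt_seconds / mean_normal_pt) ** isi

# Example: a PT of 18 s against a 12 s mean normal PT (ISI = 1.0) gives an INR of 1.5.
print(round(inr_from_pt(18.0), 2))  # 1.5
```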
"Most people taking this medication are taking it for life. But this is not a set-and-forget type of thing—in the U.S., most people are only in what we call the 'desirable range' of PT/INR levels about 64% of the time," said co-author Dr. Kelly Michaelsen, assistant professor of anesthesiology and pain medicine in the UW School of Medicine. "This number is even lower—only about 40% of the time—in countries such as India or Uganda where there is less frequent testing. How can we make this better? We need to make it easier for people to test more frequently and take ownership of their health care."
Patients who can monitor their PT/INR levels from home would only need to go to see a clinician if the test suggested they were outside of that desirable range, Michaelsen said.
The researchers wanted an inexpensive device that could work similarly to how at-home blood sugar monitors work for people with diabetes: A person can prick their finger and test a drop of blood.
"We started by vibrating a single drop of blood and trying to monitor waves on the surface," said lead author Justin Chan, a UW doctoral student in the Allen School. "But that was really challenging with such a small amount of blood."
The team added a small copper particle because its motion was so much more reliable to track.
"As the blood clots, it forms a network that tightens. And in that process, the particle goes from happily bouncing around to no longer moving," Michaelsen said.
Researchers at the University of Washington have developed a new blood-clotting test that uses only a single drop of blood and a smartphone vibration motor and camera. Shown here are lead author Justin Chan (left, holding the device), a UW doctoral student in the Paul G. Allen School of Computer Science & Engineering, and co-author Dr. Kelly Michaelsen (right), assistant professor of anesthesiology and pain medicine in the UW School of Medicine. Credit: Mark Stone/University of Washington
To calculate PT and INR, the phone collects two time stamps: first when the user inserts the blood and second when the particle stops moving.
"For the first time stamp, we're looking for when the user inserts a capillary tube containing the sample in the frame," Chan said. "For the end of the measurement, we look directly at the interior of the cup so that the only movement within those frames is the copper particle. The particle stops moving abruptly because blood clots very quickly, and you can observe that difference between frames. From there we can calculate the PT, and this can be mapped to INR."
The researchers tested this method on three different types of blood samples. As a proof of concept, the team started with plasma, a component of blood that is transparent and therefore easier to test. The researchers tested plasma from 140 anonymized patients at the University of Washington Medical Center. The team also examined plasma from 79 patients with known blood-clotting issues. For both of these sample sets, the test had results that were similar to commercially available tests.
To mimic what a patient at home would experience, the team then tested whole blood from 80 anonymized patients at both Harborview and the University of Washington medical centers. This test also yielded results that were in the accuracy range of commercial tests.
This device is still in a proof-of-concept stage. The researchers have publicly released the code and are exploring commercialization opportunities as well as further testing. For example, currently all these tests have been done in the lab. The next step is to work with patients to test this system at home. The researchers also want to see how the system fares in more resource-limited areas and countries.
"Almost every smartphone from the past decade has a vibration motor and a camera. This means that almost everyone who has a phone can use this. All you need is a simple plastic attachment, no additional electronics of any kind," Gollakota said. "This is the best of all worlds—it's basically the holy grail of PT/INR testing. It makes it frugal and accessible to millions of people, even where resources are very limited."
Additional co-authors on this paper are Joanne Estergreen, clinical laboratory supervisor in the UW School of Medicine's laboratory medicine and pathology department, and Dr. Daniel Sabath, professor of laboratory medicine and pathology in the UW School of Medicine.
Study: Nearly 1 in 7 COVID patients in ICU experienced severe bleeding when given full-dose blood thinners
by Marcene Robinson, University at Buffalo
Credit: Wikimedia Commons
Patients with COVID-19 in the intensive care unit (ICU) prescribed full-dose blood thinners are significantly more likely to experience heavy bleeding than patients prescribed a smaller yet equally effective dose, according to a recent University at Buffalo-led study.
The research, which compared the safety and effectiveness of blood clot treatment strategies for more than 150 critically ill COVID-19 patients at two hospitals, found that almost all patients who experienced significant bleeding were mechanically ventilated and receiving full-dose anticoagulants (blood thinners).
The results, published last month in Hospital Pharmacy, may inform treatment guidelines for blood clots in hospitalized COVID-19 patients, who are at an increased risk for both blood clots and severe bleeding. Previous reports have found that 17% of hospitalized COVID-19 patients experience blood clots, says first author Maya Chilbert, PharmD, clinical assistant professor in the UB School of Pharmacy and Pharmaceutical Sciences.
"A wide variety of practice exists when it comes to approaching blood clots in hospitalized patients with COVID-19, and there is little data to suggest improved outcomes using one strategy versus another," says Chilbert. "Caution should be used in mechanically ventilated patients with COVID-19 when selecting a regimen to treat blood clots, and the decision to use full-dose blood thinners should be based on a compelling indication rather than lab markers alone."
Additional investigators in the UB School of Pharmacy and Pharmaceutical Sciences include Collin Clark, PharmD, clinical assistant professor, and Ashley Woodruff, PharmD, clinical associate professor. The research was also conducted by investigators at the Buffalo General Medical Center, Millard Fillmore Suburban Hospital and Erie County Medical Center.
The study analyzed the outcomes of blood clot treatment and the rate of bleeding events for more than 150 patients with COVID-19 who received one of two blood thinner regimens: a full dose guided by patient levels of D-dimer (a protein fragment present in the blood after a blood clot dissolves), or a smaller but higher-than-standard dose.
The average patient age was 58, and all patients had elevated levels of D-dimer and fibrinogen (a protein that helps the body form blood clots) and a prolonged prothrombin time (a test that measures how long it takes blood plasma to clot).
Nearly 14% of patients who received full-dose blood thinners experienced a significant bleeding event, compared to only 3% of patients who received a higher-than-standard dosage. All patients who experienced bleeding events were mechanically ventilated. No difference was reported in the regimens' effectiveness at treating blood clots.
Further investigation is needed to determine the optimal strategy for treating blood clots and bleeding in hospitalized COVID-19 patients, says Chilbert.
Benefits of high-dose blood thinners in COVID-19 patients remain unclear
by George Washington University
Credit: Unsplash/CC0 Public Domain
While COVID-19 infected patients should be treated with standard anticoagulation therapies, such as blood thinning medication, a new study by researchers at the George Washington University (GW) shows that anticoagulating patients at higher doses, without traditional medical indications to do so, may be ineffective and even harmful. The study was published in the journal Thrombosis Research.
"COVID-19 patients appear to have an increased incidence of blood clots. Many hospitals and health care providers began to use high doses of blood thinners to prevent these clots or treat them preemptively," said Juan Reyes, MD, MPH, co-author of the study and director of hospital medicine at the GW School of Medicine and Health Sciences (SMHS). "We wanted to review the data of our subset of COVID-19 patients treated with blood thinners to determine if the higher-dose medication was helpful."
The research team evaluated 402 patients diagnosed with COVID-19 admitted to GW Hospital between March 15 and May 31, 2020. Clinical outcomes were compared between 152 patients treated with high-dose blood thinners, and 250 patients treated with the standard low-dose blood thinners, typically prescribed to hospitalized medical patients. "Our findings did not demonstrate any additional benefit for those treated with higher doses of blood thinners above and beyond the standard of care," said Shant Ayanian, MD, MS, co-author and assistant professor of medicine at GW SMHS.
"While it's true that COVID-19 patients could be dying of blood clots, the data we've evaluated does not support giving every patient a high dose of blood thinners. That's a challenge, as the benefits still remain unclear," said Lei Lynn, MD, first author of the study and assistant professor of medicine at GW SMHS. "We caution against an aggressive blood thinner regimen for everyone, unless there is clear evidence to do so."
At the beginning of the pandemic, all patients admitted with COVID-19 to the GW Hospital were treated with standard-dose anticoagulation, unless contraindicated. As awareness of the elevated risk of blood clots developed, many providers began treating patients with high-dose blood thinners. At GW Hospital, for non-critically ill patients, medical teams were advised to consider initiating a high dose of anticoagulation especially if a patient's D-dimer level exceeded 3 micrograms per milliliter. The research team previously published a study finding that higher levels of the biomarker D-dimer, a medical indicator found in the blood, are associated with higher odds of clinical deterioration and death from COVID-19. This study is the first of its kind to use D-dimer levels to analyze clinical outcomes of anticoagulation in patients who are not critically ill.
"Though we would have loved to have seen a clinical benefit to our patients from anticoagulation, our research found that higher doses of blood thinners were potentially harmful, with no clear benefits," said Karolyn Teufel, MD, co-author of the study and assistant professor of medicine at GW SMHS. "Our research highlights the challenges with treating COVID-19. So much remains unknown."
"The effect of anticoagulation on clinical outcomes in novel Coronavirus (COVID-19) pneumonia in a U.S. cohort" was published in Thrombosis Research.
Blood thinners as a treatment for COVID-19? What the science says and what it means for you
by Karlheinz Peter, Hannah Stevens and James McFadyen, The Conversation
Credit: Shutterstock
A spate of recent media headlines has described blood thinning medications—which include aspirin and warfarin—as a "breakthrough treatment" for COVID-19 that could "save lives".
It's early days yet but a growing body of research evidence suggests COVID-19 causes abnormalities in blood clotting, which means blood thinning drugs may have a role to play in treatment.
Here's what the research says on this question—and how it applies to you.
Mounting evidence
When COVID-19 first emerged, it was thought the illness was a typical respiratory disease causing symptoms such as fever, sore throat, dry cough, and potentially lung infection (pneumonia) and a build-up of fluid in the lungs making it difficult to breathe.
However, as we outlined in a previous article in The Conversation, 30-70% of COVID-19 patients admitted to intensive care units developed blood clots.
These rates of blood clotting appear to be much higher than what is expected when compared with people who are hospitalised for reasons other than COVID-19.
Blood clots in the veins often form in the legs (deep vein thrombosis) and can dislodge and travel to the lungs (pulmonary embolism); approximately one in four COVID-19 patients admitted to intensive care will develop a pulmonary embolism (where an artery in the lungs becomes blocked).
Arterial blood clots associated with COVID-19 can lead to strokes, including in younger patients, with potentially devastating outcomes.
In addition, COVID-19 appears to cause tiny blood clots that can block small vessels in the lungs. These "micro" blood clots may be a key reason why patients with COVID-19 often have very low oxygen levels.
Blood clots appear to be associated with a higher risk of dying from COVID-19. Likewise, elevated markers of blood clotting are associated with an increased risk of admission to the intensive care unit and a worse prognosis overall.
Should blood thinners be standard treatment for COVID-19 patients in hospital?
Because the rate of blood clotting is so high, all people admitted to hospital with COVID-19 should receive a low dose of blood thinner medication to prevent blood clots. This prophylactic dose of blood thinner is standard across most hospitals in Australia.
However, many blood clots in COVID-19 are occurring despite the use of low-dose blood thinners. As such, it is a question of intense discussion whether people admitted to hospital with severe COVID-19 should receive a higher-than-usual dose of blood thinners to prevent blood clots and improve clinical outcomes.
A recent study from the US suggests patients admitted to hospital and prescribed full dose blood thinners had a better chance of survival and lower chance of needing a ventilator.
However, this finding has to be confirmed before the higher dose can be generally recommended. Fortunately, several research studies are underway in Europe, the UK and elsewhere to test and answer this question definitively.
Several other blood thinner treatments are also being evaluated in people with COVID-19. Aspirin is commonly prescribed to people who are at high risk of strokes or heart attacks. There are now studies underway examining if aspirin can reduce risk of blood clotting in people with COVID-19. In the US, some stronger clot-busting medications are also being trialled in people with severe COVID-19.
It is important to note that blood thinners are not without risk, as this treatment can increase the risk of bleeding. So, without definite evidence to support the benefit of high-dose blood thinners in all hospitalised patients with COVID-19, the decision to use higher doses of blood thinning medication outside of a clinical trial must be made on an individual basis.
Should I take an aspirin to prevent blood clots?
There is no evidence aspirin or other blood thinners should be taken to prevent blood clots in the general population. Also, there is no evidence blood thinners are required to prevent blood clots for people with mild COVID-19 who are isolating at home. Because blood thinners can cause bleeding, they should not be taken unless prescribed by a doctor.
It is important for people who are taking blood thinners for another reason to continue taking these medications as normal, particularly if they are diagnosed with COVID-19.
In summary, our understanding of COVID-19 and how the coronavirus attacks the body continues to rapidly evolve. Researchers from around the world are publishing data almost daily. However, not all of this research has been peer reviewed.
If you develop symptoms, the most important thing you can do is to get tested for COVID-19 and talk to your doctor about potential treatments, including hospital admission and then about blood thinning medication.
Like our colleagues in the UK and the US, we, as doctors specialised in the field of blood clotting, are optimistic that clinical studies currently underway will show that rigorous strategies for the prevention and treatment of blood clotting help to reduce severity and improve survival in patients with COVID-19.
Genomic surveillance helps prepare for rabies outbreaks in bats
by Anna Zarra Aldrich, University of Connecticut
A big brown bat at BatFest in Hannibal, Missouri. Credit: Courtney Celley/USFWS
The COVID-19 pandemic and the significant impact of emerging variants have shown the importance of understanding viruses in as much detail as possible.
Luckily there are many ways scientists can learn more about zoonotic diseases, like genetic sequencing. Data gathered from detailed genetic sequencing can tell us a lot about a virus, like how it spreads and its evolutionary history.
This same lesson can be applied to another public health concern—rabies in bats.
Dong-Hun Lee, assistant professor of infectious disease epidemiology in the College of Agriculture, Health and Natural Resources recently identified the currently circulating rabies viruses in bats submitted to the Connecticut Veterinary Medical Diagnostic Laboratory (CVMDL) and shared his findings in Viruses.
This study is the first to collect genetic sequencing data from bat rabies viruses in the Northeast.
Bats are the most frequently reported rabid wildlife species in the United States. According to the CDC, bats were responsible for 70% of rabies deaths among people infected with the rabies virus in the United States between 1960 and 2018.
The rabies vaccine can prevent rabies. If an unvaccinated person does not receive appropriate medical care after an exposure, human rabies is almost 100% fatal. In 2021, there were five rabies cases in the United States, three of which were fatal.
"Rabies in bats should be a major public health consideration in the United States," says Lee, who is a member of the CVMDL and the Department of Pathobiology and Veterinary Science.
Between 2018 and 2019, Lee and Risatti tested 88 bats brought for rabies testing at the CVMDL by community members and local animal control units. These bats did not bite a human; those cases are handled by the Connecticut Department of Public Health.
Of those 88 bats, six tested positive for rabies. Lee and Risatti then sequenced the complete genomes of the rabies viruses.
Guillermo Risatti, professor of pathobiology and director of the CVMDL, is co-author on the paper.
"We are seeing these bats testing positive for rabies," Risatti says. "The next step is saying 'okay, what type of viruses are they carrying?'"
They identified four unique viruses: three in non-migratory big brown bats and one in a migratory hoary bat. Two samples did not have enough material to sequence.
The viruses in the big brown bats were closest to viruses found in Pennsylvania, New York, and New Jersey. The virus in the hoary bat, however, bore a close genetic relatedness to viruses in Arizona, Washington, Idaho, and Tennessee.
This information emphasizes how migration patterns can move viruses around and potentially allow them to establish in local populations.
The CVMDL is the only laboratory in the state studying rabies in bats in hopes of supporting disease tracking and disease preparedness.
"It's really important to be ready for future outbreaks," Lee says. "For epidemiological investigations, we need baseline data. It would be nice to have a baseline before we have an issue."
Prior to this research, the most recent whole-genome rabies study in bats was from Tennessee in 2004.
These findings will serve as baseline data for a potential future rabies outbreak. Having this level of detailed genetic information will allow scientists to understand how the virus spread and if existing vaccines will be effective.
"In case you have some kind of expansion of a virus, it's good to know what the viruses look like and what gives them some kind of competitive advantage—that's why they can expand in a population," Risatti says. "And at the end of the day, it's to know if the vaccines are still useful."
Risatti and Lee say they intend to continue this genomic surveillance on the CVMDL's bat samples to contribute to the body of genetic information available on rabies viruses. They may also look at other viruses bats could be carrying, such as coronaviruses.
Mayo Clinic Minute: Bats can be a rabies threat
by Jason Howland
Credit: CC0 Public Domain
October is Bat Month and a good time to make sure bats are not roosting in your home. The Humane Society of the United States has tips to evict them so they don't hibernate in your home for the winter.
Bats play an important role in many ecosystems around the world. They are a major predator of night-flying insects, including pests that cost farmers billions of dollars annually. However, bats pose the biggest rabies threat in the U.S., according to the Centers for Disease Control and Prevention. Most bats are not rabid, but because rabies can only be confirmed by laboratory testing, any contact raises concern about possible exposure.
The most dangerous threat of rabies in the U.S. is flying overhead.
"It used to be thought, well, it's a rabid dog. But the more common way of getting rabies is from the silver-haired bat," says Dr. Gregory Poland, director of the Mayo Clinic Vaccine Research Group.
The deadly virus is transmitted from the saliva of infected animals to humans, usually through a bite. However, bats don't always bite.
"Sometimes the saliva will drool onto you, and you could have a minor open cut," says Dr. Poland. "Or sometimes a bat will lick on the skin and, again, transmit the virus that way."
Dr. Poland says that's why if you wake up and find a bat in the room, you should get the rabies vaccine.
"People think: 'Well, the bat's in the house. We woke up with it, doesn't look like it bit anybody.' Doesn't matter. Rabies is such a severe disease with no cure, no treatment for it, that the safer thing to do is give the rabies vaccine," says Dr. Poland.
That includes an immune globulin and a multidose rabies vaccine series, which is not cheap: a typical series of rabies vaccines costs anywhere from $3,000 to $7,000.
Infected bats pose highest rabies risk in US: CDC
Infected bats are the leading cause of rabies deaths in the United States, according to a report released Wednesday by health authorities which found the risk posed by dogs had significantly fallen.
Bats were responsible for 70 percent of rabies deaths in the US between 1960 and 2018, a striking figure because they account for just a third of the 5,000 rabid animals reported each year.
But the overall risk remains very low: of the approximately 59,000 rabies deaths worldwide every year, only two occur in the United States.
"Bats play a critical role in our ecosystem and it is important people know that most of the bats in the US are not rabid," said Emily Pieracci, the lead author of the report for the Centers for Disease Control and Prevention (CDC).
"The problem comes when people try to handle bats they think are healthy because you really can't tell if an animal has rabies just by looking at it.
"The best advice is to avoid contact with bats—and other wildlife—to protect yourself from rabies."
The CDC said people may be underestimating the risk posed by the winged mammals and failing to recognize a bat scratch or bite, which can be smaller than the top of a pencil eraser but can still spread rabies.
On the other hand, people often worry about squirrels and opossums, but these animals generally don't carry the virus.
The report found that domestic dogs present much less of a risk than in the past thanks to the routine use of pet vaccines and the availability of post-exposure prophylaxis (PEP), which combines the rabies vaccine and rabies immune globulin to prevent infection after exposure to the virus.
Each year about 55,000 Americans get PEP treatment after being bitten or scratched by an infected or suspected infected animal.
However, exposure to rabid dogs while overseas was found to be the second-leading cause of rabies in Americans, and the CDC encouraged travelers to research the risks of their destination before traveling and even consider pre-exposure vaccines.
The rabies virus is transmitted through saliva, most commonly from the bite or scratch of an infected animal.
But the virus won't go on to cause the disease if people seek prompt treatment with PEP before the onset of symptoms, which can include increased aggression, fever, excess salivation and partial paralysis.
There is no treatment once signs of the disease begin, and it is fatal in humans and animals within one to two weeks.
Norway lifts final COVID curbs on social distancing
Norway on Saturday lifted its final COVID restrictions, scrapping social distancing and masks in crowded spaces despite a surge in Omicron infections.
"The metre is disappearing. We are taking away the recommendation on social distancing," Prime Minister Jonas Gahr Store told reporters at a press conference.
"Now we can now socialise like we did before, in nightlife, at cultural events and other social occasions. And on the way to and from work on buses, trains and ferries," he said.
Norway lifted most of its other COVID curbs earlier this month, including remote working, crowd size limits and restricted alcohol sales in bars and restaurants.
The requirement to isolate for four days after a positive COVID test was meanwhile on Saturday downgraded to a recommendation, and children with respiratory symptoms are no longer required to get tested for COVID.
Gahr Store stressed however that "the pandemic is not over", and advised unvaccinated people and those in risk groups to continue practising social distancing and wear masks where social distancing is not possible.
The Norwegian Institute of Public Health (FHI) said the country had yet to see the peak of the Omicron surge, but it was expected soon.
The agency's director Camilla Stoltenberg told reporters the number of COVID hospitalisations had risen by 40 percent in the past week.
As of Friday, 986,851 cases and 1,440 virus-related deaths had been recorded in Norway, where more than 91 percent of the population has received at least two doses of the vaccine.
FHI estimates that three to four million people from a population of 5.4 million may be infected by this summer.
Norway scraps most COVID curbs despite Omicron surge
Credit: Pixabay/CC0 Public Domain
Norway on Tuesday announced it would scrap most of its COVID restrictions despite an Omicron-fuelled surge in infections, saying society must "live with" the virus.
The Omicron variant has caused case numbers to soar in Norway, but hospital admissions for severely ill COVID patients have not increased in a population with wide vaccine coverage.
"We have finally reached the point where we can lift lots of the health measures we have had to live with this winter," Prime Minister Jonas Gahr Store told a press conference.
"We are going to have to live with a high level of infections—we can live with a high level of infections," he added.
Norwegians will not have to quarantine if they are a contact of an infected person, although daily testing is recommended for five days, and the isolation period for COVID cases will fall from six to four days.
Remote working will no longer be obligatory and an unlimited number of people can visit other households and attend sporting events.
Travellers entering the Scandinavian nation will no longer have to undergo testing.
Restrictions on alcohol sales in bars and restaurants will also end.
The measures come into effect at 11 pm (2200 GMT) Tuesday.
Masks will remain mandatory in settings like public transport and shops where it is impossible to follow the recommended social distancing guidelines of one metre.
Norway will therefore not go as far as neighbouring Denmark, which on Tuesday became the first European Union country to remove all of its domestic COVID curbs.
More than 781,000 cases and 1,440 virus-related deaths have been recorded in Norway, where almost 91 percent of the population is fully vaccinated.
Norway's public health institute estimates that three to four million people from a population of 5.4 million could have been infected by this summer.
Study shows GABAergic neurons in the hypothalamus trigger automatic defensive attacks in mice
by Ingrid Fadelli, Science X Network, Medical Xpress
When confronted with extreme threats, humans and animals sometimes defend themselves by fighting back. Xie and colleagues report that this important survival behavior is controlled by GABAergic neurons in the anterior hypothalamic nucleus of mice. The accompanying image depicts a drunken hero bravely fighting a ferocious tiger. Credit: Xie et al.
In his most renowned work, "On the Origin of Species," Charles Darwin introduced the idea that species must continually struggle for their existence and that only those best fitted to a given environment survive. This notion, commonly known as "survival of the fittest," has since been discussed and explored by countless scientists worldwide.
Although Darwin suggested that only the fittest species survive, he did not explain how this struggle for survival is reflected in the brains of humans and other animals. In recent years, many studies rooted in evolution, ethology and neuroscience have tried to answer this rather elusive question.
Researchers at the National Institute of Biological Sciences in Beijing, Beijing Normal University and other institutes in China have recently carried out a study investigating the neural underpinnings of innate defensive behaviors in mice. These are aggressive behaviors that animals can automatically adopt when responding to threatening stimuli in their environment.
The team's recent paper, published in Nature Neuroscience, suggests that mechanically evoked defensive behaviors are at least in part controlled by GABAergic neurons in the anterior hypothalamic nucleus (AHN), a central region within the frontal part of the hypothalamus. The anterior hypothalamus is a vital brain region known to be associated with self-regulatory bodily functions, including the regulation of the body's internal temperature and sleep.
"One goal of our laboratory is to elucidate how the brain initiates diverse behaviors for prey-predator competition, an important form of 'struggle for existence,'" Peng Cao, one of the researchers who carried out the study, told Medical Xpress. "In our previous studies, we systematically explored the brain circuits that underlie predator avoidance and prey capture in mice. In our new work we focused on anti-predator defensive attacks."
As part of their recent study, Cao and his colleagues carried out a series of experiments on mice. In these experiments, they triggered defensive behaviors in the mice using experimental stimuli and then tried to determine the neural underpinnings of these behaviors. They found that GABAergic neurons in the AHN mediated the mice's experimentally evoked defensive attacks.
"The brain area that may control mechanically evoked defensive attack should fit three fundamental criteria," Cao explained. "Firstly, the inhibition of neurons in this brain area should suppress mechanically evoked defensive attacks. Second, the neurons in this brain area should encode mechanical force and optimally respond to noxious mechanical stimuli. Third, the activation of these neurons should evoke attack behavior and suppress other types of ongoing behaviors."
In their experiments, Cao and his colleagues were able to show that GABAergic neurons in the AHN meet all three of these criteria. Their study thus identifies the AHN as a key brain center for defensive attack behaviors in mice.
"We found that photoinhibition of vGAT+ AHN neurons abrogated mechanically evoked defensive attack," Cao said. "Then, using fiber photometry, we found that vGAT+ AHN neurons specifically respond to noxious mechanical stimuli. Finally, using single-unit recording, we showed that vGAT+ AHN neurons encode the strength of mechanical force. We found that photostimulation of vGAT+ AHN neurons evoked attack behavior toward a live predator and suppressed other ongoing behaviors."
The findings gathered by this team of researchers shed some new light on the neural underpinnings of survival-related defensive behaviors in mice. Future studies could try to determine whether GABAergic neurons in the AHN are responsible for these same behaviors in other animal species, including humans.
As many violent crimes in human societies arise in response to aggression or danger, this study could ultimately have broad, far-reaching implications. For instance, it could pave the way toward a better understanding of how the human brain initiates violent acts in response to real or perceived threats.
"We now plan to expand our research on defensive attack," Cao added. "For instance, we will explore how the amygdala, the center of fear, may be involved in mechanically evoked defensive attack, which will then allow us to compare the involvement of the AHN and amygdala in defensive attack."
Children with autism have difficulty sensing and comprehending internal bodily signals
by Zhang Nannan, Chinese Academy of Sciences
Dr. Raymond Chan's team from the Institute of Psychology of the Chinese Academy of Sciences has recently developed an innovative Eye-tracking Interoceptive Accuracy Task (EIAT) that can easily and specifically assess interoceptive accuracy in children with autistic spectrum disorders (ASD). The results were published in Autism Research on January 27.
Interoception refers to the awareness and integration of internal bodily signals such as heartbeat and breathing. Empirical findings suggest that interoception is linked to understanding one's own and others' emotional states and to learning. Interoception is therefore important for maintaining physiological equilibrium and for optimal functioning in daily life.
Children with ASD, including those with high levels of autistic traits, are thought to show impaired interoceptive accuracy. However, the existing literature offers mixed findings that should be interpreted cautiously given the different methodologies used.
In this study, the researchers administered the EIAT to 30 children with ASD, 20 children with comorbid ASD and attention-deficit/hyperactivity disorder (ADHD), and 63 typically developing children with high or low levels of autistic traits. They also collected subjective measures from the children's parents.
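The article does not spell out how the EIAT is scored, so the following Python sketch is purely hypothetical: it assumes each trial presents one stimulus pulsing in sync with the child's own heartbeat and one out of sync, and scores interoceptive accuracy as the share of trials on which gaze dwelt longer on the synchronous stimulus.

```python
# Hypothetical scoring sketch for an eye-tracking interoception task;
# the real EIAT scoring rule is not described in the article.

def interoceptive_accuracy(trials):
    """trials: list of (dwell_ms_on_synchronous, dwell_ms_on_asynchronous)."""
    correct = sum(1 for sync_ms, other_ms in trials if sync_ms > other_ms)
    return correct / len(trials)

# Toy usage: four trials, gaze preferred the heartbeat-synchronous stimulus in three.
trials = [(1200, 800), (600, 900), (1500, 400), (700, 650)]
print(f"interoceptive accuracy: {interoceptive_accuracy(trials):.2f}")  # 0.75
```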
According to the researchers, children with ASD, with or without comorbid ADHD, exhibited lower interoceptive accuracy than typically developing children.
They also found that typically developing children with high levels of autistic traits showed reduced interoceptive accuracy compared with typically developing children with low levels of autistic traits. Interoceptive accuracy correlated negatively with ASD and ADHD symptoms. More importantly, the team also observed atypical cardiac interoception in children with ASD.
Taken together, these findings highlight that difficulties in sensing and comprehending internal bodily signals in childhood may be related to both ASD and ADHD symptoms, with important implications for understanding the altered sensory processing observed in children with ASD and ADHD.