In an ischemic stroke, an artery in the brain is blocked by a blood clot, cutting off the blood supply to brain cells. Doctors must therefore act quickly and reopen the artery with the help of catheters. During this so-called mechanical thrombectomy, a great deal of data has to be recorded and then transferred to various registries. Dr. Nils Lehnen, senior physician at the Clinic for Diagnostic and Interventional Neuroradiology and Paediatric Neuroradiology at the University Hospital Bonn (UKB), has now shown in a study that ChatGPT could be a great help in this data transfer. The results have been published in the journal Radiology.
When did the patient arrive? When was a CT scan performed? When was the first puncture? When was blood flow restored? During mechanical thrombectomy, a range of data must be recorded in the patient report and then manually transferred to various registries for clinical outcome tracking and for prospective studies.
“This is a labor-intensive task that is also prone to transcription errors,” says Dr Nils Lehnen, who also conducts research at the University of Bonn. “We therefore asked ourselves whether an AI such as ChatGPT could perform this transfer faster and possibly even more reliably.”
In radiology, ChatGPT is already being tested in various procedures – for example, in the simplification of reports or in answering patient questions on breast cancer screening. However, whether ChatGPT can correctly extract data from free-text reports of a mechanical thrombectomy for a database and simultaneously generate clinical data was previously unexplored and was the research objective of this new study.
Dr. Lehnen’s research group first created a German-language prompt for ChatGPT and tested it on 20 reports in order to identify errors and refine the prompt accordingly. After this revision, data extraction using ChatGPT was tested on 100 internal reports from the UKB. For comparison, an experienced neuroradiologist compiled the same data without seeing the ChatGPT output. The researchers then compared the results and found that ChatGPT had correctly extracted 94 per cent of data entries with no post-processing required. Only the ChatGPT data entries that exactly matched those of the expert were considered correct; any deviations, such as additional symbols, punctuation marks, or synonyms, were categorized as incorrect.
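The strict scoring rule described above, under which an entry counts as correct only if it matches the expert's value character for character, can be sketched in a few lines of Python. The field names and values below are purely illustrative, not taken from the study:

```python
# Illustrative sketch of the study's strict scoring criterion: an extracted
# entry is correct only if it matches the expert's value exactly; synonyms,
# extra punctuation, or reformatted values all count as errors.
# Field names and values are hypothetical, not from the study.

def exact_match_accuracy(extracted: dict, reference: dict) -> float:
    """Fraction of reference fields whose extracted value matches exactly."""
    if not reference:
        return 0.0
    correct = sum(extracted.get(field) == value
                  for field, value in reference.items())
    return correct / len(reference)

expert = {"arrival_time": "08:15", "ct_time": "08:40", "first_puncture": "09:05"}
model  = {"arrival_time": "08:15", "ct_time": "08:40", "first_puncture": "9:05 am"}

# "9:05 am" means the same as "09:05" but fails the exact-match rule,
# so only 2 of 3 fields count as correct here.
print(f"{exact_match_accuracy(model, expert):.2%}")
```

This deliberately conservative criterion means the reported 94 per cent is, if anything, an underestimate of clinically usable extractions.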
To validate these results, the researchers tested a further 30 external reports with the same prompt. ChatGPT achieved 90 per cent correct data entries.
“This suggests that ChatGPT could be an alternative to manually retrieving this data. However, the reports and the prompt were only created by us in German, so the results of our study may need to be confirmed for other languages. In addition, we still observed poor results for certain data points, which shows that human supervision is still needed. However, we expect that further optimization of the prompt will further improve the results and that ChatGPT can make work easier in this area in the future.”
Dr. Nils Lehnen, senior physician at the Clinic for Diagnostic and Interventional Neuroradiology and Paediatric Neuroradiology at the University Hospital Bonn (UKB)
Lehnen, N. C., et al. (2024). Data Extraction from Free-Text Reports on Mechanical Thrombectomy in Acute Ischemic Stroke Using ChatGPT: A Retrospective Analysis. Radiology. doi.org/10.1148/radiol.232741.
Liver inflammation, a common side-effect of cancers elsewhere in the body, has long been associated with worse cancer outcomes and more recently associated with poor response to immunotherapy. Now, a team led by researchers from the Abramson Cancer Center and Perelman School of Medicine at the University of Pennsylvania has found a big reason why.
In their study, published today in Nature Immunology, the researchers discovered that cancer-induced liver inflammation causes liver cells to secrete proteins called serum amyloid A (SAA) proteins, which circulate through the body and hinder the ability of T cells, major anticancer weapons of the immune system, to infiltrate and attack tumors elsewhere.
“We want to better understand what causes cancer to resist or respond to immunotherapy to help design more effective strategies for patients. Our findings show that liver cells, with their release of SAA proteins, effectively serve as an immune checkpoint regulating anti-cancer immunity, making them a promising therapeutic target.”
Gregory Beatty, MD, PhD, senior author, associate professor of Hematology-Oncology and the director of Clinical and Translational Research for the Penn Pancreatic Cancer Research Center
The study builds on previous research from the team, including co-lead authors Meredith Stone, PhD, a research associate, and Jesse Lee, a graduate student, into liver inflammation in cancer: In a 2019 study, they showed how it promotes pancreatic tumor metastasis to that organ. In 2021, researchers from the Beatty Laboratory observed that systemic inflammation, involving many of the same molecules implicated in liver metastasis, is associated with worse responses to immunotherapies in pancreatic cancer patients. The latest study was designed to investigate in more detail how liver inflammation may block the effects of these immune-boosting therapies.
First, they looked at mouse models of pancreatic cancer, measuring the amount of T-cell infiltration in pancreatic tumors, a basic indicator of anti-tumor immune activity. They found that mice with less T cell infiltration in their tumors tended to have more liver inflammation. These mice also showed stronger signs of an inflammatory signaling pathway called the IL-6/JAK/STAT3 pathway, the same one the team had implicated in liver metastasis in their 2019 study.
The researchers next showed that STAT3 activation in liver cells is associated with reduced production of immune cells called dendritic cells, which are critical for normal T cell responses. When the scientists deleted STAT3 from liver cells, dendritic cell production and T cell activity picked up, and tumors that previously had only low T cell infiltration developed high T cell infiltration.
Ultimately, the team found that STAT3 activation in liver cells exerts its dendritic cell- and T cell-suppressing effects by inducing the production of SAA proteins, which target receptors on immune cells. Deleting the SAA proteins had the same immune-restoring effect as deleting STAT3 and increased survival times and the likelihood of cures in mice that had pancreatic tumors surgically removed.
To get a sense of whether the mouse model findings would translate to humans, the researchers measured SAA levels in tissue samples from patients whose pancreatic tumors had been surgically removed and found that those with low SAA levels at surgery went on to have significantly longer survival times.
“The translational findings in human patients highlight the likely clinical relevance of our discoveries in the mice,” Beatty said. “Now that we’ve shown how liver inflammation puts up a roadblock to immunotherapy, our next step is to see if the same pathway can be targeted to reverse inflammation in patients who already have liver metastasis.”
The research team is now working to set up further preclinical and eventually clinical studies of STAT3- and/or SAA-inhibiting agents as potential add-on therapies in combination with immunotherapy (for example, prior to surgery) that could improve cancer patient outcomes.
Support for the research was provided by the National Institutes of Health (T32 CA009140, T32-HL007439-41, K12-CA076931-21, R01-CA197916, R01CA245323, U01 CA224193 and U01 CA224175), the Damon Runyon Cancer Research Foundation, the PacMen Consortium, the US Department of Defense (W81XWH2110622, W81XWH2110621), Stand Up to Cancer, the Robert L. Fine Cancer Research Foundation, the Penn-Wistar SPORE in Skin Cancer, AACR-The Mark Foundation for Cancer Research, and the Pancreatic Cancer Action Network.
Stone, M. L., et al. (2024). Hepatocytes coordinate immune evasion in cancer via release of serum amyloid A proteins. Nature Immunology. doi.org/10.1038/s41590-024-01820-1.
Blessed thistle (Cnicus benedictus) is a plant in the family Asteraceae that also grows in Central Europe. For centuries, it has been used as a medicinal herb, taken as an extract or tea, e.g. to aid the digestive system. Researchers at the Center for Pharmacology of University Hospital Cologne and at the Faculty of Medicine of the University of Cologne, led by Dr. Philipp Gobrecht and Professor Dr. Dietmar Fischer, have now found a completely novel use for Cnicin. In animal models as well as in human cells, Cnicin significantly accelerated the growth of axons (nerve fibres). The study ‘Cnicin promotes functional nerve regeneration’ was published in Phytomedicine.
Rapid help for nerves
In humans and animals, injured nerves with long axons must regenerate over correspondingly long distances. This often makes the healing process lengthy, and the damage frequently becomes irreversible because the axons cannot reach their destination in time. An accelerated regeneration rate can therefore make a big difference, ensuring that the fibers reach their original destination before irreparable functional deficits occur. The researchers demonstrated axon regeneration in animal models and in human cells taken from retinae donated by patients. A daily dose of Cnicin given to mice or rats improved paralysis and neuropathy much more quickly.
Compared to other compounds, Cnicin has one crucial advantage: it can be introduced into the bloodstream orally (by mouth). It does not have to be given by injection.
“The correct dose is very important here, as Cnicin only works within a specific therapeutic window. Doses that are too low or too high are ineffective. This is why further clinical studies on humans are crucial.”
Professor Dr. Dietmar Fischer
The University of Cologne researchers are currently planning relevant studies. The Center for Pharmacology is researching and developing drugs to repair the damaged nervous system.
The current study received funding of around 1,200,000 euros from the Federal Ministry of Education and Research within the framework of the project PARREGERON.
In a recent Danish population-based cohort study published in the British Medical Journal, researchers analyzed the changes in lifetime risks of atrial fibrillation (AF) and complications. They compared the data between two periods, 2000-2010 and 2011-2022. They found that the lifetime risk of AF increased over the study period, and individuals with AF showed significant risks of heart failure (HF) and stroke over their lifetime.
AF poses a growing health concern globally, with a substantial projected increase in affected populations. While improvements in mortality rates have been observed, AF remains linked to increased risks of stroke, HF, and myocardial infarction (MI). Understanding and effectively assessing AF risk, including its long-term complications, are crucial for prevention efforts. Residual lifetime risk, a measure capturing cumulative disease risk over the remaining lifespan, offers valuable insights for public health initiatives and patient education. Despite previous studies on AF lifetime risk, data on temporal trends and comprehensive complication risks are lacking. Monitoring changes in AF burden is vital for evaluating management strategies and prevention efforts, especially amid evolving stroke prevention therapies. In the present investigation involving the Danish population, researchers aimed to assess the lifetime risk of AF and its associated complications and to analyze their temporal trends spanning from 2000 to 2022.
About the study
Data were gathered from national registries, including the Danish National Patient Registry for hospital stays and outpatient contacts, the Civil Registration System for demographics and vital status, and the Danish National Prescription Registry for medication information. The study included 3,574,903 Danish individuals without AF at or after the age of 45 between 2000 and 2022. About 51.7% of the participants were women. Those aged 95 years or older were excluded. Follow-up ended at incident AF, death, age 95 years, emigration, or period end. Primary analysis used 45 years as the index age, with secondary analyses for ages 55, 65, and 75 or older. Incident AF was identified from hospital diagnoses.
A total of 362,721 individuals (46.4% women) were followed up after a new diagnosis of AF. Complications, including HF, stroke, MI, or systemic embolism, were recorded post-diagnosis. Individuals with pre-existing complications or events within seven days of diagnosis were excluded. Diagnoses followed strict International Classification of Diseases, 10th revision (ICD-10) criteria with high predictive values. Analyses were conducted for index ages 45, 55, 65, and 75 years.
Study populations were characterized by assessing medical history along with family income and educational attainment. The statistical methods included the use of the Aalen-Johansen estimator for cumulative incidence, pseudo-value regression, propensity score adjustment using logistic regression, stabilized inverse propensity weighting, and subgroup analyses with interaction testing.
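The Aalen-Johansen estimator used in the study accounts for competing risks (for example, death before AF), which a naive Kaplan-Meier analysis would overstate. A minimal pure-Python sketch of the competing-risks cumulative incidence it computes is shown below, using invented toy data rather than the study's registry data (production analyses would use a library such as R's survival package or Python's lifelines):

```python
def cumulative_incidence(times, events):
    """
    Minimal Aalen-Johansen sketch: cumulative incidence of one event type
    in the presence of a competing risk.
    events: 1 = event of interest (e.g. incident AF),
            2 = competing event (e.g. death before AF),
            0 = censored.
    Returns [(time, cumulative_incidence)] steps for the event of interest.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0   # overall event-free survival just before each time point
    cif = 0.0
    steps = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d_event = d_compete = censored = 0
        # Group all observations that share this time point.
        while i < len(data) and data[i][0] == t:
            kind = data[i][1]
            if kind == 1:
                d_event += 1
            elif kind == 2:
                d_compete += 1
            else:
                censored += 1
            i += 1
        # Increment: probability of surviving to t, then having the event.
        cif += surv * d_event / n_at_risk
        surv *= 1 - (d_event + d_compete) / n_at_risk
        n_at_risk -= d_event + d_compete + censored
        steps.append((t, cif))
    return steps

# Toy data: AF at t=1 and t=3, death before AF at t=2, censored at t=4.
print(cumulative_incidence([1, 2, 3, 4], [1, 2, 1, 0]))
```

The competing event at t=2 reduces the number still at risk without inflating the AF incidence, which is exactly what distinguishes this estimator from treating deaths as censorings.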
Results and discussion
Age distributions remained consistent across the two periods. The prevalence of hypertension, dyslipidemia, and diabetes rose over time, whereas the prevalence of HF and MI declined.
The lifetime risk of AF at age 45 over 2000-2022 was 27.7%, with higher risk observed among men, those with a history of certain cardiovascular conditions, and individuals with higher socioeconomic status. From 2000-2010 to 2011-2022, the lifetime risk increased in absolute terms from 24.2% to 30.9%. This trend persisted across all subgroups, with slightly larger increases among men, individuals with a history of HF or stroke, and those without dyslipidemia. At index ages 55, 65, and 75, the lifetime risk also showed an upward trajectory, with absolute increases between the two periods.
Among individuals diagnosed with AF, HF was the most common complication, with a lifetime risk of 41.2%, followed by stroke (21.4%), MI (11.5%), and diagnosed systemic embolism (1.8%). Men generally faced higher risks of HF and MI compared to women, while women had a higher risk of stroke post-AF. History of certain cardiovascular conditions significantly increased the risk of HF post-AF. Over time, a slight decrease in the lifetime risks of stroke (-2.5%) and MI (-3.9%) was observed.
The study reports the temporal patterns in lifetime risks associated with AF and its subsequent complications for the first time. However, it is limited by potential underestimation of incident events due to a lack of differentiation between AF and atrial flutter, and by the absence of data on lifestyle factors and causes of death, among other factors.
Conclusion
The present Denmark-wide study reveals a concerning trend: the lifetime risk of AF has increased from one in four to one in three over the past two decades. HF emerged as the most common complication following AF, with a lifetime risk twice that of stroke. While there were slight improvements in the lifetime risks of stroke, ischemic stroke, and MI after AF, the rates remained high. These findings highlight the urgent need for effective strategies to prevent HF and stroke in patients with AF.
Journal reference:
Vinter, N., et al. (2024). Temporal trends in lifetime risks of atrial fibrillation and its complications between 2000 and 2022: Danish, nationwide, population-based cohort study. BMJ, 385:e077209. doi.org/10.1136/bmj-2023-077209. https://www.bmj.com/content/385/bmj-2023-077209
In a recent study published in Nature Medicine, researchers developed a deep-learning approach for tumor origin differentiation using cytological histology (TORCH), which recognizes malignancy and predicts tumor origin in hydrothorax and ascites; the model was built on cytological images from 57,220 patients.
Cancers of unknown primary (CUP) are malignant diseases diagnosed by histopathology as metastases but whose primary site cannot be determined using standard diagnostic methods.
These diseases frequently present as serous effusions and have a dismal prognosis despite combination chemotherapies. Immunohistochemistry is used to predict the most likely origin of CUP; however, even immunostaining cocktails resolve only a minority of cases. Accurate identification of primary sites is critical for successful and tailored therapy.
About the study
In the present study, researchers presented TORCH, a deep learning algorithm that identifies cancer origin from cytological images of ascites and hydrothorax.
The researchers trained the model using four independent deep neural networks, combined to produce 12 different models. Using cytological images, they aimed to develop an artificial intelligence-based diagnostic model for predicting tumor origin in individuals with malignancy and ascites or hydrothorax metastases.
They tested and validated the AI system’s performance using cytological smear cases from multiple independent testing sets.
From June 2010 to October 2023, the researchers collected 90,572 cytological smear images from 76,183 cancer patients across four major institutions (Zhengzhou University First Hospital, Tianjin Medical University Cancer Institute and Hospital, Yantai Yuhuangding Hospital, and Suzhou University First Hospital) as training data.
Respiratory cancers represented the largest share (30%; 17,058 patients) of the malignant groups.
Carcinomas accounted for 57% of ascites and hydrothorax cases, with adenocarcinoma the most common group (47%; 27,006 patients). Only 0.6% of squamous cell carcinomas metastasized to ascites or pleural effusion (n=346).
To test the generalizability and reliability of TORCH, the researchers included 4,520 consecutive patients from Tianjin Cancer Hospital (the Tianjin-P dataset) and 12,467 from Yantai Hospital (the Yantai dataset).
They randomly selected 496 cytology smear images from the three internal testing sets to investigate whether TORCH might help junior pathologists improve their performance.
They compared the junior pathologists’ TORCH-assisted performance with prior manual interpretation outcomes for both junior and senior pathologists.
Researchers used attention heatmaps to interpret the AI model for cancer detection in 42,682 cytological smear images from patients at three major tertiary referral hospitals. The model was also evaluated in real-world scenarios using external testing datasets, which included 495 images.
The study further aimed to enhance junior pathologists’ diagnostic abilities using TORCH. Ablation tests assessed the benefit of including clinical characteristics in tumor origin prediction and investigated the association between clinical factors and cytological images.
Results
The TORCH model, a novel technique for predicting tumor origins in cancer diagnosis and localization, has been evaluated on various datasets.
The findings revealed that TORCH achieved an overall micro-averaged one-versus-rest area under the receiver operating characteristic curve (AUROC) of 0.97, with a top-1 accuracy of 83% and a top-3 accuracy of 99%. TORCH outperformed pathologists and notably increased junior pathologists’ diagnostic scores.
Patients with cancers of unknown primary whose first treatment approach was consistent with TORCH-estimated origins had a higher overall survival rate than those who received discordant therapy. The model demonstrated relatively dependable generalization and compatibility.
When coupled with five testing sets, TORCH had a top-1 accuracy of 83%, a top-2 accuracy of 96%, and a top-3 accuracy of 99%. It also produced similar micro-averaged one-versus-rest AUROC ratings in the low-certainty and high-certainty groups.
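Top-k accuracy, as reported for TORCH above, counts a case as correct when the true tumor origin appears among the model's k highest-scoring candidates. A minimal sketch of the metric, with invented scores and labels rather than TORCH outputs:

```python
# Toy sketch of top-k accuracy. Scores and labels below are invented for
# illustration; they are not outputs of the TORCH model.

def top_k_accuracy(scores, labels, k):
    """Fraction of cases whose true label is among the top-k predictions."""
    correct = 0
    for case_scores, true_label in zip(scores, labels):
        # Rank candidate origins from highest to lowest score.
        ranked = sorted(range(len(case_scores)),
                        key=lambda i: case_scores[i], reverse=True)
        correct += true_label in ranked[:k]
    return correct / len(labels)

# Three cases, three hypothetical candidate origins (0, 1, 2).
scores = [[0.6, 0.3, 0.1],
          [0.2, 0.5, 0.3],
          [0.1, 0.2, 0.7]]
labels = [0, 2, 2]

print(top_k_accuracy(scores, labels, 1))  # the second case's top guess is wrong
print(top_k_accuracy(scores, labels, 2))  # but its label is in the top 2
```

This is why top-3 accuracy (99%) exceeds top-1 accuracy (83%): a case that misses on the single best guess can still be recovered when a short ranked shortlist is considered.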
The survival analysis included 391 cancer patients, of whom 276 received TORCH-concordant first treatment and 115 received discordant treatment. By the end of follow-up, 42% of the patients had died: 37% of concordant patients and 53% of discordant ones. Survival analysis revealed that concordant patients had considerably longer overall survival than discordant ones.
Poor smear preparation and image quality issues such as section folding, contaminants, or overstaining may contribute to AI overdiagnosis in pancreatic cancer. Researchers can address these flaws by meticulous manual processing throughout the data-screening step.
In the case of colonic cancer, mucus took up the majority of the image area, which may have caused the AI model to overlook critical features when reaching a diagnosis.
Conclusion
Based on the study findings, the TORCH model, an AI tool, has shown promise in clinical practice for predicting the primary system origin of malignant cells in hydrothorax and ascites.
It can distinguish between malignant tumors and benign illnesses, pinpoint cancer sources, and help in clinical decision-making in patients with cancers of unknown origin. The model performed well across five testing sets and outperformed four pathologists.
It can assist oncologists in selecting therapy for individuals with CUP (primarily adenocarcinoma), who are otherwise treated with empirical broad-spectrum chemotherapy regimens.
In a recent study published in the journal Science Translational Medicine, researchers investigated the impact of aging on immune response, viral dynamics, and nasal microbiome in 1031 hospitalized coronavirus disease 2019 (COVID-19) patients, using advanced profiling techniques to understand age-related differences in disease severity and immune function.
Age is a significant risk factor for severe COVID-19 outcomes, with older adults facing drastically higher risks of complications and mortality than younger individuals. Despite high vaccination rates, older adults are still profoundly vulnerable. Aging correlates with elevated levels of inflammatory cytokines, like interleukin-6 (IL-6), which are critical markers of COVID-19 severity, hinting at a link between aging and disease pathophysiology. Studies show that aging dampens both innate and adaptive immune responses, including reduced type I interferon (IFN) production. Additionally, older adults show enhanced inflammatory responses and impaired immune signaling when infected with Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2). Further research is needed to fully understand the complex interactions between aging, immune response variations, and COVID-19 severity to improve treatment strategies and outcomes for older populations.
About the study
The present study utilized data from 1,031 participants enrolled in the IMmunoPhenotyping Assessment in a COVID-19 Cohort (IMPACC) observational cohort, which involved 20 hospitals across 15 medical centers in the United States from May 5, 2020, to March 19, 2021. It involved hospitalized individuals with reverse transcription polymerase chain reaction (RT-PCR)-confirmed SARS-CoV-2 infection displaying typical COVID-19 symptoms. Blood and respiratory tract samples were collected within 72 hours of hospitalization, following a standardized protocol across participating institutions. Ethical approval was granted under the public health surveillance exception, with participant consent for follow-up involvement and data usage.
Statistical analysis was performed using R software. Initial assessments were done within 72 hours of hospital admission, followed by longitudinal evaluations at subsequent visits. Data analysis applied various statistical methods depending on the data type and required adjustments for factors like age, sex, and baseline disease severity. For longitudinal studies, age groups were divided into quintiles and analyzed for changes in viral abundance and immune response, employing linear and generalized additive models to account for the observed non-linear patterns. All p-values were adjusted using the Benjamini-Hochberg method, considering results statistically significant at p < 0.05.
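The Benjamini-Hochberg adjustment applied above controls the false discovery rate when many hypotheses are tested simultaneously. A minimal sketch of the step-up procedure, using toy p-values for illustration (real analyses would typically use R's `p.adjust` or Python's `statsmodels`):

```python
def benjamini_hochberg(pvals):
    """Step-up Benjamini-Hochberg FDR adjustment.
    Returns adjusted p-values in the original order."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # indices by ascending p
    adjusted = [0.0] * m
    running_min = 1.0
    # Walk from the largest p-value down, enforcing monotonicity:
    # adjusted p at rank r is min over ranks >= r of p * m / rank.
    for rank in range(m, 0, -1):
        idx = order[rank - 1]
        running_min = min(running_min, pvals[idx] * m / rank)
        adjusted[idx] = running_min
    return adjusted

raw = [0.01, 0.04, 0.03, 0.005]  # toy p-values from four hypothetical tests
print(benjamini_hochberg(raw))
```

Comparing each adjusted value against the 0.05 threshold, rather than the raw p-values, keeps the expected proportion of false discoveries below 5% across the whole battery of tests.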
Study results
The study involved analyzing blood and nasal swab specimens from 1,031 vaccine-naïve adults hospitalized with COVID-19. These participants were part of the IMPACC cohort, sourced from 20 hospitals across the United States. They were categorized into five age quintiles, ranging from 18 to 96 years, with each group comprising between 187 and 223 individuals. Samples were collected at the time of hospital admission and during up to five follow-up visits. The distribution of ages showed that older individuals were often more severely affected by the disease, evident in both the initial severity of symptoms and the outcomes, including mortality.
At the initial hospital visit, typically within 72 hours of admission, a range of diagnostic assays was conducted. These included transcriptional profiling of peripheral blood mononuclear cells (PBMCs) and nasal swabs, serum inflammatory protein profiling, whole blood mass cytometry (CyTOF), nasal metatranscriptomics, and SARS-CoV-2 antibody (Ab) assays. A significant finding from these initial tests was that older adults displayed higher viral loads and experienced delayed viral clearance compared to younger patients. Moreover, age-related differences in immune cell populations were noted, with older adults showing higher proportions of various monocyte subtypes and activated T cells but lower levels of naïve T and B cells.
The study’s longitudinal analysis revealed that these differences persisted over time, affecting viral load dynamics, antibody titers, and immune response. Specifically, the eldest participants not only retained high levels of the virus longer but also showed more significant fluctuations in antibody levels over time. Additionally, immune cell analysis by CyTOF highlighted that with advancing age, certain immune cell types, including different monocyte classes and differentiated natural killer cells, increased, suggesting shifts in immune system composition and function with age.
Changes in cytokine and chemokine levels measured in the participants’ serum further underscored the impact of aging on the immune response. Older individuals showed elevated levels of inflammatory markers at hospital admission, which were linked to more severe disease outcomes.
Moreover, the analysis extended to the nasal microbiome and upper respiratory gene expression, revealing age-associated changes in the microbial composition and host gene activity. Changes in Toll-like receptor signaling and other immune pathways were evident, suggesting that older adults experience different immune modulations, possibly influencing their susceptibility to severe outcomes.
A team of Montana researchers is playing a key role in the development of a more effective vaccine against tuberculosis, an infectious disease that has killed more people than any other.
One effort is underway at the University of Montana Center for Translational Medicine. The center specializes in improving and creating vaccines by adding what are called novel adjuvants. An adjuvant is a substance included in the vaccine, such as fat molecules or aluminum salts, that enhances the immune response, and novel adjuvants are those that have not yet been used in humans. Scientists are finding that adjuvants produce stronger, more precise, and more durable immunity than antigens, which trigger antibody production, would achieve alone.
Eliciting specific responses from the immune system and deepening and broadening the response with adjuvants is known as precision vaccination. “It’s not one-size-fits-all,” said Ofer Levy, a professor of pediatrics at Harvard University and the head of the Precision Vaccines Program at Boston Children’s Hospital. “A vaccine might work differently in a newborn versus an older adult and a middle-aged person.”
The ultimate precision vaccine, said Levy, would be lifelong protection from a disease with one jab. “A single-shot protection against influenza or a single-shot protection against covid, that would be the holy grail,” Levy said.
Jay Evans, the director of the University of Montana center and the chief scientific and strategy officer and a co-founder of Inimmune, a privately held biotechnology company in Missoula, said his team has been working on a TB vaccine for 15 years. The public-private partnership is developing new vaccines and trying to improve existing ones, and he said it will be at least five years before the TB vaccine might be distributed widely.
It has not gone unnoticed at the center that this state-of-the-art vaccine research and production is located in a state that passed one of the nation’s most extreme anti-vaccination laws during the pandemic in 2021. The law prohibits businesses and governments from discriminating against people who aren’t vaccinated against covid-19 or other diseases, effectively banning both public and private employers from requiring workers to get vaccinated against covid or any other disease. A federal judge later ruled that the law cannot be enforced in health care settings, such as hospitals and doctors’ offices.
In mid-March, the Bill & Melinda Gates Medical Research Institute announced it had begun the third and final phase of clinical trials for the new vaccine in seven countries. The trials should take about five years to complete. Research and production are being done in several places, including at a manufacturing facility in Hamilton owned by GSK, a giant pharmaceutical company.
Known as the forgotten pandemic, TB kills up to 1.6 million people a year, mostly in impoverished areas in Asia and Africa, despite its being both preventable and treatable. The U.S. has seen an increase in tuberculosis over the past decade, especially with the influx of migrants, and the number of cases rose by 16% from 2022 to 2023. Tuberculosis is the leading cause of death among people living with HIV, whose risk of contracting a TB infection is 20 times as great as people without HIV.
“TB is a complex pathogen that has been with human beings for ages,” said Alemnew Dagnew, who heads the program for the new vaccine for the Gates Medical Research Institute. “Because it has been with human beings for many years, it has evolved and has a mechanism to escape the immune system. And the immunology of TB is not fully understood.”
The University of Montana Center for Translational Medicine and Inimmune together have 80 employees who specialize in researching a range of adjuvants to understand the specifics of immune responses to different substances. “You have to tailor it like tools in a toolbox towards the pathogen you are vaccinating against,” Evans said. “We have a whole library of adjuvant molecules and formulations.”
Vaccines are made more precise largely by using adjuvants. There are three basic types of natural adjuvants: aluminum salts; squalene, which is made from shark liver; and some kinds of saponins, which are fat molecules. It’s not fully understood how they stimulate the immune system. The center in Missoula has also created and patented a synthetic adjuvant, UM-1098, that drives a specific type of immune response and will be added to new vaccines.
One of the most promising molecules being used to juice up the immune system response to vaccines is a saponin molecule from the bark of the quillay tree, gathered in Chile from trees at least 10 years old. Such molecules were used by Novavax in its covid vaccine and by GSK in its widely used shingles vaccine, Shingrix. These molecules are also a key component in the new tuberculosis vaccine, known as the M72 vaccine.
But there is room for improvement.
“The vaccine shows 50% efficacy, which doesn’t sound like much, but basically there is no effective vaccine currently, so 50% is better than what’s out there,” Evans said. “We’re looking to take what we learned from that vaccine development with additional adjuvants to try and make it even better and move 50% to 80% or more.”
By contrast, measles vaccines are 95% effective.
According to Medscape, around 15 vaccine candidates are being developed to replace the BCG vaccine, and three of them are in phase 3 clinical trials.
One approach Evans’ center is researching to improve the new vaccine’s efficacy is taking a piece of the bacterium that causes TB, synthesizing it, and combining it with the adjuvant QS-21, made from the quillay tree. “It stimulates the immune system in a way that is specific to TB and it drives an immune response that is even closer to what we get from natural infections,” Evans said.
The University of Montana center is researching the treatment of several problems not commonly thought of as treatable with vaccines. It is entering first-phase clinical trials for an allergy vaccine, for instance, and for a cancer vaccine. And later this year, clinical trials will begin for vaccines that block the effects of opioids like heroin and fentanyl. The University of Montana received the largest grant in its history, $33 million, for anti-opioid vaccine research. The vaccine works by generating antibodies that bind to the drug in the bloodstream, keeping it from entering the brain and producing a high.
For now, though, the eyes of health care experts around the world are on the trials for the new TB vaccines, which, if they are successful, could help save countless lives in the world’s poorest places.
This article was reprinted from khn.org, a national newsroom that produces in-depth journalism about health issues and is one of the core operating programs at KFF – the independent source for health policy research, polling, and journalism.
A novel SPECT/CT acquisition method can accurately detect radiopharmaceutical biodistribution in a convenient manner for prostate cancer patients, opening the door for more personalized treatment.
Utilizing lead-212 (212Pb), the new imaging technique has the potential to change practice and increase access for patients around the world. The first-in-human images from this method were published in the April issue of The Journal of Nuclear Medicine.
There is significant interest in the development of 212Pb-PSMA–based targeted alpha therapy (TAT) for patients with metastatic castration-resistant prostate cancer. However, 212Pb is a challenging isotope to image because its high-energy gamma rays generate significant scatter.
“The ability to acquire imaging of an alpha-emitter with a standard SPECT camera and standard collimator within a convenient acquisition time for the patient could provide more precision in how we treat patients with prostate cancer, and patients with other cancers, in the future. Confirming the presence of the drug in the target is important because it serves as a quality assurance and can be used to derive an understanding of the biodistribution and pharmacokinetics of the drug.”
Stephen Rose, PhD, head of Translational Medicine and Clinical Science at AdvanCell
In the study, researchers administered 60 MBq of 212Pb-ADVC001 to a 73-year-old man with metastatic castration-resistant prostate cancer. SPECT/CT imaging occurred at 1.5, 5, 20, and 28 hours after infusion.
Representative 212Pb SPECT/CT images showed rapid tumor uptake of 212Pb-ADVC001 in agreement with tumor burden shown on the pretreatment 18F-DCFPyL PET/CT images. Images acquired after 20 hours showed persistent tumor uptake despite low counts due to 212Pb decay.
“In the future, this imaging technique can help to streamline the drug development process, driving conviction in the agents we bring to larger-scale trials. In addition, the ability to image 212Pb with a standard SPECT camera in a relatively short timeframe means that 212Pb is a true theranostic alpha-emitter and could be valuable in selecting patients for targeted alpha-therapies,” said Rose.
He continued, “What’s more, access to PET imaging is a bottleneck in the United States and globally. SPECT cameras are more widely available and may address this critical issue, as SPECT imaging can be used for patient selection, therapy decision making, and guiding adaptive dosing strategies based on changes of target expression and tumor volume during treatment.”
Griffiths, M. R., et al. (2024). First-in-Human 212Pb-PSMA–Targeted α-Therapy SPECT/CT Imaging in a Patient with Metastatic Castration-Resistant Prostate Cancer. Journal of Nuclear Medicine. doi.org/10.2967/jnumed.123.267189.
Myocardial infarction, more commonly known as a heart attack, is a leading cause of death worldwide. Biomarkers called plasma metabolites may play a key role in the physiological pathways involved in myocardial infarctions. Recently published research used a methodological approach called bidirectional Mendelian randomization to understand more about these biomarkers and what they can tell doctors about heart attack risk.
The research was published in the Journal of Geriatric Cardiology on February 28.
“Bidirectional Mendelian randomization studies represent a robust methodological approach, with numerous advantages not commonly present in traditional research methodologies. These include mitigating the impact of confounding factors on conclusions and exploring reverse causation, thereby providing a more reliable foundation for causal inferences. This study employed a bidirectional Mendelian randomization approach to investigate the relationship between plasma metabolites and myocardial infarction, offering new insights into the early diagnosis and potential treatment of myocardial infarction.”
Qiang Wu from the Senior Department of Cardiology, the Sixth Medical Center, Chinese PLA General Hospital, Beijing, China
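The core Mendelian randomization estimate described above can be sketched in simplified form. The study's actual pipeline and data are not reproduced here; the snippet below illustrates only the standard inverse-variance-weighted (IVW) estimator, with all effect sizes invented for demonstration.

```python
import math

def ivw_estimate(beta_exposure, beta_outcome, se_outcome):
    """Combine per-SNP Wald ratios (beta_outcome / beta_exposure),
    weighting each by the precision of the outcome association."""
    weights = [bx**2 / se**2 for bx, se in zip(beta_exposure, se_outcome)]
    ratios = [by / bx for bx, by in zip(beta_exposure, beta_outcome)]
    estimate = sum(w * r for w, r in zip(weights, ratios)) / sum(weights)
    se = 1.0 / math.sqrt(sum(weights))
    return estimate, se

# Hypothetical per-SNP effects of genetic instruments on a metabolite
# (exposure) and on myocardial infarction (outcome), with outcome SEs.
bx = [0.12, 0.08, 0.15, 0.10]
by = [0.030, 0.022, 0.041, 0.024]
se = [0.010, 0.009, 0.012, 0.008]

est, est_se = ivw_estimate(bx, by, se)
print(f"IVW causal estimate: {est:.3f} (SE {est_se:.3f})")
```

A bidirectional analysis runs this estimate twice, once with the metabolite as exposure and once with myocardial infarction as exposure, to probe reverse causation.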
By using data sets from large-scale genome-wide association studies, researchers were able to cast a wide net to try and understand more about the role plasma metabolites play in myocardial infarction. The data source included 461,823 individuals of European descent. Of those, 20,917 individuals had myocardial infarction and 440,906 individuals did not. A total of 24,172,914 single nucleotide polymorphisms were identified in that data set to be associated with myocardial infarction. Bidirectional Mendelian randomization narrows down this large amount of data and determines the relationship between plasma metabolites and myocardial infarction.
This analysis identified 198 unique plasma metabolites with a significant association with myocardial infarction, of which 14 had a direct relationship with myocardial infarction risk. “We identified 14 plasma metabolites associated with myocardial infarction, of which 8 plasma metabolites were linked to a decreased risk and 6 plasma metabolites were linked to an increased risk, underscoring the complicated nature of metabolic pathways influencing heart attack risk,” said Dong-Hua LI from the Department of Cardiovascular Medicine, Minzu Hospital of Guangxi Zhuang Autonomous Region, Guangxi, China. “The robustness of our findings was strengthened by the application of bidirectional Mendelian randomization, enabling a thorough exploration of causality.”
Of the 14 plasma metabolite biomarkers identified in this study, 13 had never before been associated with myocardial infarction. These biomarkers offer new options for developing diagnostic tests, routine screenings, and treatments for heart attack.
Looking to next steps, researchers hope to learn more about the mechanisms of these plasma metabolites and how they relate to myocardial infarction. For example, they speculate that the 8 plasma metabolites associated with a decreased risk may exert anti-inflammatory effects that reduce oxidative stress in the body. However, additional research is needed to confirm this hypothesis.
“Timely detection using metabolic signatures could usher in a new era of preventive cardiology, where interventions are tailored to an individual’s metabolic profile. Furthermore, understanding the metabolic underpinnings of myocardial infarction will contribute to the development of point-of-care diagnostic tools, providing rapid and accessible assessments. Thus, the findings of the study can revolutionize clinical practice by enabling early and precise diagnoses, ultimately leading to more effective and tailored treatment strategies,” said Qiang SU from the Department of Cardiology, Jiangbin Hospital of Guangxi Zhuang Autonomous Region, Guangxi, China.
Other contributors include Jing-Sheng LAN, You-Yi HUANG, Lan-Jin WU, Zhi-Qing QIN, and Ying HUANG from Minzu Hospital of Guangxi Zhuang Autonomous Region in Guangxi, China; Shuo CHEN and Xin HAO from Chinese PLA General Hospital in Beijing, China; and Wan-Zhong HUANG, Ting ZENG, and Hua-Bin SU from Jiangbin Hospital of Guangxi Zhuang Autonomous Region in Guangxi, China.
The Guangxi Natural Science Foundation, the Key Research and Development Program of Guangxi, and the Chongzuo Science and Technology Bureau Planning Project funded this research.
Li, D.-H., et al. (2024). Plasma metabolites and risk of myocardial infarction: a bidirectional Mendelian randomization study. Journal of Geriatric Cardiology. doi.org/10.26599/1671-5411.2024.02.002.
With large language models that take notes during patient visits and algorithms that identify disease, artificial intelligence has begun to prove its worth as an assistant for physicians. But a new study from Stanford Medicine shows the potential of AI as a facilitator: one that helps doctors and nurses connect to achieve more efficient, effective patient care.
The study, published in JAMA Internal Medicine last month, describes an AI-based model in use at Stanford Hospital that predicts when a patient is declining and flags the patient’s physicians and nurses. Ron Li, MD, a clinical associate professor of medicine, medical informatics director for digital health, and the study’s senior author, said the alert system helps clinicians connect more efficiently and effectively, and intervene to prevent patients from deteriorating and landing in the intensive care unit.
Li, who worked with informatics postdoctoral scholar and lead author Robert Gallo, MD, on the evaluation, discussed their team’s approach to harnessing the algorithm and how it fosters clinician connection in a ceaselessly buzzing hospital environment. Lisa Shieh, MD, PhD, clinical professor of medicine; Margaret Smith, director of operations for primary care and population health; and Jerri Westphal, nursing informatics manager, also helped lead the study and the implementation of the AI system.
What is a deterioration model and how does AI fit in?
The algorithm is a prediction model that pulls data (such as vital signs, information from electronic health records, and lab results) in near-real time to predict whether a patient in the hospital is about to suffer a health decline. Physicians aren’t able to monitor all of these data points for every patient all of the time, so the model runs in the background, looking at these values about every 15 minutes. It then uses artificial intelligence to calculate a risk score for the probability that the patient is going to deteriorate, and if the patient seems to be declining, the model sends an alert to the care team.
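The score-and-alert loop described above can be sketched as follows. This is an illustrative toy only: the actual model's features, weights, and alert threshold are not described in the article, and every name and number below is hypothetical.

```python
import math
from dataclasses import dataclass

@dataclass
class Vitals:
    """Hypothetical feature set; the real model uses many more inputs."""
    heart_rate: float  # beats/min
    resp_rate: float   # breaths/min
    spo2: float        # oxygen saturation, %
    lactate: float     # lab value, mmol/L

ALERT_THRESHOLD = 0.8  # hypothetical cutoff

def risk_score(v: Vitals) -> float:
    """Toy logistic-style score in (0, 1); higher = more likely to decline."""
    z = (0.03 * (v.heart_rate - 80)
         + 0.10 * (v.resp_rate - 16)
         - 0.08 * (v.spo2 - 97)
         + 0.50 * (v.lactate - 1.0))
    return 1.0 / (1.0 + math.exp(-z))

def should_alert(v: Vitals) -> bool:
    """Evaluated roughly every 15 minutes per patient; True pages the team."""
    return risk_score(v) >= ALERT_THRESHOLD

stable = Vitals(heart_rate=75, resp_rate=14, spo2=99, lactate=0.9)
declining = Vitals(heart_rate=125, resp_rate=28, spo2=88, lactate=4.0)
print(should_alert(stable), should_alert(declining))
```

The key design point is that the alert itself is cheap; the intervention it triggers is the conversation between nurse and physician.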
What’s the benefit of having such a model run in a hospital?
The big question I want to answer is, “How do we use AI to build a more resilient health system in high-stakes situations?” There are many ways to do that, but one core characteristic for a resilient system is strong communication channels. This model is powered by AI, but the action it triggers, the intervention, is basically a conversation that otherwise may not have happened.
Nurses and physicians have conversations and handoffs when they change shifts, but it’s difficult to standardize these communication channels due to busy schedules and other hospital dynamics. The algorithm can help standardize it and draw clinicians’ attention to a patient who may need additional care. Once the alert reaches the nurse and physician simultaneously, it initiates a conversation about what the patient needs to ensure they don’t decline to the point of requiring a transfer to the ICU.
Tell me about how your team implemented and evaluated the model.
We integrated this model, which we did not create, into our workflow, but with a few tweaks. Originally, it sent an alert when the patient was already deteriorating, which we didn’t find very helpful. We adjusted the model to focus on predicting ICU transfers and other indicators of health decline.
We wanted to ensure the nursing team was heavily involved and felt empowered to initiate conversations with physicians about adjusting a patient’s care. When we evaluated the tool, which we had running for almost 10,000 patients, we saw a significant improvement in clinical outcomes: a 10.4% decrease in deterioration events (which we defined as transfers to the ICU, rapid response team events, or codes) among a subset of 963 patients with risk scores within a “regression discontinuity window,” which basically means they’re at the cusp of being high risk. These are patients whose clinical trajectory may not be as obvious to the medical team. For that group of patients, this model was especially helpful for encouraging physicians and nurses to collaborate to determine which patients need extra tending.
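The "regression discontinuity window" idea can be sketched briefly: patients whose risk scores land just below or just above the alert cutoff are clinically similar, so comparing their outcomes isolates the effect of the alert itself. The threshold and window width below are invented for illustration; the study's actual values are not given here.

```python
THRESHOLD = 0.8   # hypothetical alert cutoff
HALF_WIDTH = 0.1  # hypothetical half-width of the window

def in_window(score: float) -> bool:
    """True for patients 'at the cusp' of being flagged high risk."""
    return THRESHOLD - HALF_WIDTH <= score <= THRESHOLD + HALF_WIDTH

# Risk scores for five hypothetical patients; only those near the
# cutoff enter the discontinuity comparison.
scores = [0.15, 0.72, 0.79, 0.81, 0.95]
cusp = [s for s in scores if in_window(s)]
print(cusp)
```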
How have nurses and physicians responded to the integration of this new model?
The model is far from perfect. The reactions have overall been positive, but there is concern about alert fatigue, since not all alerts are flagging a real decline. When the model was validated on data from patients prior to implementation, we calculated that about 20% of patients flagged by the model did end up experiencing a deterioration event within six to 18 hours. At this point, even though it’s not a completely accurate model, it’s accurate enough to warrant a conversation. It shows that the algorithm doesn’t have to be perfect for it to be effective.
With that said, we want to improve the accuracy; you need to do that to improve trust. That’s what we’re working on now.
Journal reference:
Gallo, R. J., et al. (2024). Effectiveness of an Artificial Intelligence–Enabled Intervention for Detecting Clinical Deterioration. JAMA Internal Medicine. doi.org/10.1001/jamainternmed.2024.0084.