Year: 2021 | Volume: 14 | Issue: 1 | Pages: 61-71
|Abstracts for the 38th annual emergencies in medicine conference
Sukaina Ali Alali1, Jean W Hsu1, Abraham Akbar1, Robert D Welch2, Joseph Gibbs3, Charles V Pollack4, J Fanikos5, E Chebolu6, Jennifer Nguyen7, Gregory J Fermann8, Mufaddal Jivanjee9, James Williams10, Bryan F Imhoff11, Kristin L Rising4, R Isaacs12, Teo Zhongyang13, Charles E Mahan14, Tinh Le15, Lauren Rosenblatt12, I Gueye16, W Frank Peacock1
1 Baylor College of Medicine, Houston, USA
2 Wayne State University, Detroit, Michigan, USA
3 Henry Ford Health System, Detroit, Michigan, USA
4 Department of Emergency Medicine, Thomas Jefferson University, Philadelphia, PA, USA
5 Brigham and Women's Hospital, Boston, MA, USA
6 National Institute on Alcohol Abuse and Alcoholism, Office of the Clinical Director, Bethesda, USA
7 University of Houston College of Pharmacy, Houston, USA
8 Department of Emergency Medicine, University of Cincinnati, Cincinnati, USA
9 London North West Healthcare NHS Trust, London, United Kingdom
10 Northwest Texas Healthcare System, Texas Tech University Health Science Center, Texas, USA
11 University of Kansas Medical Center, Kansas City, Kansas, USA
12 Department of Emergency Medicine, University of Maryland School of Medicine, Baltimore, MD, USA
13 Department of Emergency Medicine, Singapore General Hospital, Singapore
14 University of New Mexico College of Pharmacy, Albuquerque, NM, USA
15 Case Western Reserve University School of Medicine, Cleveland, OH, USA
16 Inter Army Medical Center of Ziguinchor, Senegal
Date of Submission: 29-Nov-2020
Date of Acceptance: 02-Dec-2020
Date of Web Publication: 23-Mar-2021
|How to cite this article:|
Alali SA, Hsu JW, Akbar A, Welch RD, Gibbs J, Pollack CV, Fanikos J, Chebolu E, Nguyen J, Fermann GJ, Jivanjee M, Williams J, Imhoff BF, Rising KL, Isaacs R, Zhongyang T, Mahan CE, Le T, Rosenblatt L, Gueye I, Peacock W F. Abstracts for the 38th annual emergencies in medicine conference. J Emerg Trauma Shock 2021;14:61-71
|How to cite this URL:|
Alali SA, Hsu JW, Akbar A, Welch RD, Gibbs J, Pollack CV, Fanikos J, Chebolu E, Nguyen J, Fermann GJ, Jivanjee M, Williams J, Imhoff BF, Rising KL, Isaacs R, Zhongyang T, Mahan CE, Le T, Rosenblatt L, Gueye I, Peacock W F. Abstracts for the 38th annual emergencies in medicine conference. J Emerg Trauma Shock [serial online] 2021 [cited 2021 Apr 17];14:61-71. Available from: https://www.onlinejets.org/text.asp?2021/14/1/61/311792
Metabolomic Profiling during Diabetic Ketoacidosis and the Pathophysiology of Ketosis-Prone Diabetes
Jean W. Hsu, Paras Mehta, Kelly R. Keene, Surya N. Mulukutla, Eunice I. Caducoy, William Peacock, Ashok Balasubramanyam, Farook Jahoor
Baylor College of Medicine, Houston, TX, USA
Background: Obese patients with ketosis-prone type 2 diabetes (“A-B+ KPD”) have increased catabolism of the ketogenic amino acids leucine/isoleucine (Leu/Ile) and impaired ketone oxidation when stable and normoglycemic. Methods: We performed comparative serum metabolomics on patients at presentation with hyperglycemic crisis without (N = 29) or with (N = 73) diabetic ketoacidosis (DKA) compared to healthy controls (N = 17). The diabetic groups included normal weight and overweight/obese subgroups. Results: DKA patients had higher plasma cortisol and catecholamines and lower C-peptide than the other groups. Three metabolite patterns distinguished the DKA patients from the other groups. First, elevated levels of Leu/Ile and their respective ketoacids, with no difference in isovaleryl carnitine (C5) but a markedly lower ratio of C5 to acetyl carnitine (C2), implying accelerated shunting of Leu/Ile into their catabolic pathways but impaired oxidation because defective TCA cycle activity restrains acetyl CoA entry and metabolism. These changes were most pronounced in obese DKA patients, recapitulating the pattern of A-B+ KPD patients when stable. Second, markedly decreased levels of glutamate, ammonia, and many nonessential amino acids, including citrulline, in the presence of increased glutamine, suggesting a defect in glutaminase activity. Third, low levels of 3-methylhistidine, suggesting a surprising decrease in muscle protein breakdown. Conclusion: The pattern of amino acids and their metabolites among the DKA (especially obese) patients indicates that both increased ketone production from Leu/Ile and their blunted oxidation contribute to the proclivity to develop DKA; elevated leucine and low citrulline could contribute to beta cell dysfunction via mechanistic target of rapamycin hyperactivity and diminished intracellular arginine availability.
Thus, serum metabolomic profiling during an acute DKA episode helps define a unique pathophysiology underlying the phenotype of obese ketosis-prone type 2 diabetes.
Keywords: diabetic ketoacidosis, ketosis prone diabetes, metabolomic profiling, type 2 diabetes
The Efficacy of Public Hemorrhage Control Training: A Statewide Cohort Study
Abraham Akbar, Alexander Pop1, M. Ashar Afaq, Jomari Guerrero2, Ryan Richardson, Jonathan Brewer3, Theresa Tran
Baylor College of Medicine, Houston, 1University of Texas Medical Branch, Galveston, 2University of Texas Health Science Center, San Antonio, 3Texas A&M Health Science Center, Bryan, TX, USA
Background: The objective is to investigate hemorrhage control (HC) training for laypersons in a public setting. We hypothesize that such training will enable participants to quickly increase knowledge of basic HC, comfort level in their ability, and likelihood to render aid. Methods: This was a prospective cohort study of participants involved in the Lone Star Survival (LSS) project in Texas, a volunteer organization that teaches laypersons HC through free 10-min lessons. Inclusion criteria were adults aged ≥18 years who participated in LSS training. Exclusion criteria were prior military experience or HC training. Questionnaires assessing subjective and objective parameters were administered pre- and post-training. The responses collected included data on correct order of HC steps, appropriate criteria for tourniquet use, appropriate placement of tourniquet, self-rated comfort level in ability, and self-rated likelihood to render aid, as well as demographic factors such as age, sex, and race. This was completed in 19 public sites (e.g., shopping centers) in eight cities in Texas. The HC instructors, trained in the LSS standardized curriculum, were medical students and physicians from 11 Texas medical schools. Data were analyzed using paired t-tests, with α = 0.05. Chi-square tests were used to analyze age, sex, and race data. Intersite data were compared to identify any possible discrepancies. Results: Out of 879 participants trained, 280 (31.9%) completed the surveys. 45.3% were male, with a mean (standard deviation [SD]) age of 29.8 years, and 39.0% were Caucasian, 31.4% Hispanic, 11.8% Asian, 8.6% African American, and 5.7% mixed/other. On the self-rated 1–10 scale, there was a mean increase in comfort level by 3.36 (95% confidence interval [CI] 3.07–3.65, P < 0.001) and in the likelihood to render aid by 2.49 (95% CI 2.20–2.77, P < 0.001). 
In objective assessments, there were mean increases of 14.8% in correct ordering of HC steps (95% CI 9.2%–20.3%, P < 0.001), 12.6% in appropriate criteria for tourniquet use (95% CI 8.0%–17.2%, P < 0.001), and 35.5% in appropriate tourniquet placement (95% CI 28.9%–42.1%, P < 0.001). There were no significant intersite differences. Conclusion: A 10-min standardized, hands-on HC training program for laypersons in public areas improved knowledge of basic HC, comfort level in ability, and likelihood to render aid.
Keywords: Hemorrhage control, Lone Star Survival, public training
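The pre/post comparisons above rely on paired t-tests (α = 0.05). A minimal sketch of that calculation, using made-up comfort-level scores rather than the study data, might look like:

```python
import math

def paired_t(pre, post):
    """Paired t statistic: mean of within-subject differences
    divided by its standard error."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)  # sample variance
    se = math.sqrt(var_d / n)
    return mean_d, mean_d / se

# Hypothetical 1-10 comfort ratings before and after training
pre = [3, 4, 2, 5, 3, 6]
post = [7, 8, 5, 9, 8, 8]
mean_diff, t_stat = paired_t(pre, post)
```

The resulting t statistic would be compared against the t distribution with n − 1 degrees of freedom; libraries such as scipy.stats.ttest_rel return the P value directly.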
Biomarker Panel versus Canadian Computed Tomography Head Rule for Traumatic Intracranial Injury
Robert D. Welch, Linda Papa1, Jeff Bazarian2, Rob Howard3, James Chen4, Art Weber5, Syed Ayaz, Lawrence Lewis6
Wayne State University, Detroit, MI, 1Orlando Regional Medical Center, Orlando, FL, 2University of Rochester, Rochester, NY, 3Veridical Solutions, 4University of California San Diego, La Jolla, CA, 5Banyan Biomarkers Inc., Alachua, FL, 6Washington University, St. Louis, MO, USA
Introduction: A serum biomarker panel (SBP) comprising glial fibrillary acidic protein (GFAP) and ubiquitin carboxyl-terminal hydrolase L1 (UCH-L1) predicts the absence of computed tomographic (CT) findings in patients with mild-to-moderate traumatic brain injury (TBI). The Canadian CT Head Rule (CCTHR) is a clinical decision tool to predict the need for neurosurgical intervention and identify “clinically important” CT findings. We evaluated the test characteristics of the SBP and CCTHR for head CT findings in mild TBI patients. Methods: A secondary analysis of the prospective multicenter ALERT-TBI study was performed. Inclusion criteria were suspected nonpenetrating TBI, age >18 years, Glasgow Coma Scale of 14–15, loss of consciousness, amnesia and/or confusion, no anticoagulants, and no postinjury seizure, with a CT done per usual care at each of 15 US and 7 European sites. CT and blood draw were completed within 12 h of injury. Exclusion criteria were inability to determine time of injury or obtain consent, stroke or neurosurgery within 30 days, neurodegenerative disease, seizure, brain tumor, pregnancy/breastfeeding, and blood transfusion before the blood sample was obtained. CT images were interpreted by two neuroradiologists. CT positive was defined as intracranial bleeding or other signs of brain injury. Prespecified cutoffs were 22 pg/mL for GFAP and 327 pg/mL for UCH-L1. The SBP was deemed negative if both markers were below cutoff and positive if either or both were above cutoff. The SBP and CCTHR were each compared against the gold standard CT. The institutional review board at each study site approved the study. Results: Of 1959 enrolled subjects, 919 met our inclusion criteria. Overall, 67 (7.3%) were CT positive, while 852 (92.7%) were CT negative. 563 (61.3%) were male, and 123 (13.4%) were aged >65 years. Sensitivity, specificity, and negative predictive value (NPV) were 70.1%, 55.5%, and 95.5% for the CCTHR versus 95.5%, 38.8%, and 99.1% for the SBP, respectively.
Conclusion: Sensitivity and NPV were higher for the SBP than for the CCTHR; the CCTHR had higher specificity. The SBP may therefore be better able to safely decrease head CT utilization.
Keywords: Biomarker panel, computed tomography, traumatic brain injury
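The panel's dichotomization described in the Methods (negative only when both markers fall below their prespecified cutoffs) can be sketched as follows; the function name is illustrative, and the cutoffs are the 22 pg/mL (GFAP) and 327 pg/mL (UCH-L1) values stated in the abstract:

```python
GFAP_CUTOFF_PG_ML = 22.0    # glial fibrillary acidic protein
UCHL1_CUTOFF_PG_ML = 327.0  # ubiquitin carboxyl-terminal hydrolase L1

def sbp_result(gfap_pg_ml: float, uchl1_pg_ml: float) -> str:
    """Serum biomarker panel call: negative only if BOTH markers
    are below their cutoffs; positive otherwise."""
    if gfap_pg_ml < GFAP_CUTOFF_PG_ML and uchl1_pg_ml < UCHL1_CUTOFF_PG_ML:
        return "negative"
    return "positive"
```

Either marker at or above its cutoff drives a positive panel, which is what gives the SBP its high sensitivity at the expense of specificity.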
Risk Stratification in Chest Pain: High-Sensitivity Troponin Assay into Existing Risk Tools
Joseph Gibbs, Richard Nowak, W. Frank Peacock, Simon Mahler, Christopher deFilippi, Robert Christianson, Gordon Jacobsen, Fred Apple, James McCord
Henry Ford Health System, Detroit, Michigan, USA
Introduction: Risk scores such as Thrombolysis in Myocardial Infarction (TIMI); History, Electrocardiography, Age, Risk factors, and Troponin (HEART); and Simplified Emergency Department Assessment of Chest Pain Score (sEDACS) have been used to evaluate patients with possible acute myocardial infarction (AMI). We studied the prognostic utility of the TIMI, HEART, and sEDACS scores when supplemented with high-sensitivity cardiac troponin-I (hs-cTnI). Methods: The study included 1924 suspected AMI patients at 29 hospitals in the United States from 2015 to 2016. Blood samples were drawn at enrollment and at 2–3 h and tested for hs-cTnI on the Atellica IM TNIH assay (Siemens Diagnostics). Patients were considered low risk with a TIMI score of 0, HEART ≤3, sEDACS ≤15, and hs-cTnI <45 ng/L (99th percentile) at time 0 and 2–3 h. Results: Of 1924 patients, 259 (13.5%) were diagnosed with AMI. At 30 days, there were 6 (0.3%) additional AMIs, 18 (0.9%) deaths, and 205 (10.7%) revascularizations. There were 430 patients (22.3%) with an elevated hs-cTnI during the index visit. All three risk scores combined with hs-cTnI identified a low-risk group. Compared to TIMI (10.8%), the HEART (31.2%) and sEDACS (34.1%) scores defined more patients as low risk. Conclusions: The TIMI, HEART, and sEDACS scores all identify a low-risk group of patients when combined with serial hs-cTnI measurements. The HEART and sEDACS scores identified more low-risk patients. These patients could be considered for discharge from the emergency department without further testing.
Keywords: Chest pain, myocardial infarct, risk stratification, troponin
UPSTREAM Registry: Myocardial Infarction Patients Given Upstream Advanced Oral Antiplatelet Therapy
Charles V. Pollack, Jr.1,2, Durgesh D. Bhandary3, Alex Frost4, W. Frank Peacock5, Sunil V. Rao6, Steven H. Silber7, Deborah B. Diercks8, Renato DeRita3, Narinder Bhalla3, Sripal Bangalore9, Naeem D. Khan3
1Department of Emergency Medicine, Thomas Jefferson University, Philadelphia, PA, 2Hospital Quality Foundation, Shrewsbury, NJ, 3AstraZeneca Pharmaceuticals, Wilmington, DE, 4Studymaker LLC, Boston, MA, 5Baylor College of Medicine, Houston, TX, 6Duke University, Durham, NC, 7New York Methodist Hospital, Brooklyn, NY, 8UT Southwestern Medical Center, Dallas, TX, 9New York University School of Medicine, New York, NY, USA
Background: The upstream interval covers the diagnostic and therapeutic management of suspected or confirmed acute coronary syndrome before diagnostic angiography. The primary components of upstream treatment are aspirin, anticoagulation, and antiplatelet therapy. The benefit and risk of upstream administration of these therapies should be individualized for each patient. The primary objective of the UPSTREAM study is to address the data gap regarding the course of non-ST-elevation myocardial infarction (NSTEMI) between emergency department (ED) arrival and diagnostic angiography, and follow-up oral antiplatelet (OAP) agent use. Methods: UPSTREAM is a phase IV, multicenter (up to 75 US hospitals), prospective, noninterventional, observational study of consecutive patients with a working diagnosis of NSTEMI and treatment with a greater-than-maintenance dose of an OAP agent (ticagrelor, clopidogrel, or prasugrel) 4–72 h upstream of diagnostic angiography. All upstream OAP-treated patients who undergo diagnostic angiography are studied through discharge. Those treated in-hospital with ticagrelor and discharged from the hospital on ticagrelor (the “ticagrelor-consistent cohort”) have follow-up of 30 (+10) days posthospitalization. Results: A total of 2944 patients were enrolled at 55 sites; 24% of the patients were transferred into an UPSTREAM hospital. Within the upstream interval (before diagnostic catheterization), the median (standard deviation) time from ED arrival to loading dose was 6.4 (6.7) h and from loading dose to diagnostic angiography was 11.3 (9.5) h. Mean age was 63 years; 19% were aged over 75 years. At least one dose of clopidogrel was received by 60% of the patients, ticagrelor by 54%, and prasugrel by 2.5%. In-hospital mortality was 0.4% overall; 30-day mortality in the ticagrelor-consistent cohort was 0.006%.
Conclusion: Upstream OAP therapy in NSTEMI patients undergoing an early invasive strategy is associated with good clinical outcomes and minimal risk of bleeding complications. The typical upstream interval (ED arrival until diagnostic angiography) is approximately 18 h (median). Thirty-day mortality in NSTEMI managed with an early interventional strategy is very low. The full risk profile of patients treated with upstream OAP requires further evaluation. UPSTREAM enrollment is ongoing.
Keywords: Myocardial infarction, oral antiplatelet, UPSTREAM
Andexanet Alfa for Intracranial Hemorrhage Associated with Factor Xa Inhibitors: Budget Impact
J. Fanikos, J. N. Goldstein1, B. Lovelace2
Brigham and Women's Hospital, 1Massachusetts General Hospital, Boston, MA, 2Portola Pharmaceuticals, South San Francisco, CA, USA
Background: Oral factor Xa inhibitors (oFXais) are used to treat and prevent thrombotic events but may exacerbate acute major bleeding. Andexanet alfa, a modified recombinant inactive form of human factor Xa, is a novel antidote for reversal of oFXais. This study calculated the budget impact of using andexanet alfa to treat intracranial hemorrhage associated with oFXais from a United States acute care hospital perspective. Methods: A decision tree framework was created to compare a world with andexanet alfa to a world without, in which four-factor prothrombin complex concentrate (4F-PCC) was used. Andexanet alfa uptake was projected over 3 years based on real-world data. Patients entering the model were assigned a probability of hematoma expansion (≥33% volume increase from baseline). Patients with hematoma expansion were assigned an increased length of intensive care unit (ICU) stay and an increased risk of intubation. Drug costs, the cost of ICU days, and intubation costs were included. New technology add-on payments (NTAPs) were included as an offset against pharmacy costs for andexanet alfa. We calculated the costs per hospitalization in the two groups. Key cost drivers were identified using one-way sensitivity analyses. Results: The costs per hospitalization were $48,303 for andexanet alfa and $52,179 for 4F-PCC when NTAP was included. We projected a cost savings of $84,994 in year 1 to $102,843 in year 3 when comparing a world with andexanet alfa to one without. The most influential model parameters were drug costs, andexanet alfa dosing assumptions, and the probability of hematoma expansion. Conclusion: Our analysis suggests that increasing andexanet alfa uptake for oFXai reversal may result in net cost reductions for an acute care hospital.
Keywords: andexanet alfa, drug cost, factor Xa inhibitors, intracranial hemorrhage
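The decision-tree comparison described above amounts to an expected-cost calculation over the hematoma-expansion branch. A sketch of that arithmetic follows; every parameter value here is hypothetical and purely illustrative, since the study's actual inputs are not given in the abstract:

```python
def expected_cost_per_patient(drug_cost, p_expansion, extra_icu_days,
                              icu_day_cost, p_intubation, intubation_cost):
    """Expected hospitalization cost: drug cost plus the
    hematoma-expansion branch (extra ICU days and possible
    intubation), weighted by its probability."""
    expansion_branch = extra_icu_days * icu_day_cost + p_intubation * intubation_cost
    return drug_cost + p_expansion * expansion_branch

# All values hypothetical, for illustration only
cost = expected_cost_per_patient(
    drug_cost=24_000, p_expansion=0.25, extra_icu_days=4,
    icu_day_cost=5_000, p_intubation=0.5, intubation_cost=10_000)
```

One-way sensitivity analysis, as the study describes, would vary each parameter across a plausible range while holding the others fixed and record the swing in the expected cost.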
Alcohol Response Phenotype Domains in Alcohol Use Disorder
E. Chebolu, M. L. Schwandt, V. A. Ramchandani, B. L. Stangl, D. T. George, Y Horneffer, T. Vinson, E. L. Vogt, B. A. Manor, N. Diazgranados, D. Goldman
National Institute on Alcohol Abuse and Alcoholism, Office of the Clinical Director, Bethesda, MD, USA
Background: Interindividual variation in response to alcohol is substantial; it can bring people to emergency rooms (ERs) and pose challenges for medical management. This study characterizes the potential underlying factors of alcohol response phenotypes in patients with alcohol use disorder (AUD) and investigates the associations of alcohol response factors with other clinical and demographic patient characteristics. Methods: We performed a factor analysis of 17 item responses from the alcohol dependence scale in 938 individuals diagnosed with AUD using the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) criteria or with alcohol dependence/abuse using DSM-IV criteria via the Structured Clinical Interview for DSM (SCID). Participants completed self-report questionnaires on alcohol consumption levels, family history, mental states, personality, cognition, behavior, and early life stress. These phenotypes were used in a multiple indicators multiple causes (MIMIC) analysis (n = 416) to identify clinical predictors of the latent factors. Results: A three-factor model was determined to be the best solution based on fit indices and interpretability of factor domains. The three alcohol response factor domains were physical symptoms (sickness, hangovers, and convulsions), perceptual disturbances (hallucinations, paresthesia), and blackouts. After accounting for variation in alcohol consumption, the MIMIC analysis showed that negative urgency (an impulsivity measure) predicted both physical symptoms and blackouts. Major depressive disorder and perceived stress predicted physical symptoms, while childhood trauma predicted perceptual disturbances. Lack of premeditation predicted fewer perceptual disturbances, and aggression predicted fewer physical symptoms. The Alcohol Use Disorders Identification Test score, a measure of AUD severity, predicted all three domains.
Conclusion: Alcohol response differed across distinct categories of AUD patients: personality traits, stress measures, and degree of alcohol-related problems predicted the factors and the items that defined them. The results indicate that patients presenting to an ER with one alcohol response problem are likely to be experiencing several problems that load onto the same factor and may be predicted by clinical characteristics.
Keywords: alcohol, alcohol response phenotype, alcohol use disorder
Dielectric Detection of Fluid Status in Emergency Department Dyspnea Patients
Jennifer Nguyen, Robert Petrovic1, Zubaid Rafique2, Navdeep Sekhon2, Robert McArthur2, Frank Peacock2
1University of Houston College of Pharmacy, 2Baylor College of Medicine, Houston, TX, USA
Background: The purpose of this study is to evaluate the accuracy of remote dielectric sensing (ReDS) in detecting lung fluid in emergency department patients with undifferentiated shortness of breath. ReDS provides a noninvasive measurement of lung fluid status within 90 s, where normal lung fluid ranges from 20% to 35%. Methods: This is a single-center convenience sample pilot study, with institutional review board approval, at an academic emergency department. Inclusion criteria were a chief complaint of shortness of breath, age ≥21 years, informed consent, and nonpregnant status. Exclusion criteria were chest trauma, anatomic abnormalities, implanted devices, body mass index <22 or >36, and height <5'1'' or >6'4''. Data such as demographics, vital signs, and medical history were collected from medical records. Patients were fitted with the ReDS vest, and data were recorded. After discharge, a gold standard diagnosis and volume status were adjudicated by two emergency medicine physicians blinded to the ReDS data. Results: Of 114 enrolled patients, the mean age was 53.6 (±13.1) years, and 56% (n = 64) identified as male. The race/ethnicity breakdown was 49.1% (n = 56) Hispanic, 32.5% (n = 37) African American, 11.4% (n = 13) White, and 7.0% (n = 8) Asian. An adjudicated gold standard diagnosis of volume overload was made in 29.8% (n = 34). Using a cutoff of 35%, the upper limit of physiologic lung fluid, ReDS detected 79.4% of the volume-overload cohort. Sensitivity was 0.79, specificity was 0.69, negative predictive value was 0.89, and positive predictive value was 0.52. Conclusion: At a cut-point of 41%, the ReDS device has excellent specificity and negative predictive value in detecting pathologic lung fluid.
Keywords: Dyspnea, lung fluid, remote dielectric sensing, shortness of breath
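The reported operating characteristics follow from a standard 2×2 table. The sketch below uses counts reconstructed from the abstract's percentages (27 true positives and 7 false negatives among the 34 overloaded patients; 25 false positives and 55 true negatives among the remaining 80); these reconstructed counts are an assumption, not study data:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, and NPV from 2x2 counts."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Counts reconstructed from the ReDS abstract (assumption)
m = diagnostic_metrics(tp=27, fp=25, fn=7, tn=55)
```

Rounded to two decimals, these counts reproduce the reported 0.79 sensitivity, 0.69 specificity, 0.52 positive predictive value, and 0.89 negative predictive value.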
Emergency Potassium Normalization Treatment Including Sodium Zirconium Cyclosilicate: The ENERGIZE study
W. Frank Peacock, Zubaid Rafique, Konstantin Vishnevskiy1, Edward Michelson2, Elena Vishneva3, Tatiana Zvereva4, Rajaa Nahra5, Dao Li6, Joseph Miller7
Ben Taub General Hospital, Houston, TX, USA, 1Medical University of St Petersburg, St Petersburg, Russia, 2Texas Tech University Health Sciences Center, El Paso, TX, USA, 3Russian Academy of Medical Science, Moscow, Russia, 4Kemerovo Medical University, Kemerovo, Russia, 5AstraZeneca, Gaithersburg, MD, USA, 6AstraZeneca, Gothenburg, Sweden, 7Henry Ford Hospital, Detroit, MI, USA
Background: Sodium zirconium cyclosilicate (SZC) is a novel potassium binder that is highly selective for potassium. SZC is approved in the United States and European Union for the treatment of hyperkalemia. The ENERGIZE study was a pilot evaluation which explored the efficacy of SZC when added to insulin and glucose as hyperkalemia treatment in the emergency department (ED). Methods: ENERGIZE was an exploratory, randomized, double-blind, placebo-controlled, phase II study (NCT03337477). Enrolled patients were adults admitted to the ED with blood potassium ≥5.8 mmol/L. Patients receiving background treatment of insulin and glucose were randomized 1:1 to receive SZC 10 g or placebo, up to three times during a 10-h period. The primary efficacy outcome was the mean change in serum potassium (sK+) from baseline until 4 h after start of dosing with SZC or placebo. Results: Overall, 70 patients were randomized (SZC n = 33; placebo n = 37); 50.0% were male, the mean (standard deviation [SD]) age was 59.0 (13.8) years, and the mean initial sK+ was similar between groups (SZC 6.4 mmol/L; placebo 6.5 mmol/L). The least squares mean (LSM [SD]) change in sK+ from baseline to 4 h was −0.41 (0.11) mmol/L with SZC and −0.27 (0.10) mmol/L with placebo (difference: −0.13 mmol/L; 95% confidence interval [CI] = −0.44 to 0.17). At 2 h, a greater reduction in mean (SD) sK+ from baseline occurred with SZC versus placebo: −0.72 (0.12) with SZC versus −0.36 (0.11) mmol/L with placebo (LSM difference: −0.35 mmol/L; 95% CI = −0.68 to −0.02). A numerically lower proportion of patients required additional potassium-lowering therapy due to hyperkalemia at 0–4 h with SZC versus placebo (15.6% vs. 30.6%, respectively; odds ratio 0.40 [95% CI = 0.09–1.77]). Similar proportions of the patients experienced adverse events in both treatment groups at 0–24 h.
Conclusion: This pilot study provided signals indicative of an incremental benefit of SZC with insulin and glucose in the emergency treatment of hyperkalemia compared with insulin and glucose alone.
Keywords: ENERGIZE, hyperkalemia, potassium, sodium zirconium cyclosilicate
Atrial Fibrillation Patients Taking Factor Xa Inhibitors: Economic Burden of Major Bleeding
Gregory J. Fermann, Belinda Lovelace1, Mary Christoph1, Melissa Lingohr-Smith2, Jay Lin2, Steven B. Deitelzweig3
Department of Emergency Medicine, University of Cincinnati, Cincinnati, OH, 1Portola Pharmaceuticals, South San Francisco, CA, 2Novosys Health, Green Brook, NJ, 3Department of Medicine, University of Queensland and Ochsner Clinical School, New Orleans, LA, USA
Background: Patients with atrial fibrillation (AF) treated with factor Xa inhibitors (FXaIs) are at risk for major bleeding (MB). The objective of this study was to evaluate the healthcare economic burden associated with MB among AF patients treated with FXaIs, using a retrospective claims database analysis. Methods: Adult patients (≥18 years of age) treated with FXaIs (rivaroxaban, apixaban, or edoxaban) who were hospitalized with MB (MB patients) or had no MB hospitalizations (non-MB patients) during January 1, 2015, through April 30, 2018, were extracted from the MarketScan claims databases. The index date was defined as the first MB inpatient hospitalization for MB patients; a random date during FXaI usage was selected as the index date for non-MB patients. Healthcare resource utilization and costs were evaluated for index MB hospitalizations for MB patients and during the 6-month period before the index event (baseline) and a variable follow-up period of 1–12 months for MB and non-MB patients. Costs were inflated to 2019 dollars and annualized. Results: Among the study population of AF patients treated with FXaIs (N = 152,305), 5.0% (N = 7577; mean age: 76.1 years; 44.1% female) had an MB hospitalization and 95.0% (N = 144,728; mean age: 70.1 years; 40.5% female) did not. For index MB hospitalizations, the mean length of stay was 5.3 days and cost was $32,938. During follow-up, MB versus non-MB patients had, on average, twice as many all-cause hospitalizations (1.1 vs. 0.4 per patient year, P < 0.001), longer total hospital length of stays (7.7 vs. 2.3 days per patient year, P < 0.001), and higher all-cause inpatient ($33,640 vs. $13,671 per patient, P < 0.001) and outpatient ($39,302 vs. $21,640 per patient, P < 0.001) costs. Conclusions: Among AF patients treated with FXaIs, the healthcare economic burden of an initial MB event is high, and the burden following an MB hospitalization is approximately twice that of patients without an MB event.
Keywords: Atrial fibrillation, factor Xa inhibitors, economic burden, major bleeding
Head Injury Imaging – Timing is Critical
Mufaddal Jivanjee, Vishal Shah, Julie Bak
London North West Healthcare NHS Trust, London, United Kingdom
Background: Head injury is a common cause of morbidity and mortality in the first four decades of life. The majority of patients recover without intervention; however, some may develop long-term disabilities or even die. Early detection of pathology is therefore critical. The National Institute for Health and Care Excellence (NICE) adult head injury guidelines recommend that head injuries with specific risk factors should have a computed tomography (CT) scan within 1 h of those risk factors being identified. Furthermore, the provisional report should be made available within 1 h of the scan. This audit assessed the compliance of hospital staff with the NICE adult head injury guidelines. Methods: Forty adult CT head scans, requested for head injuries, from the emergency department (ED) at London North West Healthcare (LNWH) NHS Trust were analyzed for compliance with the NICE guidelines. The standards measured were that the time from scan request to scan completion should be within 1 h, and that the time from scan completion to publication of the provisional report should be within 1 h. The locally agreed target for both standards was 100%. Results: On review of 40 CT scans, 32 (80%) were completed within 1 h of request. Of the 8 scans (20%) not completed within the hour, 4 were due to porter unavailability, 1 was due to an uncooperative patient, and the reasons for the remaining 3 were not clear from documentation. Following completion of the scan, 38 scans (95%) were provisionally reported within 1 h. Conclusion: This study highlighted good compliance by hospital staff in ensuring that patients with head injuries are managed appropriately following detection of risk factors indicating a CT head scan. However, the locally agreed targets were not being met. One factor resulting in delayed scans was porter availability. A recently introduced intervention is the e-portering application, which should save time for referrers requesting porters and allow patient tracking.
It is also worth educating porters, via e-mail bulletins, on the importance of priority scans, such as CT head following trauma. Furthermore, the findings of the audit were relayed to the radiology department to help improve reporting times and to the ED to re-emphasize prompt requesting of CT head scans when clinically indicated.
Keywords: Computed tomography, head injury, head injury guidelines
Insights of Strategies for Anticoagulation-Associated Hemorrhage
James Williams
Northwest Texas Healthcare System, Texas Tech University Health Science Center, Texas, USA
Background: Anticoagulants alter the physiology of the coagulation cascade, and hemorrhage is the most common complication of their use. As the prevalence of direct oral anticoagulant (DOAC) use increases, so does the incidence of hemorrhage, particularly life-threatening/critical site hemorrhage, for which immediate treatment is required. Emergency physicians must understand the difference among the agent-specific treatment strategies so that the outcomes can be optimized. No head-to-head trials of the agents used in the treatment of anticoagulated-associated hemorrhage have been performed. In addition, for a variety of reasons, some clinicians and institutions have designed a treatment strategy using agents off-label. We aim to understand the nuanced but significant clinical effects of these agents, through cross-comparing trials. Methods: A cross-comparison of three trials of agent-specific anticoagulation-associated hemorrhage treatment strategies was performed. Results: Kcentra was used for patients on Vitamin K antagonist, Praxbind–ReverseAD for patients on direct thrombin inhibitors, and Anexxa4 for patients on Xa inhibitors. The inclusion criteria for each trial were international normalized ratio >2, Glasgow coma scale (GCS) <7, >30cc intracranial hemorrhage, survive <3 days for Kcentra, uncontrolled/life threat bleed, <8 h for Praxbind, and Xa inhibitor <18 h, life threat, site GCS <7, and >60cc for Anexxa4. Mortality was 9.7% versus 4.6% at 45 days for Kcentra, 13% and 12% at 30 days for Praxbind, and 14% at 30 days for Anexxa4. Conclusion: To optimize outcomes of patients with anticoagulation-associated life-threatening/critical site hemorrhage, it is essential to rapidly reverse anticoagulation in addition to provide all supportive measures used in a nonanticoagulated patient. The Kcentra, Reverse-AD, and Andexxa-4 trials were designed to either replace or reverse a specific anticoagulant. 
Their inclusion and exclusion criteria and their efficacy and safety endpoints were as different as the mechanisms of action of the agents. It is essential to appreciate these differences so as not to presume that an agent will show the same efficacy and safety when used in a different, unstudied patient population.
Keywords: Anticoagulation, anticoagulant reversal, bleeding, hemorrhage
How Clean Is a Community Hospital Healthcare Provider's Stethoscope?
Sukaina Ali Alali, Ekta Shrestha1, Aswin Ratna Kansakar1, Amishi Parekh1, Shahriar Dadkhah1, W. Frank Peacock
Department of Emergency Medicine, Baylor College of Medicine, Houston, Texas, 1Presence Saint Francis Hospital, Evanston, Illinois, USA
Background: In the United States, nosocomial infections are estimated to cause 72,000 annual deaths. The stethoscope, although commonly used, is rarely reported as a potential vector. Our study aims to describe stethoscope contamination and the effect of self-reported cleaning practices among healthcare providers in a community hospital setting. Methods: Stethoscopes were collected at random times from healthcare providers and cultured per standard techniques. Providers answered a structured questionnaire about their cleaning practices. Differences in bacterial growth rates and the impact of profession, cleaning frequency, and prior sanitization were evaluated. Results: Of 104 cultured stethoscopes, 44% were from residents and medical students, of which 76% had bacterial growth, and 56% were from attendings, nurses, and respiratory therapists, of which 91.4% had growth (95% confidence intervals 0.62–0.86 and 0.81–0.96, respectively). Overall, 86.5% of providers reported a disinfection frequency compliant with the Centers for Disease Control and Prevention guidelines, but there was no statistically significant association between self-reported cleaning frequency or method and the presence of bacteria. Conclusion: Most stethoscopes are contaminated with bacteria, the presence of which was not affected by reported cleaning strategies.
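The confidence intervals reported above can be reproduced with a Wilson score interval for a binomial proportion. This is a sketch only: the abstract does not name its interval method, and the resident/student subgroup size of 46 is inferred from 44% of 104 stethoscopes.

```python
from math import sqrt

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# Assumed subgroup: ~46 resident/student stethoscopes, 35 (76%) with growth
lo, hi = wilson_ci(35, 46)
print(f"{lo:.2f}-{hi:.2f}")  # prints 0.62-0.86, matching the reported interval
```

The same function applied to the attending/nurse/therapist subgroup reproduces the 0.81–0.96 interval to within rounding.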
Keywords: Cleaning practices, disinfection, stethoscope, stethoscope contamination
Gastrointestinal Bleed: Lactate Use for Emergency Department Risk Stratification and Resource Utilization
Bryan F. Imhoff, Samantha J. DeArmon, Brandon Ricke, Seth Atchison, Alexandria M. Larson, Daniel Kolm, Nicholas P. Dodson, Edric K. Wong, Kelly Howe, Niaman Nazir, Lucas Lemar, Chad M. Cannon
University of Kansas Medical Center, Kansas City, Kansas, USA
Background: Emergency department (ED) visits for acute gastrointestinal (GI) bleed account for approximately 20,000 annual United States deaths. Gauging the severity of the bleed can be difficult, as initial vital signs and appearance can often be falsely reassuring. While there are multiple well-validated risk stratification tools available for GI bleeds, none are specific to the ED. This study evaluates lactate as a prognostic indicator for risk stratification and a predictor of in-hospital resource utilization. Methods: This study was a retrospective cohort chart review, spanning a 5-year period, at an urban, tertiary, academic medical center. Lactate levels, among other data, were obtained on patients aged ≥18 years who presented to the ED with a GI bleed. Because hypotensive patients with GI bleed are high risk and therefore not difficult to risk-stratify, the study focused on normotensive (systolic blood pressure ≥90 mmHg) patients; a total of 245 such patients presented during the study period. The primary outcomes were in-hospital mortality and intensive care unit (ICU) admission. Additional outcome variables included GI/interventional radiology (IR) intervention, blood product administration, and anticoagulant reversal administration during hospitalization. Statistical evaluation was performed using SAS V9.4 [Table 1]. Conclusion: The study demonstrated an association between elevated lactate and both mortality and increased hospital resource utilization. Utilization of ICU admission, blood product administration, and anticoagulation reversal rose across GI bleed patient subsets stratified by increasing lactate elevation. Surprisingly, the same increasing pattern among the subsets was not seen for GI or IR intervention.
Therefore, the degree of lactate elevation should be considered by the emergency provider during the disposition process as a predictor of the level of care and resource utilization that the GI bleed patient will need during hospitalization.
Keywords: Anticoagulation, blood product, gastrointestinal bleed, lactate, mortality
Warfarin Use Requiring Intervention in the Emergency Department: Acute Hemorrhage Concerns
Charles V. Pollack, Jr.1,2, W. Frank Peacock3, Alex Frost4, Steven H. Silber5, Richard A. Bernstein6, Geno Merli1, Babak S. Jahromi6, James D. Douketis7, Todd C. Villines8, Gregory J. Fermann9, Gregory J. Fiore10, John Fanikos11
1Department of Emergency Medicine, Thomas Jefferson University, Philadelphia, PA, 2Hospital Quality Foundation, Shrewsbury, NJ, 3Baylor College of Medicine, Houston, TX, 4Studymaker LLC, Boston, MA, 5New York Methodist Hospital, Brooklyn, NY, 6Northwestern University, Chicago, IL, 7McMaster University, Hamilton, ON, 8Uniformed Services University of the Health Sciences, Bethesda, MD, 9University of Cincinnati, Cincinnati, OH, 10Fiore Healthcare Advisors, Inc, Cambridge, MA, 11Brigham and Women's Hospital, Boston, MA, USA
Background: The objective of the study is to characterize the clinical and economic impact of clinicians' responses to major bleeding complications and preprocedural concerns for bleeding risk in patients treated with oral anticoagulants (OAC) who present to the emergency department (ED) or in the hospital with acute illness or injury. Methods: Multicenter (31 US hospitals) observational registry of patients on OAC (warfarin, the oral anti-Xa agents apixaban, betrixaban, edoxaban, and rivaroxaban, and the oral anti-IIa agent dabigatran) who presented with acute illness or injury. Inclusion criteria were hemorrhage or bleeding concern attributable to OAC therapy requiring transfusion, invasive management, or sophisticated/lengthy diagnostic exploration that would otherwise not be needed; more specified and detailed inclusion criteria were also applied. Results: A total of 564 subjects were enrolled between August 8, 2016, and January 30, 2018; the mean (standard deviation) age was 70.6 (13.1) years, and 53% were male. There was concomitant aspirin use in 46.5% and a concomitant P2Y12 agent in 8.4%. Overall, 85.1% were hospitalized, and 32.2% were admitted to an intensive care unit (ICU); those managed as outpatients had either lengthy ED stays or were placed under observation. Of the cohort, 405 (71%) were enrolled for acute hemorrhage and 159 (29%) for a warfarin-associated bleeding concern. Intravenous (IV) Vitamin K was administered to <75% of eligible patients, and of 129 patients receiving prothrombin complex concentrate (PCC), only two received three-factor treatment. Fresh frozen plasma (FFP) was still frequently given in the setting of warfarin-associated hemorrhage and bleeding concern with an elevated international normalized ratio. Bleeding concern patients had higher mortality (6.3%) than acute hemorrhage patients (0.002% overall, 1.1% for intracranial hemorrhage) but were also older with more comorbidities. Conclusion: OAC use at ED presentation can complicate and prolong care.
Warfarin-related hemorrhage and bleeding complications are often associated with advanced age, comorbidities, and concomitant antiplatelet use. Four-factor PCC is used far more than three-factor PCC, but FFP is still given to a large proportion of these patients. IV Vitamin K, while not a true reversal agent, is underutilized in these warfarin-treated patients. Bleeding concern has a similar impact on intensity of care as acute hemorrhage among warfarin-treated patients.
Keywords: Bleeding, hemorrhage, oral anticoagulant, Vitamin K
Cognitive Deficits in African Americans with Diabetes in the Emergency Department
Kristin L. Rising, Barry W. Rovner, Robin. J. Casten, Judd Hollander, Megan Kelley, Hailey Shughart, Melanie Chalfin, Giacomo Pazzaglia, Anna Marie Chang
Thomas Jefferson University, Philadelphia, PA, USA
Background: African Americans with diabetes (DM) seek emergency department (ED) care twice as often as Whites. About 40% of African Americans with DM have at least one ED visit annually, and 24% use the ED as a usual place of care, compared to 13% of Whites. DM impairs cognition, and the resulting cognitive deficits may compromise understanding and reporting of symptoms in the ED, as well as understanding and execution of discharge instructions. The purpose of this study is to describe the nature and extent of cognitive deficits among African American patients with DM seeking ED care. Methods: This is a cross-sectional study conducted at Thomas Jefferson University Hospital (TJUH) in Philadelphia. The patient population comprised African American adults (>35 years old) with type 2 DM and no prior history of dementia who sought care in the TJUH ED and consented to participate. Comprehensive clinical assessments were conducted, including cognitive testing using the short Montreal Cognitive Assessment (s-MoCA), which assesses memory, executive function, and language; scores range from 0 to 16, and higher scores indicate better function. Relationships between continuous variables (e.g., age, education, s-MoCA scores) were examined using Pearson product–moment correlations, and relationships with categorical variables (sex) using analysis of variance. Results: A total of 105 patients were enrolled. Mean age was 67.5 years (standard deviation [SD] 6.2; range 60–91), 80% (84/105) were women, and the mean s-MoCA score was 9.81 (SD 2.50; range 2–14). The s-MoCA score correlated negatively with age (r = −0.307) and positively with education (r = 0.349) and literacy (r = 0.531) (all P ≤ 0.001). Conclusion: Many older African Americans with DM who are seen in EDs have potentially unrecognized cognitive deficits.
These deficits should be considered when ED staff obtain histories of the medical events leading to ED visits and when providing discharge instructions, to ensure understanding and adherence. Further, deficits should be considered when determining patients' ability to manage their DM. Consideration should be given to implementing routine cognitive assessments for high-risk populations in the ED. Future studies should confirm the nature and extent of cognitive deficits in this population and determine whether cognitive function predicts clinical outcomes and return ED visits.
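The Pearson product–moment correlations reported above (e.g., r = −0.307 between s-MoCA and age) are defined as the covariance of two variables divided by the product of their standard deviations. A minimal sketch, using made-up illustrative data rather than the study's:

```python
from math import sqrt

def pearson_r(x: list[float], y: list[float]) -> float:
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Illustrative (invented) data: s-MoCA scores tending to fall with age
age = [60, 65, 70, 75, 80, 85, 90]
s_moca = [12, 11, 11, 10, 9, 8, 7]
print(round(pearson_r(age, s_moca), 3))  # strongly negative for this toy sample
```

Values near −1 or +1 indicate a strong linear relationship; the study's moderate coefficients (|r| of 0.3–0.5) indicate weaker but still significant associations.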
Keywords: Cognitive deficits, diabetes, emergency department
Gastrointestinal Bleed in the Emergency Department: Lactate versus Alternative Predictors of Mortality
Bryan F. Imhoff, Brandon Ricke, Seth Atchison, Alexandria M. Larson, Samantha J. DeArmon, Daniel Kolm, Nicholas P. Dodson, Edric K. Wong, Kelly Howe, Niaman Nazir, Lucas Lemar, Chad M. Cannon
University of Kansas Medical Center, Kansas City, Kansas, USA
Introduction: Emergency department (ED) visits for acute gastrointestinal (GI) bleed account for approximately 20,000 annual United States deaths. Gauging the severity of a bleed can be difficult. While there are multiple well-validated risk stratification tools available for GI bleeds, none are specific to the ED. This study compares lactate to other potential ED-relevant predictors of mortality. Methods: The study was a retrospective cohort chart review, spanning a 5-year period, conducted at an urban, tertiary, academic hospital. Lactate levels and other data were obtained on adult patients who presented to the ED with a GI bleed. Because hypotensive patients with GI bleed are high risk and therefore not difficult to risk-stratify, this study focused on normotensive (systolic blood pressure ≥90 mmHg) patients; a total of 245 such patients presented during the study period. The primary outcome of the study was in-hospital mortality. Independent variables evaluated against mortality included the Model for End-Stage Liver Disease (MELD) score, history of dialysis, history of liver disease, initial presenting hemoglobin, shock index (SI), current anticoagulation use, current antiplatelet use, and age. Statistical evaluation was performed using SAS V9.4 [Table 1]. Conclusion: This study demonstrates an association of several variables, including lactate, MELD score, and history of liver disease, with increased mortality. Further, there was no statistically significant relationship between mortality and hemoglobin, SI, anticoagulant use, antiplatelet use, or age. This study suggests that, in the nonhypotensive ED GI bleed population, lactate is comparable to the MELD score for predicting mortality.
Keywords: Emergency department, gastrointestinal bleed, lactate, mortality
American Association Guidelines for Ascites Management: Clinician Adherence
R. Isaacs, E. Aligholizadeh, R. G. Wilkerson
Department of Emergency Medicine, University of Maryland School of Medicine, Baltimore, MD, USA
Background: In 2012, the American Association for the Study of Liver Diseases (AASLD) released a practice guideline for the management of ascites. This study assesses compliance with the recommendations that pertain to acute management and would be of interest in improving the quality of care of these patients. Methods: A retrospective, single-center medical record study was performed on patients with a history of cirrhosis and ascites who underwent a paracentesis at a large, urban academic hospital from January 1, 2016, to June 30, 2018. Only the first paracentesis of the hospital visit was included, and transfers from outside hospitals were excluded. Data were collected regarding medical history, paracenteses performed, and documented management steps. These data were analyzed to determine clinician adherence to the AASLD guidelines and barriers to compliance. The AASLD recommendations most pertinent to acute care included bedside inoculation of ascitic fluid into blood culture bottles; albumin infusion of 6–8 g/l of fluid removed for paracenteses >5 l; empiric antibiotic therapy if the polymorphonuclear (PMN) leukocyte count is ≥250 cells/mm3; and a follow-up paracentesis after 48 h of treatment if PMN ≥250 cells/mm3, if the culture is positive for atypical organism(s), or if there is an atypical response to treatment. Results: A total of 100 paracentesis procedures performed in 100 unique patients were included in the study. In none (0%) of the procedures were all the guidelines of interest followed completely. The least followed recommendation was bedside inoculation (1%). Of the 11 patients with paracenteses >5 l, 7 (63.6%) received albumin at 6–8 g/l of ascitic fluid removed. All 9 (100%) patients with PMN ≥250 cells/mm3 received appropriate antibiotic therapy; of those 9, 5 (55.6%) had a follow-up paracentesis performed. There was 14% mortality from liver-related complications during the index hospitalization.
Conclusion: Overall, adherence to the AASLD guidelines for the care of patients with ascites due to cirrhosis is poor. In particular, clear documentation of bedside inoculation of ascitic fluid into blood culture bottles is lacking. Further studies are indicated to determine how to improve adherence to this procedure and to identify strategies for improving overall compliance.
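The acute-care recommendations audited in this study can be encoded as a simple rule checker. This is an illustrative sketch only: the data class and field names are hypothetical, and the follow-up rule is simplified to the PMN criterion alone (omitting the atypical-organism and atypical-response triggers).

```python
from dataclasses import dataclass

@dataclass
class Paracentesis:
    bedside_inoculation: bool       # fluid inoculated into blood culture bottles at bedside
    volume_removed_l: float
    albumin_given_g: float
    pmn_per_mm3: int
    empiric_antibiotics: bool
    followup_paracentesis_48h: bool

def aasld_deviations(p: Paracentesis) -> list[str]:
    """Return the guideline recommendations (as summarized in the abstract) not met."""
    issues = []
    if not p.bedside_inoculation:
        issues.append("no bedside inoculation into blood culture bottles")
    # Albumin 6-8 g per liter removed applies only to large-volume (>5 l) paracentesis
    if p.volume_removed_l > 5 and p.albumin_given_g < 6 * p.volume_removed_l:
        issues.append("albumin below 6-8 g/l removed for large-volume paracentesis")
    if p.pmn_per_mm3 >= 250:
        if not p.empiric_antibiotics:
            issues.append("no empiric antibiotics despite PMN >=250 cells/mm3")
        if not p.followup_paracentesis_48h:
            issues.append("no follow-up paracentesis after 48 h of treatment")
    return issues

case = Paracentesis(False, 6.0, 25.0, 300, True, False)
print(aasld_deviations(case))  # flags inoculation, albumin dose, and follow-up tap
```

A fully adherent case (bedside inoculation done, adequate albumin, antibiotics and follow-up when indicated) returns an empty list.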
Keywords: American association guidelines, ascites, ascites management, clinical adherence
Flumazenil in Zopiclone Overdose
Teo Zhongyang, Juliana Poh
Department of Emergency Medicine, Singapore General Hospital, Singapore
Background: Zopiclone is commonly prescribed for insomnia. It is a central nervous system (CNS) depressant classified as a cyclopyrrolone, a nonbenzodiazepine hypnotic agent with pharmacological properties similar to benzodiazepines. Currently, there are no guidelines for the management of zopiclone overdose. Methods: We present a case of a 54-year-old female with a history of depression who presented with a Glasgow coma scale score of E3V1M5. Her family reported that she had overdosed on 20 tablets of zopiclone prescribed by her psychiatrist. Results: The patient was seen in the high-acuity area. She was hemodynamically stable and was given supplemental oxygen with end-tidal capnography monitoring. Laboratory investigations including a full blood count, urea and electrolytes, liver function tests, and serum paracetamol and salicylate levels were performed. Intravenous flumazenil 0.2 mg was given over 10 s, and a significant improvement in her conscious level was observed within 30 s. She was able to report her name, date of birth, and the medication she had taken. However, she reverted to a somnolent state after 5 min. A second dose of intravenous flumazenil 0.2 mg was administered with a similar but more prolonged result; she remained lucid for 10 min. She was subsequently admitted for observation after toxicology consultation. Conclusion: There are anecdotal reports of similar responses to flumazenil. Treatment of toxic overdoses generally focuses on airway management, hemodynamic support, and generic measures such as activated charcoal or gastric lavage. Flumazenil is a competitive benzodiazepine receptor antagonist indicated for the reversal of the sedative effects of benzodiazepines. Z drugs such as zopiclone share gamma-aminobutyric acid receptors with benzodiazepines.
Zopiclone toxicity manifested as CNS depression and somnolence in this patient, and the transient response to flumazenil is consistent with the reversal of a drug with similar properties to benzodiazepines. Hence, treatment with flumazenil should be considered in patients who present similarly.
Keywords: Benzodiazepines, flumazenil, insomnia, Zopiclone overdose
Major Gastrointestinal Bleeding Treated with Four-Factor Prothrombin Complex Concentrate: Resource Utilization
Charles E. Mahan, Christina L. Wassel1, Julie A. Gayle1, Jill Dreyfus1, Kelly McNeil-Posey2, Belinda Lovelace2
University of New Mexico College of Pharmacy, Albuquerque, NM, 1Premier Healthcare Solutions, Charlotte, NC, 2Portola Pharmaceuticals, South San Francisco, CA, USA
Background: Oral factor Xa inhibitors (oFXai) can lead to life-threatening complications such as major gastrointestinal bleeding (GIB). Four-factor prothrombin complex concentrate (4F-PCC) was developed as an antidote to warfarin in the presence of an acute major bleed. Although 4F-PCC has been used off-label for the reversal of oFXai, it does not directly target factor Xa, and its efficacy in this setting remains unproven. This study examined healthcare resource utilization (HRU) in the setting of 4F-PCC treatment for oFXai-related GIB hospitalizations in the Premier Healthcare Database (PHD). Methods: Adults with atrial fibrillation or venous thromboembolism, prior oFXai use, and 4F-PCC administration during a hospitalization for GIB were identified in the PHD (January 2014–December 2018). Descriptive statistics were used to assess readmissions, length of stay (LOS), intensive care unit (ICU) LOS, and costs. Results: During the study period, 1846 patients with GIB were identified. The mean (±standard deviation [SD]) age was 74 (±12) years, and 82% had Medicare as the primary payer. Of these, 19%, 29%, and 36% were readmitted by 30, 60, and 180 days, respectively. The mean LOS and ICU LOS were 9.4 and 5.1 days, respectively. Mean (±SD) total hospitalization costs were $38,939 (±49,978), including department-level costs of $44,764 (±406,876) for the ICU and $20,790 (±340,467) for pharmacy. Conclusion: To date, this is the first study examining hospital-level HRU of 4F-PCC-treated oFXai-related GIB in a United States hospital discharge database. Readmissions, LOS and ICU LOS, and total costs suggest a high burden to hospitals in managing these life-threatening bleeds.
Keywords: Four-factor prothrombin complex concentrate, gastrointestinal bleed, healthcare cost, oral factor Xa inhibitors
Major Traumatic Bleeds Treated with Four-Factor Prothrombin Complex Concentrate: Resource Utilization
Charles E. Mahan, Christina L. Wassel1, Julie A. Gayle1, Jill Dreyfus1, Anne Beaubrun2, Belinda Lovelace2
University of New Mexico College of Pharmacy, Albuquerque, NM, 1Premier Healthcare Solutions, Charlotte, NC, 2Portola Pharmaceuticals, South San Francisco, CA, USA
Background: Oral factor Xa inhibitors (oFXai) can lead to severe complications such as intracranial hemorrhage, which are often trauma related. Four-factor prothrombin complex concentrate (4F-PCC) was developed as an antidote to warfarin in the presence of an acute major bleed. Although 4F-PCC has been used off-label for the reversal of oFXai, it does not directly target factor Xa, has no relevant impact on anti-factor Xa levels, and its efficacy in this setting remains unproven. This study examined healthcare resource utilization (HRU) for traumatic bleeds in the setting of 4F-PCC management of oFXai-related major bleeding hospitalizations in the Premier Healthcare Database (PHD). Methods: Adults with atrial fibrillation or venous thromboembolism, prior oFXai use, and 4F-PCC administration during a hospitalization for major bleeds due to trauma were identified in the PHD (January 2014–December 2018), a United States hospital database. Descriptive statistics were used to assess costs, length of stay (LOS), intensive care unit (ICU) LOS, and readmissions. Results: During the study period, 3762 discharges associated with oFXai-related major bleeds were identified, 151 of them due to trauma. Among these 151 patients with trauma-related bleeds, the mean (±standard deviation [SD]) age was 76 (±14) years, and 81% had Medicare as the primary payer. Mean (±SD) total hospitalization costs were $41,443 (±46,290), with ICU costs of $34,643 (±41,120) and 4F-PCC costs of $7,675 (±9,829). Conclusion: To date, this is the first study examining hospital-level HRU of 4F-PCC-treated oFXai-related major trauma-related bleeds in a US hospital discharge database. Costs, LOS and ICU LOS, and readmissions suggest a high burden to hospitals in managing these life-threatening bleeds.
Keywords: Four-factor prothrombin complex concentrate, healthcare cost, oral factor Xa inhibitors, traumatic bleed
Patients with Severe Illness and Impaired Functional Status: Initial Effects of Methylnaltrexone
W. Frank Peacock, Neal E. Slatkin1,2, Robert J. Israel3, Nancy Stambler4
Baylor College of Medicine, Houston, TX, 1University of California Riverside, School of Medicine, Riverside, CA, 2Salix Pharmaceuticals, 3Bausch Health US, LLC, Bridgewater, NJ, 4Progenics Pharmaceuticals, Inc., New York, NY, USA
Background: Opioid-induced constipation (OIC) is commonly seen in the emergency department (ED) in patients receiving opioids for pain. We studied the efficacy and safety of a single methylnaltrexone (MNTX) dose for OIC in advanced-illness patients who were refractory to laxatives and who had varying degrees of baseline functional status. Methods: This post hoc analysis pooled data from three placebo-controlled, double-blind studies. Study 1 (NCT00401362) compared subcutaneous MNTX 0.15 or 0.30 mg/kg versus placebo, study 2 (NCT00402038) compared subcutaneous MNTX 0.15 mg/kg versus placebo, and study 3 (NCT00672477) compared body weight-based subcutaneous MNTX 8 mg (38 kg to <62 kg) or 12 mg (≥62 kg) versus placebo. Baseline demographics and disease/treatment characteristics were evaluated. End points were rescue-free laxation (RFL) within 4 or 24 h after the first dose based on the baseline patient functional level (assessed by the Eastern Cooperative Oncology Group [ECOG] or the World Health Organization [WHO] performance status scales), time to RFL, pain intensity, and treatment-emergent adverse events (TEAEs). Results: There were 518 patients (MNTX = 281; placebo = 237). Mean (standard deviation) age was 66.2 (13.8) years for MNTX and 65.8 (14.4) years for placebo, and 50% were men. The most frequent primary diagnosis was cancer (MNTX = 70.5%; placebo = 66.2%). A single dose of MNTX increased RFL rates (95% confidence interval for the proportion) within 4 h regardless of baseline ECOG/WHO status (ECOG/WHO 0–2: placebo = 20% [12.3, 28.1] vs. MNTX = 65.5% [56.6, 74.3]; ECOG/WHO 3–4: placebo = 13% [7.4, 18.7] vs. MNTX = 58.8% [51.4, 66.2]). Relative to placebo, MNTX use reduced the estimated time to first RFL, with most MNTX-treated patients estimated to achieve an RFL within 2 h. No changes in predose to postdose pain intensity scores were detected.
The percentage of patients with at least 1 TEAE declined from day 1 (placebo = 20.7%; MNTX = 41.6%) to day 2 (placebo = 10.8%; MNTX = 12.3%); most TEAEs were gastrointestinal (GI) in nature. Conclusion: A single dose of MNTX is effective and safe for rapid relief of OIC in laxative-refractory advanced-illness patients with varying degrees of functional status who may present in the ED. MNTX does not affect the efficacy of opioid analgesia and is well tolerated with transient GI TEAEs, most of which may be associated with laxation.
Keywords: Advanced illness, constipation, methylnaltrexone, opioid-induced constipation
The Safety of Interrogating Pacemakers and Defibrillators with Mismatched Read-Only Interrogators
Tinh Le, James Neuenschwander1, Mackenzie Sankoe2, Parris Miller2, Hana Le2, Parker Cordial3, Coulter Wilson2, Kaitlyn Cedoz4, Brian Hiestand5, W. F. Peacock6
Case Western Reserve University School of Medicine, Cleveland, 1Genesis Healthcare Systems, Zanesville, 2The Ohio State University, Columbus, OH, 3Johns Hopkins University, Baltimore, MD, 4University of Alabama School of Medicine, Birmingham, AL, 5Wake Forest School of Medicine, Winston Salem, NC, 6Baylor College of Medicine, Houston, TX, USA
Background: Concern exists that interrogating pacemakers and internal cardioverter defibrillators (ICDs), also known as cardiac implantable electronic devices (CIEDs), with the wrong or “mismatched” read-only interrogator may cause device malfunction. Our objective was to determine whether intentionally pairing a specific company's interrogator with another manufacturer's CIED could result in CIED malfunction. Methods: Nonimplanted CIEDs from each of the three major device manufacturers were interrogated: Abbott Laboratories (Lake Bluff, IL), Boston Scientific (Marlboro, MA), and Medtronic Plc (Minneapolis, MN). A mix of new and old pacemakers and ICDs was used to reproduce the cohort of devices that an emergency physician might encounter. Devices were interrogated for 2 min with a mismatched read-only interrogator and then evaluated with the correct programmer to identify any induced malfunction; this cycle was repeated with the other mismatched interrogator. Evaluation of implanted CIEDs followed. Consenting patients at an elective electrophysiology clinic evaluation were interrogated with a single mismatched interrogator (randomized as to which mismatch was used) for 2 min, and CIED function was then evaluated by clinic staff blinded to the mismatched interrogator used. Results: A total of 465 CIED interrogations were performed, of which 180 were mismatched interrogations: 75 nonimplanted devices (25 from each manufacturer) were each interrogated with both mismatched interrogators (150 interrogations), and 30 implanted devices (10 from each manufacturer) were each interrogated once. Patients' characteristics were as follows: average age 71.6 (14.7) years; 16 (53%) were male; 24 (80%) had pacemakers, 4 (13%) combined pacemaker/ICDs, and 2 (7%) ICDs only. No interrogation (0/180; 0%, 97.5% confidence interval 0%–2%) caused device malfunction. Conclusion: We found no evidence of device malfunction following CIED interrogation with mismatched read-only interrogators.
This suggests that interrogating a CIED of unknown brand with any read-only interrogator is highly unlikely to result in patient harm. This knowledge can be used to improve the efficiency of care provided to patients with CIEDs of unknown manufacturer.
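The reported 97.5% confidence interval of 0%–2% for zero malfunctions in 180 interrogations is consistent with an exact (Clopper-Pearson) upper bound, which for zero events reduces to a closed form. This sketch assumes that method was used, which the abstract does not state:

```python
def zero_event_upper_bound(n: int, alpha: float = 0.025) -> float:
    """One-sided exact binomial upper bound when 0 events occur in n trials.

    For zero successes, the Clopper-Pearson upper limit simplifies to
    1 - alpha**(1/n), the exact form behind the familiar "rule of three".
    """
    return 1 - alpha ** (1 / n)

print(f"{zero_event_upper_bound(180):.1%}")  # ≈ 2.0% for 0/180 malfunctions
```

In other words, even with no observed failures, the data cannot exclude a true malfunction rate of up to about 2%.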
Keywords: Cardioverter-defibrillator, implantable electronic device, interrogator, mismatched interrogator, pacemaker
Evaluating Educational Models for Teaching Loop Drainage Technique
Lauren Rosenblatt, Samantha King, Michele Callahan1, R. Gentry Wilkerson1
Department of Emergency Medicine, University of Maryland Medical Center, 1Department of Emergency Medicine, University of Maryland School of Medicine, Baltimore, MD, USA
Background: Emergency department (ED) visits for cutaneous abscesses are increasing. It is therefore important for healthcare providers to be proficient in identifying and managing them, including with techniques such as needle aspiration, incision and drainage, and the loop drainage technique (LDT). The LDT is relatively new, with limited resources available for learning how it is performed. A number of abscess simulators have been described in the literature; however, none have been compared with regard to cost-effectiveness, feasibility, and clinical applicability for the LDT. The objective of the current study was to compare three different simulation models to determine which offers the highest fidelity for learning the LDT. Methods: This was a prospective survey study of a convenience sample of emergency medicine residents at a large urban academic center. Residents volunteered to participate during a quarterly scheduled cadaver and simulation session. After a short, self-directed review of the LDT using a prepared informational sheet, each volunteer performed the LDT on a simulated abscess on a cadaver, a commercial abscess pad, and a homemade gelatin mold. Participants completed pre- and post-study surveys to assess each model for realism, ease of use, and overall preference for learning the LDT. Results: Of 57 residents, 28 were available and participated in the 1-day simulation. The majority (57.1%, P < 0.01) found the cadaver to be the preferred model for learning the LDT and reported that it provided the most realistic physical examination for an abscess (P < 0.01). No model was superior in simulating ultrasound findings. Before participation, no residents felt proficient (Likert ranking 5 out of 5) in performing the LDT, and only 14.2% said they would use it. On the post-survey, however, 46.4% of the residents felt proficient and 78.6% reported that they would use the LDT (P < 0.01).
Conclusions: Simulation is an effective educational tool, both for learning new skills and improving upon pre-existing procedural competency. Residents found that cadavers provided the most realistic physical examination findings, and the majority felt that the use of a cadaveric-based model was the preferred avenue to learn the LDT. However, cadavers are not always available or affordable, an important factor when considering feasibility in various educational settings.
Keywords: Cutaneous abscess, loop drainage, simulation model
Dyspnea in the Emergency Room: Contribution of the Transthoracic Echography
I. Gueye, S. Jidane, M. Lekhlit, T. Nebhani, N. Chouaib, A. Belkouch, S. Zidouh, L. Belyamani
Inter Army Medical Center of Ziguinchor, Senegal
Introduction: Acute dyspnea is a complex symptom and a frequent reason for consultation in the emergency room. A meticulous history; a bilateral, symmetrical, complete, and comparative clinical examination; and additional biological and morphological investigations allow the clinician to identify the etiology. Methods: We carried out a prospective, observational, descriptive study from May 1 to July 1, 2019, using a Chison ultrasound machine. The data were analyzed using SPSS 23 software. Correlations between nominal transthoracic echocardiography variables and clinical-biological parameters were also analyzed, taking the beginner or senior status of the operator as a control variable. Results: The sex ratio was 1.65, and the overall average age was 68.6 years (standard deviation 13.9 years). Cardiac causes were dominated by decompensated left ventricular failure (26.6% of cases); acute coronary syndromes represented 3% of cases. Among pleuropulmonary causes, severe hypoxemic pneumonia came first (23.5% of cases), followed by decompensated chronic obstructive pulmonary disease (11.3% of cases). We noted a strong correlation between transthoracic ultrasound findings and the clinical-biological parameters studied. Conclusion: This study shows that transthoracic ultrasound can make a decisive contribution to the etiological approach in patients admitted for acute dyspnea, regardless of whether the operator is a beginner or experienced.
Keywords: Dyspnea, emergency room, transthoracic echography
Dr. Sukaina Ali Alali
Baylor College of Medicine, Houston, TX
Source of Support: None, Conflict of Interest: None
[Table 1], [Table 2]