Mar 27, 2015

The Negative Prognostic Impact of a First Ever Episode of Spontaneous Bacterial Peritonitis in Cirrhosis and Ascites.

J Clin Gastroenterol. 2015 Mar 24;

Authors: Ra G, Tsien C, Renner EL, Wong FS

Abstract
BACKGROUND: The prognostic impact of the first ever episode of spontaneous bacterial peritonitis (SBP) on patient outcomes is not well described. Our aim was to compare the clinical outcomes of cirrhotic patients with ascites, with or without a first episode of SBP.
METHODS: Consecutive patients with cirrhosis and ascites were prospectively enrolled. Demographics, liver and renal function, and hemodynamics were documented at baseline, at resolution of SBP, and thereafter at 4-monthly intervals for 12 months. Complications of cirrhosis and survival were noted.
RESULTS: Twenty-nine cirrhotic patients with a first ever episode of SBP (group A) and 123 control patients (slightly younger but of similar gender distribution) who never had SBP (group B) were enrolled. At SBP diagnosis, group A had worse liver and renal function (Model for End-Stage Liver Disease [MELD] score: 21.1±10.6 vs. 14.4±5.0), lower serum sodium concentrations, and a more hyperdynamic circulation compared with group B (all P<0.001). SBP resolution resulted in improvement in all measures to baseline levels. During follow-up, group A required more frequent hospital admissions than group B (58% vs. 43%), developed more cirrhotic complications, including further SBP (31% vs. 3%*), hyponatremia (12% vs. 0.8%*), acute kidney injury (50% vs. 23%*), hepatorenal syndrome type 1 (46% vs. 7%*), and liver transplantation (62% vs. 30%*), and had worse overall 1-year survival (38% vs. 70%*) (*P<0.05).
CONCLUSIONS: A first SBP episode is commonly followed by multiple complications and an overall worse prognosis. Consideration should be given to assessing cirrhotic patients for liver transplantation after the first episode of SBP.
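The MELD scores cited above come from the standard formula used in transplant allocation; a minimal sketch in Python (the floor/cap conventions are the commonly used ones, not taken from this abstract):

```python
import math

def meld(bilirubin_mg_dl, inr, creatinine_mg_dl):
    """Classic (pre-MELD-Na) MELD score.

    By convention, lab values below 1.0 are set to 1.0 and
    creatinine is capped at 4.0 mg/dL before taking logs.
    """
    bili = max(bilirubin_mg_dl, 1.0)
    inr = max(inr, 1.0)
    creat = min(max(creatinine_mg_dl, 1.0), 4.0)
    score = (3.78 * math.log(bili)
             + 11.2 * math.log(inr)
             + 9.57 * math.log(creat)
             + 6.43)
    return round(score)

# All labs normal gives the floor value:
print(meld(1.0, 1.0, 1.0))   # 6
# Worse labs land in the range of group A's mean (~21):
print(meld(3.0, 2.0, 2.0))   # 25
```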

PMID: 25811112 [PubMed - as supplied by publisher]



Mar 27, 2015

Update on the Diagnosis and Management of Helicobacter pylori Infection in Adults.

J Clin Gastroenterol. 2015 Mar 24;

Authors: Patel KA, Howden CW

Abstract
Treatment of Helicobacter pylori infection is becoming increasingly challenging, due largely to the rising rates of antimicrobial resistance and to the relative complexity of treatment regimens. If a reliable test to assess the antimicrobial sensitivity/resistance of H. pylori were readily available, treatment would be more focused and, presumably, more effective. However, antimicrobial sensitivity testing is difficult to obtain in most parts of the United States. Therefore, physicians have to rely on clinical judgment in selecting treatment regimens for their infected patients. The aims of this review are to summarize recent treatment recommendations and to examine available evidence for how we might improve on our current treatment selections. Information in this review is directed primarily toward physicians practicing in the United States.

PMID: 25811119 [PubMed - as supplied by publisher]



Mar 27, 2015

Predicting Risk of Endocarditis Using a Clinical Tool (PREDICT): Scoring System to Guide Use of Echocardiography in the Management of Staphylococcus aureus Bacteremia.

Clin Infect Dis. 2015 Mar 25;

Authors: Palraj BR, Baddour LM, Hess EP, Steckelberg JM, Wilson WR, Lahr BD, Sohail MR

Abstract
BACKGROUND: Infective endocarditis (IE) is a serious complication of Staphylococcus aureus bacteremia (SAB). There is limited clinical evidence to guide the use of echocardiography in the management of SAB cases.
METHODS: Baseline and 12-week follow-up data of all adults hospitalized at our institution with SAB from 2006 to 2011 were reviewed. Clinical predictors of IE were identified using multivariable logistic regression analysis.
RESULTS: Of the 757 patients screened, 678 individuals with SAB (24% community-acquired, 56% healthcare-associated, and 20% nosocomial) met study criteria. Eighty-five patients (13%) were diagnosed with definite IE within 12 weeks of initial presentation based on modified Duke criteria. The proportion of patients with IE was 22% (36/166) in community-acquired SAB, 11% (40/378) in community-onset healthcare-associated SAB, and 7% (9/136) in nosocomial SAB. Community-acquired SAB, presence of a cardiac device, and prolonged bacteremia (≥72 h) were identified as independent predictors of IE in multivariable analysis. Two scoring systems, "Day 1" (SAB diagnosis day) and "Day 5" (when day 3 culture results are known), were derived based on the presence of these risk factors, weighted in magnitude by the corresponding regression coefficients. A score of ≥4 for the Day 1 model had a specificity of 96% and a sensitivity of 21%, whereas a score of <2 for the Day 5 model had a sensitivity of 98.8% and a negative predictive value (NPV) of 98.5%.
CONCLUSIONS: We propose two novel scoring systems to guide the use of echocardiography in SAB cases. Larger prospective studies are needed to validate the classification performance of these scoring systems.
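The sensitivity, specificity, and NPV quoted for the two cutoffs are standard 2×2-table quantities; a minimal sketch of how they are computed (the counts below are illustrative only, not the study's data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 table
    (counts of true/false positives and negatives)."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Illustrative counts: if 84 of 85 IE cases scored at or above a
# cutoff, sensitivity would be near the reported 98.8%.
m = diagnostic_metrics(tp=84, fp=200, fn=1, tn=393)
print(round(m["sensitivity"] * 100, 1))  # 98.8
print(round(m["npv"] * 100, 1))          # 99.7
```

A high-sensitivity/high-NPV cutoff like the Day 5 "<2" rule is the natural rule-out tool: patients scoring below it are very unlikely to have IE, which is what lets the score triage echocardiography use.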

PMID: 25810284 [PubMed - as supplied by publisher]



Mar 27, 2015

A multifaceted hospitalist quality improvement intervention: Decreased frequency of common labs.

J Hosp Med. 2015 Mar 21;

Authors: Corson AH, Fan VS, White T, Sullivan SD, Asakura K, Myint M, Dale CR

Abstract
PURPOSE: Common labs such as a daily complete blood count or a daily basic metabolic panel represent possible waste and have been targeted by professional societies and the Choosing Wisely campaign for critical evaluation. We undertook a multifaceted quality-improvement (QI) intervention in a large community hospitalist group to decrease unnecessary common labs.
METHODS: The QI intervention was composed of academic detailing, audit and feedback, and transparent reporting of the frequency with which common labs were ordered as daily within the hospitalist group. We performed a pre-post analysis, comparing a cohort of patients during the 10-month baseline period before the QI intervention and the 7-month intervention period. Demographic and clinical data were collected from the electronic medical record. The primary endpoint was number of common labs ordered per patient-day as estimated by a clustered multivariable linear regression model clustering by ordering hospitalist. Secondary endpoints included length of stay, hospital mortality, 30-day readmission, blood transfusion, amount of blood transfused, and laboratory cost per patient.
RESULTS: The baseline (n = 7824) and intervention (n = 5759) cohorts were similar in their demographics, though the distribution of primary discharge diagnosis-related groups differed. At baseline, a mean of 2.06 (standard deviation 1.40) common labs were ordered per patient-day. Adjusting for age, sex, and principal discharge diagnosis, the number of common labs ordered per patient-day decreased by 0.22 (10.7%) during the intervention period compared to baseline (95% confidence interval [CI], 0.11 to 0.34; P < 0.01). There were nonsignificant reductions in hospital mortality in the intervention period compared to baseline (2.2% vs 1.8%, P = 0.1), as well as in the volume of blood transfused in patients who received a transfusion (127.2 mL decrease; 95% CI, -257.9 to 3.6; P = 0.06). No effect was seen on length of stay or readmission rate. The intervention decreased hospital direct costs by an estimated $16.19 per admission, or $151,682 annualized (95% CI, $119,746 to $187,618).
CONCLUSION: Implementation of a multifaceted QI intervention within a community-based hospitalist group was associated with a significant, but modest, decrease in the number of ordered lab tests and hospital costs. No effect was seen on hospital length of stay, mortality, or readmission rate. This intervention suggests that a community-based hospitalist QI intervention focused on daily labs can be effective in safely reducing healthcare waste without compromising quality of care. Journal of Hospital Medicine 2015. © 2015 Society of Hospital Medicine.
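The headline figures above are simple arithmetic on the reported means; a quick check (the annual admission volume is inferred from the two reported cost figures, not stated in the abstract):

```python
# Relative reduction in common labs per patient-day
baseline_labs_per_day = 2.06
absolute_reduction = 0.22
pct = absolute_reduction / baseline_labs_per_day * 100
print(round(pct, 1))  # 10.7, matching the reported figure

# Annualized savings divided by per-admission savings implies the
# admission volume over which the estimate was scaled:
per_admission = 16.19
annualized = 151_682
print(round(annualized / per_admission))  # ~9369 admissions/year
```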

PMID: 25809958 [PubMed - as supplied by publisher]



Mar 27, 2015

Medical management of patients on clozapine: A guide for internists.

J Hosp Med. 2015 Mar 23;

Authors: Lundblad W, Azzam PN, Gopalan P, Ross CA

Abstract
Clozapine was approved by the US Food and Drug Administration in 1989 for the management of treatment-resistant schizophrenia, and has since proven to reduce symptom burden and suicide risk, increase quality of life, and reduce substance use in individuals with psychotic disorders. Nevertheless, clozapine's psychiatric benefits have been matched by its adverse effect profile. Because they are likely to encounter medical complications of clozapine during admissions or consultations for other services, hospitalists are compelled to maintain an appreciation for these iatrogenic conditions. The authors outline common (eg, constipation, sialorrhea, weight gain) and serious (eg, agranulocytosis, seizures, myocarditis) medical complications of clozapine treatment, with internist-targeted recommendations for management, including indications for clozapine discontinuation. Journal of Hospital Medicine 2015. © 2015 Society of Hospital Medicine.

PMID: 25809850 [PubMed - as supplied by publisher]



Mar 27, 2015

Meta-analysis of Randomized Controlled Trials of Genotype-guided versus Standard Dosing of Warfarin.

Chest. 2015 Mar 26;

Authors: Dahal K, Sharma SP, Fung E, Lee J, Moore JH, Unterborn JN, Williams SM

Abstract
Introduction: Warfarin is a widely prescribed anticoagulant, and its effect depends on various patient factors, including genotype. Randomized controlled trials (RCTs) comparing genotype-guided dosing (GD) of warfarin with standard dosing (SD) have shown mixed efficacy and safety outcomes. We performed a meta-analysis of all published RCTs comparing GD versus SD in adult patients with various indications for warfarin use.
Methods: We searched MEDLINE, EMBASE, Cochrane CENTRAL, and relevant references for English-language RCTs (inception through March 2014). We performed the meta-analysis using a random-effects model.
Results: Ten RCTs with a total of 2505 patients were included in the meta-analysis. GD compared with SD resulted in a similar percent time in therapeutic range (%TTR) at ≤1 month of follow-up [44% vs. 44.5%; mean difference (MD): -0.52; 95% confidence interval (CI): -3.15 to 2.10; P = 0.70] and a higher %TTR at >1 month of follow-up [62.9% vs. 56.6%; MD: 6.35 (1.76 to 10.95); P = 0.007], a trend toward a lower risk of major bleeding at ≤1 month of follow-up [risk ratio (RR): 0.46 (0.19 to 1.11); P = 0.08] and a lower risk of major bleeding at >1 month of follow-up [RR: 0.34 (0.16 to 0.74); P = 0.006], and a shorter time to maintenance dose (TMD) [23 days vs. 33 days; MD: -9.54 days (-18.10 to -0.98); P = 0.03], but had no effect on INR >4.0, non-major bleeding, thrombotic outcomes, or overall mortality.
Conclusion: In the first month of genotype-guided warfarin therapy, compared with standard dosing, there were no improvements in %TTR, INR >4.0, major or minor bleeding, thromboembolism, or all-cause mortality. TMD was shorter, however, and after one month %TTR and major bleeding incidence improved, making this a cost-effective strategy in patients requiring longer anticoagulation.
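The random-effects pooling step used in meta-analyses like this one is commonly the DerSimonian-Laird estimator; a minimal sketch (the effect sizes and variances below are illustrative, not the trial data):

```python
def dersimonian_laird(effects, variances):
    """Pool per-study effect estimates under a random-effects model
    using the DerSimonian-Laird between-study variance estimator."""
    k = len(effects)
    w = [1.0 / v for v in variances]            # inverse-variance weights
    sw = sum(w)
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sw
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)          # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    var = 1.0 / sum(w_star)                     # variance of pooled estimate
    return pooled, var, tau2

# Two illustrative studies with mean differences 0.1 and 0.3:
pooled, var, tau2 = dersimonian_laird([0.1, 0.3], [0.01, 0.01])
print(round(pooled, 3), round(tau2, 3))  # 0.2 0.01
```

With equal study variances the pooled estimate is just the mean of the effects; the random-effects weights matter when study precision differs and heterogeneity (tau-squared) is nonzero.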

PMID: 25811981 [PubMed - as supplied by publisher]
