I would like the opinion of practitioners and research people who are
dosing cystic fibrosis patients with aminoglycosides. My main area of
interest is conventional dosing vs. extended-interval dosing.
At present we are following the guidelines from the CHEST 1997;
112:1208-13 article, though we have modified the concept somewhat.
These modifications and our protocol for sampling of levels cause me
some concern as a practicing pharmacist.
Modifications:
- We start all cystic fibrosis patients at 10 mg/kg/day.
- We sample 10 hours after the start of the infusion; the infusion
runs for 1 hour.
- We evaluate this level against the Hartford nomogram for 7 mg/kg/day.
From here we start to have highly variable pharmacist behavior, which
is my concern:
- If the level is elevated, we keep the dose and extend the interval
from q24h to q36h. Some pharmacists will draw another level with the
next dose; some do nothing or only watch the SCr.
- If the level is low or undetectable, we go to 15 mg/kg/day at the
starting q24h interval. Same issue with additional levels: some draw
with the next dose, some do nothing.
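To make the variability concrete, the branching above can be sketched as follows; `adjust_regimen` and its numeric cutoffs are hypothetical illustrations, not the actual Hartford nomogram breakpoints:

```python
# Minimal sketch of the protocol's branching. The numeric cutoffs
# (0.5 and 3.0 mg/L) are placeholders, NOT the actual Hartford
# nomogram breakpoints -- consult the published nomogram before use.

def adjust_regimen(level_10h_mg_L, dose_mg_kg=10):
    """Return (dose in mg/kg/day, interval in hours) per the protocol."""
    if level_10h_mg_L < 0.5:            # low or undetectable level
        return 15, 24                   # increase dose, keep q24h
    if level_10h_mg_L > 3.0:            # "elevated" (placeholder cutoff)
        return dose_mg_kg, 36           # keep dose, extend to q36h
    return dose_mg_kg, 24               # within band: no change

print(adjust_regimen(0.3))   # -> (15, 24)
```

The point of the sketch is that the protocol fully specifies the dose/interval change but leaves the follow-up level policy unspecified, which is where the pharmacist-to-pharmacist variability enters.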
While I have been using extended-interval dosing in the usual patient
population for a number of years now and feel very comfortable with the
concept, these patients present us with very different challenges and
concerns for safety. My modeling is done with Rick Tharp, R.Ph.'s
software, Kinetics.
I have included a sample patient to demonstrate my concern:
Patient: age 39, ht 163 cm, wt 37 kg, dosing wt 37 kg, SCr 0.3
(pharmacist used 0.9 to model)
First dose: 370 mg IVPB q24h; 10-hr level after first dose: 0.6 mg/L
Second dose: increased to 500 mg IVPB q24h; 10-hr level after second
dose: 1.0 mg/L
My question is very open-ended, but my feeling is that we should draw
peaks and troughs with these patients and model with the second dose on
a q12h dosing interval. I do not have a problem with the high dosing in
these patients. Thank you, Dwight Norris, R.Ph.
We have utilized a dose of 10 mg/kg qd in selected patients in our
adult CF population, particularly in those patients whom we plan to
send home early.
For monitoring we obtain two levels: a post-peak level (2 hours after
the start of the infusion, to avoid the prolonged distribution phase)
and a random level drawn 8 to 12 hours after the dose. Two levels
enable determination of the actual peak as well as the AUC, thereby
allowing dosage individualization. The target peak is approximately
20 mg/L (10 times the breakpoint for susceptibility) and the target
AUC is 85 to 100 (which provides a level of exposure similar to
multiple daily dosing). The AUC can be easily determined by dividing
the 24-hr dose by the revised aminoglycoside clearance. This method was
originally described by Barclay and Begg and has been applied to the
treatment of acute pulmonary exacerbations in patients with CF
[Hopkins et al.]. Since most patients with CF exhibit aminoglycoside
half-lives of 2 to 3 hours or less, trough concentrations in most
instances will be undetectable by design and are therefore not optimal
for individualizing the dosing regimen.
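The two-level method described above can be sketched under a one-compartment assumption; the function and the sample concentrations below are illustrative, not values from the cited studies:

```python
import math

# Sketch of the two-level method: estimate kel from the 2-h post-peak
# level and the 8-12-h random level, back-extrapolate to the end of
# the infusion for the "actual" peak, and compute AUC as
# 24-h dose / clearance. One-compartment, rough bolus approximation.

def two_level_auc(dose_mg, t1_h, c1, t2_h, c2, t_end_inf_h=1.0):
    kel = math.log(c1 / c2) / (t2_h - t1_h)           # elimination rate, 1/h
    t_half = math.log(2) / kel                        # half-life, h
    peak = c1 * math.exp(kel * (t1_h - t_end_inf_h))  # extrapolated peak, mg/L
    vd = dose_mg / peak                               # rough Vd, L
    cl = kel * vd                                     # clearance, L/h
    auc24 = dose_mg / cl                              # AUC for one daily dose, mg*h/L
    return peak, t_half, auc24

# e.g. a 370-mg dose with levels of 15 mg/L at 2 h and 1.0 mg/L at 10 h:
peak, t_half, auc24 = two_level_auc(370, t1_h=2, c1=15.0, t2_h=10, c2=1.0)
```

With these illustrative numbers the AUC comes out below the 85-100 target, which under this method would argue for a dose increase.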
The Hartford nomogram, as you have identified, was developed to
facilitate dosage adjustment following a dose of 7 mg/kg, not 10 or
15 mg/kg. Since you have the software capability, it is certainly
possible for you to obtain levels as described above, determine the
individualized PK parameters using MAP Bayesian analysis, and revise
the dosing regimen to achieve the desired goals (Peak/MIC, AUC, etc.).
The use of computer software is also beneficial since you are not
restricted to the assumptions of a one-compartment model (instantaneous
distribution), which is a potential complication with high-dose
aminoglycoside therapies.
Barclay M, Duffull SB, Begg EJ, Buttimore RC. Experience of once-daily
aminoglycoside dosing using a target area under the concentration-time
curve. Aust NZ J Med 1995;25:230-5.
Hopkins PA, Bowler S, Rice-McDonald G, Rasiah R. High-dose intravenous
tobramycin determined by area under the curve is well tolerated in
cystic fibrosis. Thirteenth Annual North American Cystic Fibrosis
Conference, Seattle, Washington, October 1999 [abstract 343].
Beringer PM, Vinks A, Jelliffe R, Shapiro B. Pharmacokinetics of
tobramycin in adults with cystic fibrosis: implications for once-daily
administration. Antimicrob Agents Chemother 2000;44:809-13.
Aminimanizani A, Beringer PM, Kang J, Tsang L, Jelliffe RW, Shapiro BJ.
Distribution and elimination of tobramycin administered in single or
multiple daily doses in adult patients with cystic fibrosis.
J Antimicrob Chemother 2002;50:553-9.
Hope this is helpful,
Paul Beringer
Paul Beringer, Pharm.D., BCPS, FASHP
Associate Professor of Clinical Pharmacy and Clinical Medicine
University of Southern California
1985 Zonal Avenue, Los Angeles, CA 90089-9121
Tel: 323-442-1402, eFax: 626-628-3024
Webpage: http://pharmacy.usc.edu/pd_lab
Dear Dwight:
About aminoglycoside dosing: I do not know the Chest article.
But over the last 10-15 years it appears that high-peak,
extended-interval, low-trough strategies are at least as effective as
the older ones in most (not all) patients. These strategies appear
best suited for middle-aged patients with CCr of 50-80, who get a
reasonable peak from such dosing strategies. The people who do not do
well are those with better renal function, who excrete the drug more
rapidly and who cannot wait 24 hrs until the next dose, as the serum
concentrations are below the MIC for too long. On the other hand,
patients with impaired renal function will usually need longer dose
intervals.
So the key issue has always been not what ritual of dosage
frequency to use, but what your therapeutic goals actually are - what
peak and trough and AUC you want to achieve for a particular individual
patient. Set your target goal for peak and trough, and then give the
proper dose to achieve these goals as often as the patient's renal
function permits. We use the USC*PACK software to do this.
This software also has a module to model the bacterial growth
and kill of an organism when it is presented with a particular profile
of serum concentrations resulting from a particular dosage regimen in a
particular patient. It uses the Zhi-Nightingale model, and has a value
for the rate constant for the growth of the organism in the absence of
the antibiotic, the shortest possible half-time of kill with the drug
(the Emax), the sigmoidicity coefficient of the Hill model used to
describe the kill, and the MIC (or the EC50) of the organism.
This model, while not realistic in certain aspects, is a very
useful "worst case" scenario model. The organisms are always in their
most virulent phase of growth (the logarithmic growth phase). Also, the
bugs do not get more resistant over time, as bugs do under pressure
from the antibiotic. What you can do here, is simply to estimate how
resistant you think they will get, and enter that MIC. Then that
resistance is present from the very beginning. Because of this, this
model is a very useful "worst case" model, and a regimen that kills
under these circumstances is very likely to be successful clinically.
This model has also correlated well with patient clinical behavior in
at least one clinical case we have seen.
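A rough sketch of a growth/kill model of this general form can be written as below, under stated assumptions (simple Euler integration, illustrative parameter values, a single-dose concentration profile); this is not the actual USC*PACK implementation:

```python
import math

# Sketch of a growth/kill model of the general form described above:
# logarithmic-phase growth at rate kg, offset by a Hill-type kill term
# driven by the drug concentration. All parameter values are
# illustrative assumptions, not fitted values.

def simulate_kill(c_of_t, kg=0.7, kmax=3.0, ec50=2.0, hill=2.0,
                  n0=1e6, t_end=24.0, dt=0.01):
    """Euler integration of dN/dt = kg*N - kmax*C^H/(C^H + EC50^H)*N."""
    n, t = n0, 0.0
    while t < t_end:
        c = c_of_t(t)
        kill = kmax * c**hill / (c**hill + ec50**hill)
        n += (kg - kill) * n * dt
        n = max(n, 0.0)
        t += dt
    return n

# Concentration profile: a 20 mg/L peak decaying with a 2.5-h half-life,
# repeated every 24 h (only the first day is simulated here).
kel = math.log(2) / 2.5
profile = lambda t: 20.0 * math.exp(-kel * (t % 24))
survivors = simulate_kill(profile)
```

With these illustrative parameters the inoculum is heavily reduced while the concentration stays well above the EC50, then regrows once the level falls, which is the kind of "worst case" trade-off the model is meant to expose.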
As to monitoring AG therapy, or therapy in general, I do not
understand why people wait at all before getting serum samples. There
is clearly no need to wait for a steady state. That was required when
people were using linear regression on the logs of the concentrations.
There is no need for this any more, with Bayesian software and decent
models. Linear regression on logs of levels has simply been an obsolete
method for many years.
I would suggest that you get a peak sample, preferably out of
the opposite arm at the end of the 1/2 or 1 hour infusion (don't wait
1/2 hr after the end of the infusion for distribution to be complete -
that is another obsolete idea that went out with linear regression on
logs of levels, again, because that was the only kind of data that
obsolete method could handle). There are a number of general methods
for determining the best times to get serum samples. D-optimal design
is a good one, and it has been around for over 25 years. A good
reference is
D'Argenio DZ: Optimal sampling times for pharmacokinetic experiments.
J. Pharmacokinet. Biopharm. 9: 739-756, 1981.
So a good place to start monitoring is to get a peak level
(note the dose and sampling times) after the first infusion, and then
another sample when you think the concentration has fallen to about 1/3
(theoretically 36%) of the original peak. What this will do for you is
to give you data to obtain the most precise parameter estimates for
your Bayesian posterior individualized model of the behavior of the
drug in that patient. At this point, look at the patient, look at the
plot of the behavior of the drug, possibly reconsider your target
goal(s), and then compute the regimen to best (most precisely) achieve
your selected target(s).
Another thing: the estimation of CCr. When serum creatinine is
not stable, the little Jelliffe and Cockcroft-Gault formulas do not
help. You will need a method to estimate CCr that is based on the rate
of rise and fall of serum creatinine between 2 samples. You might look
at
Jelliffe R: Estimation of Creatinine Clearance in Patients with
Unstable Renal Function, without a Urine Specimen. Am. J. Nephrol.
22: 320-324, 2002.
You can then see the CCr when each dose was given in the past,
and do a better job of tracking the changing behavior of the drug in
acutely ill, highly unstable patients. I think this is an important
source of variability that is often overlooked.
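For comparison, the stable-creatinine bedside formulas mentioned above can be sketched as follows (these are the standard published Cockcroft-Gault and stable Jelliffe equations; the unstable-SCr method of the 2002 paper is more involved and is not shown):

```python
# The "little" stable-SCr bedside formulas mentioned above.
# Both assume serum creatinine is at steady state.

def cockcroft_gault(age_y, wt_kg, scr_mg_dl, female=False):
    """Cockcroft-Gault estimate, mL/min."""
    ccr = (140 - age_y) * wt_kg / (72 * scr_mg_dl)
    return ccr * 0.85 if female else ccr

def jelliffe_stable(age_y, scr_mg_dl, female=False):
    """Stable Jelliffe estimate, mL/min/1.73 m^2."""
    ccr = (98 - 0.8 * (age_y - 20)) / scr_mg_dl
    return ccr * 0.9 if female else ccr

# The sample patient (age 39, 37 kg), with SCr modeled at 0.9 mg/dL:
print(round(cockcroft_gault(39, 37, 0.9), 1))   # -> 57.7
```

Note how strongly the low body weight pulls the Cockcroft-Gault estimate down relative to a BSA-normalized figure, which is one reason different methods can disagree so widely in a patient like this.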
As to your feelings about getting both peaks and troughs, I
generally agree with you. I think we do not get enough information to
do our job as well as we could. I think we miss many opportunities
here. I would start with the first dose, and get a peak sample at the
end of the first IV infusion, or shortly after. I certainly think we
need to get peaks. The peak is the time at which the sensitivity of the
serum concentrations with respect to the Vd is greatest, to get us the
best estimate of the Vd.
I think we also need to get a second level, because the basic
PK model is at least a 2-parameter model (we need to know both volume
and elimination rate), and this can be either a trough, or one drawn
when the
expected concentration has fallen to about 1/3 of the peak, when the
sensitivity of the serum concentration with respect to the elimination
rate constant is greatest, to get us the best estimate of it. In your
case here, this would be at about 2.5 hours after the start of the
first infusion, as the patient's CCr is so high. This would be in line
with D-optimal design, as described in the above reference.
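The timing argument can be made concrete: the concentration falls to 1/e (about 36%) of the peak exactly one time constant (1/kel, about 1.44 half-lives) after the peak. A small sketch, with the roughly 1-hour half-life assumed here purely for illustration:

```python
import math

# After the peak, the concentration falls to 1/e (~36%) of its peak
# value exactly one time constant (1/kel = ~1.44 half-lives) later,
# which is where a level is most informative about the elimination
# rate constant.

def second_sample_time(t_peak_h, t_half_h):
    kel = math.log(2) / t_half_h        # elimination rate constant, 1/h
    return t_peak_h + 1.0 / kel         # time of maximal kel sensitivity

# With a 1-h infusion and an assumed ~1-h half-life (very high CCr,
# as in the sample patient), the second sample lands near 2.5 h after
# the start of the infusion:
print(round(second_sample_time(1.0, 1.0), 2))   # -> 2.44
```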
There is an optical illusion about the proper times to get
samples, and it derives from not considering the well-known D-optimal
design, and from the common practice of plotting data on semilog paper.
When one looks at data plotted that way, it "seems logical" to get a
sample after the bend in the curve, after distribution has been
complete and one is on the terminal phase of elimination, and then at
the trough. This is really not optimal in any real sense, and greatly
corrupts the estimation of the Vd. This very powerful illusion is
quickly dispelled if you look at the reference above, and if you look
at the data on a linear, rather than a semilog plot. Go to our web site
www.lapk.org. Click on teaching topics. Click on optimal strategies for
PK/PD studies and for patient monitoring. Look at the slides.
Also, you might look at the 2 refs below for a discussion of
monitoring strategies.
1.Jelliffe RW, Iglesias T, Hurst AK, Food KA, and Rodriguez J:
Individualising Gentamicin Dosage Regimens: A Comparative Review of
Selected Models, Data Fitting Methods, and Monitoring Strategies.
Clinical Pharmacokinetics 21: 461-478, 1991.
2.Jelliffe RW, Schumitzky A, Van Guilder M, Liu M, Hu L, Maire P,
Gomis P, Barbaut X, and Tahani B: Individualizing Drug Dosage Regimens:
Roles of Population Pharmacokinetic and Dynamic Models, Bayesian
Fitting, and Adaptive Control. Therapeutic Drug Monitoring, 15:
380-393, 1993.
As to your sample patient, his estimated CCr, assuming SCr =
0.4, the lowest our MM-USCPACK software will accept, is 178 ml/min/1.73
Sq meters BSA. On 370 mg q24h, the predicted peak conc is 28.49 ug/ml,
with 95% confidence limits between 19.6 and 41.5 ug/ml. Trough
predictions are 0.09 ug/ml, with very small conf limits. This is based
on our pop model of 634 patients studied by Pascal Maire and colleagues
in France. Ten hrs after the first dose, the serum predictions are 0.47
ug/ml, with 95% conf limits between 0.0 and 2.0 ug/ml. As to predicted
effect, assuming an MIC of 2.0 ug/ml, this regimen fails to kill the
bugs successfully, and they grow out uncontrollably after the 3rd dose
of 370 daily.
Now, if you choose to use q 12 h dosing for this patient, the
dose, for the same peak and trough goals, would be 360 mg q 12 h, the
peak would be 27.72 ug/ml and the trough 0.3, rising to 0.5 after the
4th dose (using this model with a peripheral compartment). Again, 95%
conf limits would be from 19.3 to 40.1 for the peak, and
0.0 to 1.5 for the trough. As to predicted effect, there appears
to be about a 98% kill after the 3rd dose. That is good, and favors the
q 12 h dosing strategy here. The planned duration of therapy probably
should not be over 10 days, to reduce the risk of toxicity.
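As a rough cross-check of predictions like these, a one-compartment steady-state calculation can be sketched as below; the Vd and half-life are illustrative assumptions, not the population-model values used by MM-USCPACK (which is not a one-compartment model), so the numbers will only land in the same neighborhood:

```python
import math

# Rough one-compartment, steady-state peak/trough check for an
# intermittent IV infusion. Vd and half-life below are illustrative
# assumptions, not fitted population values.

def steady_state_peak_trough(dose_mg, tau_h, vd_L, t_half_h, t_inf_h=1.0):
    kel = math.log(2) / t_half_h
    # concentration at the end of a t_inf-h infusion, at steady state
    peak = (dose_mg / (vd_L * kel * t_inf_h)) * \
           (1 - math.exp(-kel * t_inf_h)) / (1 - math.exp(-kel * tau_h))
    # decay from end of infusion to the end of the dosing interval
    trough = peak * math.exp(-kel * (tau_h - t_inf_h))
    return peak, trough

# 370 mg q24h in the 37-kg sample patient, assuming Vd ~0.25 L/kg and
# a 1.5-h half-life (high CCr):
peak, trough = steady_state_peak_trough(370, 24, 0.25 * 37, 1.5)
```

With these assumptions the peak lands in the low 30s ug/ml and the trough is effectively zero, in the same neighborhood as the quoted two-compartment predictions.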
So the initial predictions are close to what you stated and
found. The USC*PACK software is very flexible, and is designed to
handle a model with both central and peripheral compartments, and it
has the effect model. The newer, Windows based MM-USCPACK software uses
nonparametric population models, and designs dosage regimens which are
specifically developed to hit target goals with maximum precision
(minimum weighted squared error). It is in beta release now. It is what
was used here to discuss your case, and to provide the confidence
limits for the predicted concentrations. It can also develop dosage
regimens to achieve stated target goals in the peripheral compartment
as well, which is especially useful for digoxin, where the clinical
effect correlates much more closely with peripheral concentrations than
the serum concs.
Finally, there is still the question of "adaptive resistance".
This is an unresolved question, as far as I know. But because of this,
it may well be prudent to make your very first dose be the really high
one, for example, 500 mg in your case here, rather than waiting until
later. It may be better to be too high on the very first dose, get your
levels, and then adjust after that, based on your data. Putting it
another way, it may be better to set your initial target a bit too high
with your very first dose, get your levels (at least a pair - you
simply cannot know both Vd and clearance or kel with only 1 sample!),
make your Bayesian posterior individualized model, see the plots,
compare with your patient's behavior and needs, and adjust appropriately
after that. Further, one cannot ever get good AUC estimates from only a
single level, even with MAP Bayesian fitting. The quality of results
from such very sparse sampling strategies is poor. D-optimal design
helps a lot.
Very best regards,
Roger Jelliffe
Roger W. Jelliffe, M.D. Professor of Medicine,
Division of Geriatric Medicine,
Laboratory of Applied Pharmacokinetics,
USC Keck School of Medicine
2250 Alcazar St, Los Angeles CA 90033, USA
Phone (323)442-1300, fax (323)442-1302, email= jelliffe.aaa.usc.edu
Our web site= http://www.lapk.org
The following message was posted to: PharmPK
Dear Dwight,
Your concern regarding the dosing is quite valid. We have come across
a similar patient (young age and a super-low 10-hour random level,
although not a CF patient). We were treating a Pseudomonas infection
sensitive only to amikacin.
My take on that: we can't use once-daily aminoglycoside doses twice a
day (I don't think safety data on such an approach exist). Nor can we
assume that the post-antibiotic effect (PAE) will last > 12 hours (no
one knows how long it really is).
Probably the more reasonable approach to such a problem is switching
these patients to conventional dosing and monitoring peaks and troughs.
This is a dilemma to which there is no right answer. One may choose a
maximum once-daily dose and just hope that the PAE in vivo is longer
than in vitro, or one can switch to conventional dosing and ensure an
adequate peak.
Sometimes, in some patient populations, aminoglycosides go into the
third space, and that is why you have a low 10-hour level. One can
check a post-peak level to investigate whether that is true. If it is,
the "door" to the third space may close quite rapidly, and you may be
overdosing aminoglycosides for a while before you detect it. With
once-daily dosing this effect may be more difficult to reverse.
In general, using the nomogram for a 10-hour level with doses other
than 7 mg/kg can be dangerous, because the AUC may not match,
especially if you are increasing the interval. For these situations a
trough concentration will be a better guide for adjustment, but you
can still use the 10-hour level if it is sub-MIC (because you know
that it will be sub-MIC for the next 14 hours).
Alla Paskovaty, Pharm.D.
Clinical Coordinator/ Antibiotic Management
Memorial Sloan-Kettering Cancer Center
1275 York avenue Swartz 526
NY NY 10021
212-639-7212
The following message was posted to: PharmPK
Dear Roger,
Thanks a lot for your excellent reply regarding dosing
regimen of aminoglycosides in CF.
I would just like to confirm a point and raise a question:
1- To establish plasma monitoring during a once-daily regimen of
aminoglycosides, two concentration points could be drawn: a peak
(post-infusion) and a second point when the peak concentration is
expected to have fallen to 36%, which may be about 6 hours
post-infusion, taking CCr into consideration.
2- The question is: what targets other than control of infection
during aminoglycoside therapy do you mean in your previous reply?
Thanks for your clarification.
Dr. Ehab EL Desoky
Pharmacology Dept
Faculty of Medicine
Assiut University
Assiut. Egypt
Dear Ehab:
Thanks for your note. I think the important thing here is to
set these target goals individually for each patient. Having said that,
I think that most of the time a peak target of 20-25 for gent or tobra,
and a trough of 0.3, are pretty good. Then monitor with a high level,
close to the peak, and another one at about 1.5 half-lives later, when
the levels are predicted to be about 1/3 of the peak, based on the
patient's renal function. How does this sound to you?
Very best regards always,
Roger
Roger W. Jelliffe, M.D. Professor of Medicine,
Division of Geriatric Medicine,
Laboratory of Applied Pharmacokinetics,
USC Keck School of Medicine
2250 Alcazar St, Los Angeles CA 90033, USA
Phone (323)442-1300, fax (323)442-1302, email= jelliffe.-at-.usc.edu
Our web site= http://www.lapk.org
Copyright 1995-2010 David W. A. Bourne (david@boomer.org)