The following message was posted to: PharmPK
Is it acceptable to report values below the LOQ as zero, and then to
use these values (zero) in calculating the mean at a specific time
point? Assume the LOQ is 10 ng/mL and our value is 9.3 ng/mL. Is it the
practice that this value will go to zero?
Please let me know.
Regards,
Aqel
The following message was posted to: PharmPK
I think the concept of the LOQ is misleading. The LOQ is simply that
concentration on the assay calibration curve at which the variance (or
standard deviation, reflecting precision) becomes, by convention, too
large.
The assay doesn't abruptly cease functioning at the LOQ. Therefore,
although there is probably some argument on this, I believe that the
data you get are the data you get. If your assay indicates a
concentration of 9.3 ng/mL, then that is your best estimate of the
concentration at that time, but with somewhat less precision than at
earlier time points when the concentration was higher. I think it
introduces bias into your data to simply change all sub-LOQ
concentrations to 0.
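The bias from zero-substitution is easy to demonstrate with a toy calculation (the concentrations below are hypothetical, chosen around the 10 ng/mL LOQ from the original question):

```python
# Hypothetical concentrations (ng/mL) at one sampling time; LOQ = 10 ng/mL.
measured = [12.1, 10.4, 9.3, 8.7, 11.0]
loq = 10.0

# The substitution rule under discussion: every sub-LOQ value becomes 0.
substituted = [c if c >= loq else 0.0 for c in measured]

mean_measured = sum(measured) / len(measured)        # about 10.3
mean_substituted = sum(substituted) / len(substituted)

print(mean_measured)
print(mean_substituted)  # biased low: 9.3 and 8.7 were counted as 0
```

Because every replacement error points in the same direction (downward), the bias does not average out as more samples are added.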
Michael
The following message was posted to: PharmPK
Dear Aqel,
from my point of view, when you report your individual data you should
report values below the LOQ as "<LOQ" rather than as "0": when we
analyse a drug the concentration can never truly be "0", as for example
testosterone levels in dogs.
I hope it helps.
Best regards,
Daniel Martínez
RIA Laboratory
Metabolism & Pharmacokinetics Service
Research & Development Department
IPSEN PHARMA, S.A.
Ctra. Laureà Miró 395
Sant Feliu de Llobregat, Barcelona, Spain
Teléf.: 936858100
daniel.martinez.aaa.beaufour-ipsen.com
The following message was posted to: PharmPK
Dear all,
I had some trouble with the part regarding values under the LOQ.
In general the analytical method is validated between the LLOQ and the
ULOQ; a consequence is that no extrapolation can be conducted.
You have some possibilities:
1 - You could evaluate the percentage of the AUC associated with the
sampling times at which your analytical method reports the
concentration as below the LLOQ, and evaluate its magnitude.
2 - You can use an analytical workaround that consists in adding a
small known amount (let's say 20 ng/mL) of the analyte to the sample in
order to quantify 20 + X ng/mL. This approach must be validated, and
replication of samples is required.
3 - Find a more sensitive analytical method.
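The spiking approach in option 2 is essentially standard addition: add a known amount of analyte so the total falls inside the validated range, measure it, and subtract the spike. A minimal sketch with hypothetical numbers (the 27.0 ng/mL reading is invented for illustration):

```python
# Hypothetical illustration of the spiking approach. The unknown X is
# below the LLOQ (10 ng/mL), so a known spike brings the total in range.
lloq = 10.0
spike = 20.0            # known amount of analyte added (ng/mL)
measured_total = 27.0   # hypothetical assay result for spike + X, in range

assert measured_total >= lloq  # the spiked sample is quantifiable
x_estimate = measured_total - spike
print(x_estimate)  # 7.0 ng/mL: an estimate of the original sub-LLOQ value
```

As the message notes, the subtraction inherits the assay's error at the spiked level, which is why the procedure itself must be validated with replicates.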
Best regards
Dr. Fabio Macchi
The following message was posted to: PharmPK
Dear Aqel,
If you have a value of, say, 9.3 ng/ml that is below the LOQ, it is
still a measured value. It is the best you can get, and with a proper
variance function taking into account that the CV for very low
concentrations is larger than for high concentrations, it can be used
in the estimation of PK parameters. To throw away a value recorded in
the analysis and replace it with some arbitrary number seems to me a
rather peculiar thing to do.
Regards
Poul
Back to the Top
Most analytical labs get into trouble for exactly that reason. We, as
analytical services, should not report any numbers below the LLOQ. We
often do, however, at the behest of PK/PD scientists who "need to
know". They then forget, and often incorporate these data into their
PK/PD calculations, and "forget" that they requested this type of data.
Anything below the LLOQ should not be used. You can however request
that a lower LLOQ be set up and tested.
---
We often do not get a zero.
The following message was posted to: PharmPK
Hi
I'm sure that there are several good repositories on this already.
eoconnor wrote:
> Anything below the LLOQ should not be used. You can however request
> that a lower LLOQ be set up and tested.
This is a fundamental difference of opinion between analytical
scientists and PKPD scientists. The PKPD scientists don't mind how
much error there is in the measurement as long as we have a measurement
to work with (we estimate the error as we model the data). There are a
number of methods (some better than others) for modelling LLOQ data
when you don't know the measurement value - but these are just an
approximation to knowing the measured value in order to avoid bias with
censoring or arbitrarily setting LLOQ to a fixed number.
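One of the better methods Steve mentions treats BLQ observations as left-censored in the likelihood (the approach usually labelled M3 in Beal's classification): instead of a residual for an unknown measurement, the observation contributes the probability that the measurement fell below the LLOQ. A minimal sketch assuming a normal residual error model; the prediction, SD and LLOQ values are purely illustrative:

```python
import math

def normal_cdf(x, mu, sd):
    """P(X <= x) for X ~ Normal(mu, sd)."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sd * math.sqrt(2.0))))

def log_likelihood(obs, pred, sd, lloq):
    """Log-likelihood contribution of one observation.

    A quantifiable observation contributes a normal log-density; a BLQ
    observation (obs is None) contributes log P(measurement < LLOQ).
    """
    if obs is None:  # below the LLOQ: left-censored
        return math.log(normal_cdf(lloq, pred, sd))
    z = (obs - pred) / sd
    return -0.5 * z * z - math.log(sd * math.sqrt(2.0 * math.pi))

# Illustrative values: model prediction 8 ng/mL, residual SD 2 ng/mL,
# LLOQ 10 ng/mL.
print(log_likelihood(9.3, 8.0, 2.0, 10.0))   # log-density of a measured value
print(log_likelihood(None, 8.0, 2.0, 10.0))  # log P(below LLOQ)
```

In a real fit this per-observation term is summed over all data inside the full PK model; the censored term still informs the parameters without pretending to know the measured value.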
Regards
Steve
Stephen Duffull
School of Pharmacy
University of Queensland
Brisbane 4072
Australia
Tel +61 7 3365 8808
Fax +61 7 3365 1688
http://www.uq.edu.au/pharmacy/duffull.htm
University Provider Number: 00025B
The following message was posted to: PharmPK
The PK/PD scientists should remember that various curve fits are used
in bioanalytical measurements to fit the data between the LOQ and UOQ
and numbers below the LOQ may be biased based on the curve fit used.
Many times if you plug in 0 in the curve used to calculate the points
you will get a positive and sometimes a negative number because the
lines usually do not go through zero. So, it is not only the
variability below the LOQ that is the issue, the number itself may be
wrong (the number is not accurate). That is why analytical chemists do
not use these numbers, and anyone using them should be very careful
interpreting any results below the LOQ, especially terminal half-lives.
Conversations with the FDA indicate that companies should not base
decisions on data below the LOQ. It is fine to report them as long as
they are clearly marked as below the LOQ. Studies have been rejected
because values below the LOQ were used and this was not clearly
indicated in the PK report.
I think it is best to estimate how much of the AUC is in the range
below the LOQ. If it is not significant, use only data above the LOQ
or extrapolate based on data above the LOQ. If it is a significant
amount, then either revalidate the method with a lower range and
reanalyze the samples, or reanalyze the samples with quality control
samples in the appropriate range to get a better picture of the
accuracy and precision below the LLOQ.
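The "how much of the AUC is below the LOQ" check can be sketched with a linear trapezoidal calculation (the concentration-time profile below is hypothetical):

```python
# Hypothetical concentration-time profile (h, ng/mL); LOQ = 10 ng/mL.
times = [0.5, 1, 2, 4, 8, 12, 24]
concs = [80.0, 95.0, 70.0, 40.0, 18.0, 9.0, 2.5]
loq = 10.0

def auc_trapezoid(t, c):
    """Linear trapezoidal AUC over the sampled interval."""
    return sum((t[i + 1] - t[i]) * (c[i] + c[i + 1]) / 2.0
               for i in range(len(t) - 1))

auc_total = auc_trapezoid(times, concs)

# AUC over the early part of the profile where every point is above the
# LOQ (the above-LOQ points are contiguous in this example).
above = [(t, c) for t, c in zip(times, concs) if c >= loq]
auc_above = auc_trapezoid([t for t, _ in above], [c for _, c in above])

frac_below = 1.0 - auc_above / auc_total
print(f"{frac_below:.1%} of the observed AUC depends on sub-LOQ data")
```

If that fraction is small, restricting the analysis to above-LOQ data costs little; if it is large, the message's advice to revalidate with a lower range applies.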
Back to the Top
Dear All:
I am surprised at all this continuing discussion about LLOQ. It
seems to me we have been all over this many times before, but the
discussion never seems to end.
Once again. What is the problem? It is true, in toxicology, we
have no other information except what is in the sample itself. In that
situation we must ask if the drug is present or not, and there clearly
is a LLOQ. But in PK work, we know pretty well when the doses were
given and when the samples were drawn. We also know that the
elimination of most drugs is exponential. Therefore, in PK work, the
drug is always STILL THERE - the only question is HOW MUCH, not "is it
there or not" - a totally different toxicological question. Please see
Therap Drug Monit 15: 380-393, 1993. I do not understand why this
controversy goes on and on and on, seemingly forever. It is very easy
to resolve and deal with. Please look at the article and tell me what
you think.
Very best regards,
Roger Jelliffe
Roger W. Jelliffe, M.D. Professor of Medicine,
Division of Geriatric Medicine,
Laboratory of Applied Pharmacokinetics,
USC Keck School of Medicine
2250 Alcazar St, Los Angeles CA 90033, USA
Phone (323)442-1300, fax (323)442-1302, email= jelliffe.aaa.usc.edu
Our web site= http://www.lapk.org
The following message was posted to: PharmPK
Re: LOQ
Apologies in advance for continuing this discussion, but I feel it has
not been properly addressed.
From the Analytical perspective, the LOQ is defined as the limit of
quantification, not to be confused w/ LOD Limit of Detection. In the
analytical world these values are determined using specific pre-defined
CV criteria and the values determined below the LOQ should not be
assigned a quantitative value. Say the LOQ is 10 ng/mL: By definition,
the LOQ means one cannot tell the difference between 0.01 or 5 or 9.5.
With that said, I cannot see any reason to justify the use of a value
determined to be 5 when it could be 9 or 0.001 or 3. The LOQ is simply
that: the Limit. Just because you can measure the existence of analyte
doesn't mean you know how much is there and since PK/PD is based on
quantification, this makes no sense.
This is of course not to say that values below the LOQ should be
reported as zero either. I strongly disagree with this tactic as well.
If a drug is given, then it will forever reside in the body at some
minuscule level. It may not be measurable after 24, 48 hrs or whatever.
After the levels drop below the LOQ, we cannot include this data in a
model. If it is below the LOQ, it should be omitted or you should
improve your assay sensitivity.
David Chin
Division of Medicine/Oncology
Stanford School of Medicine
The following message was posted to: PharmPK
Dear David,
David Chin wrote:
> assigned a quantitative value. Say the LOQ is 10 ng/mL: By definition,
> the LOQ means one cannot tell the difference between 0.01 or 5 or 9.5.
> With that said, I cannot see any reason to justify the use of a value
> determined to be 5 when it could be 9 or 0.001 or 3. The LOQ is simply
> that: the Limit. Just because you can measure the existence of analyte
> doesn't mean you know how much is there and since PK/PD is based on
> quantification, this makes no sense.
What do you think happens when the true conc goes from 10 ng/mL to 9.9
ng/mL? Do you really believe that you cannot tell 9.9 from 0.01 or 5 if
your LOQ is 10 ng/mL? Given a typical CV of 20% that might be used for
defining a LOQ you may not be able to distinguish 9.9 from 9.5 reliably
but you should be able to tell 9.9 from 5. This is what quantification
is about and why PK analysts want the chemical analysts to face reality
and not use arbitrary cut off values. The limit is in the head of the
chemical analyst not in the test tube.
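Nick's claim is easy to check by simulation: assuming a normal, proportional error model with a 20% CV, readings of a true 9.9 ng/mL sample and a true 5 ng/mL sample rarely overlap. A quick sketch (the error model and sample size are assumptions for illustration):

```python
import random

random.seed(1)  # reproducible sketch
cv = 0.20       # assumed 20% proportional error, as in the LOQ definition

def measure(true_conc, n=10_000):
    """Simulate n assay readings with SD = CV * true concentration."""
    sd = cv * true_conc
    return [random.gauss(true_conc, sd) for _ in range(n)]

low, high = measure(5.0), measure(9.9)

# How often does a reading of the 5 ng/mL sample exceed a reading of
# the 9.9 ng/mL sample? Analytically this is about 1-2%.
overlap = sum(l > h for l, h in zip(low, high)) / len(low)
print(f"P(reading of 5 exceeds reading of 9.9) = {overlap:.3f}")
```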
-
Nick Holford, Dept Pharmacology & Clinical Pharmacology
University of Auckland, 85 Park Rd, Private Bag 92019, Auckland, New
Zealand
email:n.holford.aaa.auckland.ac.nz tel:+64(9)373-7599x86730 fax:373-7556
http://www.health.auckland.ac.nz/pharmacology/staff/nholford/
Back to the Top
Dear David:
As the measured value approaches zero, the CV of course gets
bigger and bigger. That is not the point. The point is that the
variance, and thus the Fisher information, of a data point, are always
finite at any measured value, all the way down to and including a
blank. Definitions pegged to a certain CV value are simply not
relevant. The SD, the variance, and the Fisher information are what
matter, not the CV%. We should forget the CV and use the SD, the
variance, and the Fisher information. They give the relevant weight to
any data point. For PK work, where we know the relationship between the
dose and the sample, we are not asking if the drug is present or not,
as one might in toxicology. We know the drug is present, and we simply
want the best measurement of how much is there.
Very best regards,
Roger Jelliffe
Roger W. Jelliffe, M.D. Professor of Medicine,
Division of Geriatric Medicine,
Laboratory of Applied Pharmacokinetics,
USC Keck School of Medicine
2250 Alcazar St, Los Angeles CA 90033, USA
Phone (323)442-1300, fax (323)442-1302, email= jelliffe.aaa.usc.edu
Our web site= http://www.lapk.org
The following message was posted to: PharmPK
Let me be straight to the point. If you are so worried about values
below the LOQ, attack the problem and modify your assay to pick up
those concentrations.
Having been on both sides of the analytical lab, I can say the
calibration range originates from the PK person's needs. In my old,
short-lived job we had an SOP describing the LOQ as 0.05 x Cmax and the
UOQ as X x Cmax, where X can be 1, 2, 3, or 4, depending on the
magnitude of the variability in Cmax (influencing factors such as
induction, inhibition, air, light and all physical and mythological
elements).
Once the PK or clinical scientist has defined the linear range,
practicality hits the lab in accomplishing a wide calibration range. A
10-fold or 100-fold range is usually OK with modern detection tools (oh
boy, I really want to avoid those weightings); it gets sticky when
demanding ranges like 1000-fold are needed. Under such conditions the
most practical, and often neglected, approach is a split standard
curve.
My question is: when you are doing measurements that range over 100- or
1000-fold, what is the overall contribution of sub-LOQ concentrations
to the overall picture? If those insignificant sub-LOQ concentrations
still bother you, the answer is to go back to the lab and have your
method straightened out rather than doing interpolations and
extrapolations and this mathematical torture of the data.
Regards,
Prasad Tata
Mallinckrodt, Inc.
The following message was posted to: PharmPK
I think Roger Jelliffe has identified one of the most significant
problems associated with PK models at the LOQ, or for that matter at
any very small concentration. The common assumption of a fixed CV
(s.d. = a*concentration) can have disastrous consequences in any
regression type of model, since the statistical weight assigned to a
point becomes infinite as the concentration goes to zero, while in fact
this is usually a physically unreasonable error model. Thus the points
associated with very low concentrations become the tail that wags the
dog. A more reasonable error model at low concentrations may be the
affine assumption s.d. = a*concentration + b, where b > 0.
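The consequence Bob describes shows up directly in least-squares weights (weight = 1/variance). A sketch comparing the two error models, with assumed parameters a = 0.1 (10% CV) and b = 0.5 ng/mL:

```python
# Least-squares weights (w = 1/variance) under two residual error models.
a, b = 0.1, 0.5  # assumed: 10% proportional component, 0.5 ng/mL additive

def weight_proportional(conc):
    """Fixed-CV model: sd = a * conc. The weight blows up as conc -> 0."""
    sd = a * conc
    return 1.0 / sd**2

def weight_affine(conc):
    """Affine model: sd = a * conc + b. The weight stays finite."""
    sd = a * conc + b
    return 1.0 / sd**2

for c in (100.0, 10.0, 0.1):
    print(c, weight_proportional(c), weight_affine(c))
# Under the fixed-CV model the 0.1 ng/mL point gets 10,000 times the
# weight of the 10 ng/mL point; under the affine model the weight
# approaches the finite limit 1/b**2 = 4 instead.
```

With the b > 0 term, even a blank has a finite weight, which is exactly Roger's point about the Fisher information remaining finite all the way down to zero.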
Bob Leary
Senior Staff Scientist
San Diego Supercomputer Center
858-534-5123
The following message was posted to: PharmPK
Bob,
Bob Leary wrote:
[stuff deleted]
> A more reasonable error model at low
> concentrations may be the affine
> assumption s.d.=a*concentration + b, where b>0.
The model you suggest for residual error has been advocated many times
before on PharmPK. It is sometimes called the "slope-intercept" model
or the "mixed proportional and additive" model. But I have not heard it
called the "affine assumption" model. Would you mind explaining the
origin of this term?
Nick
--
Nick Holford, Dept Pharmacology & Clinical Pharmacology
University of Auckland, 85 Park Rd, Private Bag 92019, Auckland, New
Zealand
email:n.holford.-at-.auckland.ac.nz tel:+64(9)373-7599x86730 fax:373-7556
http://www.health.auckland.ac.nz/pharmacology/staff/nholford/
The following message was posted to: PharmPK
Nick,
Mathematicians make a distinction between linear functions f(x) = a*x
and affine functions f(x) = a*x + b, while more applications-oriented
people often loosely refer to both cases as 'linear'.
Bob Leary
Senior Staff Scientist
San Diego Supercomputer Center
858-534-5123
Copyright 1995-2010 David W. A. Bourne (david@boomer.org)