A record (graph) of the comparison of the load cell outputs against standard test loads.
A plot of absorbance versus concentration used in calibration. The graph is often non-linear, demonstrating the approximate nature of Beer's law.
Also called a “standard curve,” a calibration curve is a regression of the assay response on the known concentrations in “standard” samples. It is a model fitted to data from standards and used for calculating (calibrating) values of unknown test samples, for example, measuring protein/biomarker expression levels of various compounds in in-vitro and in-vivo samples.
The line determined by the calibration standard response data for an instrument. A mathematical function produced by regression of the detector responses recorded during calibration of an instrument. The function describes detector responses over a range of concentrations and is used to predict the concentration of an unknown sample based on its detector response.
A plot of indicated value versus true value used to adjust instrument readings for inherent error. A calibration curve is usually determined for each calibrated instrument in a standard procedure; its validity is confirmed, or a new calibration curve is determined, by periodically repeating the procedure.
Also known as a working curve, the relationship of instrument response (absorbance) as a function of concentration. Ideally, this should be a linear relationship in AA under conditions that obey Beer's law, where absorbance = (slope x concentration) + intercept. Minor curvature can be corrected by a curve-fitting algorithm.
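The linear relationship above can be sketched in code: fit a least-squares line to standards, then invert it to predict an unknown sample's concentration from its response. This is a minimal illustration using made-up standards data (the concentrations and absorbances below are hypothetical, not measurements from any real instrument).

```python
# Minimal sketch of a linear calibration curve, assuming Beer's law holds:
# absorbance = (slope x concentration) + intercept.
import numpy as np

# Known standard concentrations (e.g., mg/L) and their measured absorbances.
# These values are illustrative only.
conc_std = np.array([0.0, 1.0, 2.0, 4.0, 8.0])
abs_std = np.array([0.002, 0.105, 0.198, 0.402, 0.795])

# Least-squares regression of absorbance on concentration
# (polyfit returns coefficients highest-degree first).
slope, intercept = np.polyfit(conc_std, abs_std, deg=1)

def predict_concentration(absorbance):
    """Invert the fitted line to estimate concentration from a response."""
    return (absorbance - intercept) / slope

# Predict the concentration of an unknown sample from its absorbance.
unknown_abs = 0.300
print(predict_concentration(unknown_abs))
```

In practice the fit quality (e.g., the correlation coefficient) is checked before the curve is used, and the unknown's response must fall within the calibrated range, since extrapolating outside the standards is unreliable.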
In analytical chemistry, a calibration curve is a general method for determining the concentration of a substance in an unknown sample by comparing the unknown to a set of standard samples of known concentration.