ME5552 - METROLOGY AND
MEASUREMENTS
R.VEERAPANDIAN,
Teaching Fellow,
CEG Campus, Anna University
Chennai-25
SYLLABUS
ME5552 METROLOGY AND MEASUREMENTS L T P C
3 0 0 3
COURSE OBJECTIVES:
The main learning objective of this course is to prepare the students for:
1. Explaining the importance of measurements in engineering and the factors affecting
measurements and to estimate measurement uncertainty.
2. Applying the working principle and applications of various linear and angular measuring
instruments and basic concepts of measurement of assembly and transmission elements.
3. Interpreting the various tolerance symbols given in engineering drawings to choose the
appropriate manufacturing process.
4. Applying the principles and methods of form and surface metrology.
5. Applying the advances in measurements for quality control in manufacturing industries.
SYLLABUS
UNIT I BASICS OF METROLOGY 9
Measurement – Need, Process, Role in quality control; Factors affecting measurement - SWIPE;
Errors in Measurements – Types – Control – Measurement uncertainty – Types, Estimation,
Problems on Estimation of Uncertainty, Statistical analysis of measurement data, Measurement
system analysis, Calibration of measuring instruments, ISO standards
UNIT II MEASUREMENT OF LINEAR, ANGULAR DIMENSIONS AND ASSEMBLY AND
TRANSMISSION ELEMENTS 9
Linear Measuring Instruments – Vernier caliper, Micrometer, Vernier height gauge, Depth
Micrometer, Bore gauge, Telescoping gauge; Gauge blocks – Use and precautions, Comparators –
Working and advantages; Opto-mechanical measurements using measuring microscope and Profile
projector - Angular measuring instruments – Bevel protractor, Clinometer, Angle gauges, Precision
level, Sine bar, Autocollimator, Angle dekkor, Alignment telescope.
Measurement of Screw threads - Single element measurements – Pitch Diameter, Lead, Pitch.
Measurement of Gears – purpose – Analytical measurement – Runout, Pitch variation, Tooth profile,
Tooth thickness, Lead – Functional checking – Rolling gear test.
SYLLABUS
UNIT III TOLERANCE ANALYSIS 9
Tolerancing – Interchangeability, Selective assembly, Tolerance representation, Terminology, Limits
and Fits, Problems (using tables); Design of Limit gauges, Problems. Tolerance analysis in
manufacturing, Process capability, tolerance stackup, tolerance charting.
UNIT IV METROLOGY OF SURFACES 9
Fundamentals of GD & T- Conventional vs Geometric tolerance, Datums, Inspection of geometric
deviations like straightness, flatness, roundness deviations, etc. Simple problems – Measurement of
Surface finish – Functionality of surfaces, Parameters, Comparative, Stylus based and Optical
Measurement techniques, Filters, Introduction to 3D surface metrology- Parameters.
UNIT V ADVANCES IN METROLOGY 9
Lasers in metrology - Advantages of lasers – Laser scan micrometers; Laser interferometers –
Applications – Straightness, Alignment; Ball bar tests, Computer Aided Metrology - Basic concept
of CMM – Types of CMM – Constructional features – Probes – Accessories – Software –
Applications – Multisensor CMMs. Machine Vision - Basic concepts of Machine Vision System –
Elements – Applications - On-line and in-process monitoring in production - Computed tomography
– White light Scanners. TOTAL = 45 PERIODS
Upon completion of this course, the students will be able to:
1. Explain the importance of measurements in engineering and the factors affecting measurements and to
estimate measurement uncertainty.
2. Apply the working principle and applications of various linear and angular measuring instruments and basic
concepts of measurement of assembly and transmission elements.
3. Interpret the various tolerance symbols given in engineering drawings to choose the appropriate manufacturing
process.
4. Apply the principles and methods of form and surface metrology.
5. Apply the advances in measurements for quality control in manufacturing Industries.
TEXT BOOKS:
1. Dotson Connie, “Dimensional Metrology”, Cengage Learning, First edition, 2012.
2. Mark Curtis, Francis T. Farago, “Handbook of Dimensional Measurement”, Industrial Press, Fifth edition, 2013.
REFERENCES:
1. Ammar Grous, J., “Applied Metrology for Manufacturing Engineering”, Wiley-ISTE, 2011.
2. Galyer, J.F.W. and Charles Reginald Shotbolt, “Metrology for Engineers”, Cengage Learning EMEA, 5th revised edition, 1990.
3. National Physical Laboratory Guide No. 40, No. 41, No. 42, No. 43, No. 80, No. 118, No. 130, No. 131. http://www.npl.co.uk (relevant to syllabus).
4. Raghavendra, N.V. and Krishnamurthy, L., “Engineering Metrology and Measurements”, Oxford University Press, 2013.
5. Venkateshan, S. P., “Mechanical Measurements”, Second edition, John Wiley & Sons, 2015.
UNIT I – BASICS OF METROLOGY
Measurement – Need, Process, Role in quality
control; Factors affecting measurement - SWIPE;
Errors in Measurements – Types – Control –
Measurement uncertainty – Types, Estimation,
Problems on Estimation of Uncertainty, Statistical
analysis of measurement data, Measurement
system analysis, Calibration of measuring
instruments, ISO standards
MEASUREMENTS - Introduction
• Measurement is the process of comparing an unknown quantity (input) with a pre-defined standard and expressing the result (output).
• Metrology is the science of measurement.
• Metrology is also concerned with industrial inspection and its various techniques.
• For every kind of quantity measured, there must be a unit to measure it.
MEASUREMENTS - Process
1. Measurand – the physical quantity, such as length, weight, or angle, to be measured (the unknown quantity).
2. Comparison/Reference – the process of comparing the measurand (physical quantity) with a known standard for evaluation.
3. Standard/Reference – the physical quantity or property to which quantitative comparisons are to be made, which is internationally accepted (the known quantity).
[Block diagram: MEASURAND (unknown quantity) and STANDARD (known quantity) → COMPARISON PROCESS → RESULT (numerical value)]
NEED FOR MEASUREMENT
1. To convert physical parameters into meaningful numbers.
2. To determine the true dimensions of a part.
3. To increase our knowledge and understanding of the world.
4. To ensure public health and human safety.
5. To test whether the elements that constitute the system function as per the design.
6. To evaluate the performance of a system.
7. To ensure interchangeability with a view to promoting mass production.
8. To establish the validity of a design and to find new data and new designs.
COMPONENTS OF GENERALIZED MEASUREMENT SYSTEM
A generalized measurement system consists of the following
components:
1. Primary Sensing Element
2. Variable Conversion Element
3. Variable Manipulation Element
4. Data Processing Element
5. Data Transmission System
6. Data Presentation Element
GENERALISED MEASURING SYSTEM
COMPONENTS OF GENERALIZED MEASUREMENT SYSTEM
1. Primary Sensing Element:
The primary sensing element receives signal of the physical quantity
to be measured as input. It converts the signal to a suitable form
(electrical, mechanical or other form), so that it becomes easier for other
elements of the measurement system, to either convert or manipulate it.
2. Variable Conversion Element:
The variable conversion element converts the output of the primary sensing element to a more suitable form. It is used only if necessary.
3. Variable Manipulation Element:
The variable manipulation element manipulates and amplifies the output of the variable conversion element. It also removes noise (if present) in the signal.
COMPONENTS OF GENERALIZED MEASUREMENT SYSTEM
4. Data Processing Element:
It processes the data signal received from the variable manipulation
element and produces suitable output.
5. Data Transmission System:
Data Transmission System is simply used for transmitting data from
one element to another. It acts as a communication link between
different elements of the measurement system.
6. Data Presentation Element:
It is used to present the measured physical quantity in a human-readable form to the observer. LED displays are most commonly used as data presentation elements in many measurement systems.
COMPONENTS OF GENERALIZED MEASUREMENT SYSTEM – Example
[Block diagram: measurement of temperature – physical quantity (temperature) → sensing element (bulb) → conversion element (pressure) → manipulation element → transmission element → conversion element → presentation element]
STANDARDS
 In metrology (the science of measurement), a standard is an object or system that bears a defined relationship to a unit of measurement of a physical quantity.
 Depending on functions and applications, standards of measurement are classified as follows:
(i) International Standards
(ii) Primary Standards
(iii) Secondary Standards
(iv) Working Standards
i. International Standards
 Defined by International agreement
 Periodically evaluated & checked by absolute measurements in terms
of fundamental units of physics
 represent certain units of measurement to the closest possible
accuracy attainable by the science and technology of measurement
 These standards are not available to ordinary users for measurement and calibration.
ii. Primary Standards
 Main function is the calibration and verification of secondary standards.
 These are maintained at the National Standards Laboratories in different countries. For India, it is the National Physical Laboratory, New Delhi.
 The primary standards are not available for use outside the National Laboratory.
 These primary standards are absolute standards of high accuracy that can be used as ultimate reference standards to check, calibrate and certify the secondary standards.
iii. Secondary Standards
 Basic reference standards used by the measurement and calibration laboratories in industries.
 These standards are maintained by the particular industry to which they belong.
 Each industry has its own secondary standard.
 Each laboratory periodically sends its secondary standard to the national standards laboratory for calibration and comparison against the primary standard.
 After comparison and calibration, the National Standards Laboratory returns the secondary standards to the particular industrial laboratory with a certification of measuring accuracy in terms of primary standards.
iv. Working Standards
 Main tools of a measuring laboratory.
 Used to check and calibrate laboratory instruments for accuracy and performance.
 For example, the manufacture of mechanical components such as shafts, bearings and gears uses working standards for checking the component dimensions. Example: a plug gauge is used for checking the bore diameter of bearings.
TYPES OF MEASUREMENTS /
METHODS OF MEASUREMENTS
1. Direct Comparison 6. Contact Method
2. Indirect Comparison 7. Transposition Method
3. Comparative Method 8. Complementary Method
4. Coincidence Method 9. Deflection Method
5. Fundamental Method 10. Contactless method
1. Direct Method
 Measurements are directly obtained.
Ex.: Vernier caliper, scales.
2. Indirect Method
 Obtained by measuring other quantities.
Ex: Measurement of strain induced in a
bar due to the applied force
3. Comparative Method
The measured quantity is compared with another known value of the same quantity.
Ex: Comparators.
4. Coincidence Method:
Measurements coincide with certain
lines and signals. Ex: Comparators.
5. Fundamental Method:
Measuring a quantity directly in accordance with the definition of that quantity.
6. Transposition Method:
The quantity to be measured is first balanced by a known value and then balanced by another new value.
Ex: Determination of mass by balancing methods.
7. Complementary Method:
The value of quantity to be measured is
combined with known value of the same
quantity.
Ex: Determination of the volume of a
solid by liquid displacement Volume.
8. Deflection Method:
The value to be measured is directly indicated by the deflection of a pointer.
Ex: Pressure Measurement.
9. Contact Method:
The sensor or measuring tip touches the surface to be measured.
Ex: Vernier caliper.
10. Contactless method:
There is no direct contact with the surface to be measured.
Ex: Measurement by optical instruments.
TYPES OF MEASURING INSTRUMENTS
1. Deflection and Null type instruments
2. Analog and Digital instruments
3. Active and passive type instruments
4. Automatic and manually operated instruments
5. Absolute and secondary instruments
6. Contacting and non-contacting instruments
7. Intelligent instruments
1. Deflection and Null type instruments
 Deflection type: the quantity to be measured (e.g., the weight of an object) is indicated by the deflection or movement of a pointer on a graduated scale. Ex.: Spring balance.
 Null type: the effect caused by the quantity to be measured is nullified. For example, in the measurement of weight by an ordinary beam balance as shown in the figure, the unknown weight placed on one side causes the beam and the pointer to deflect, and known weights are added to the other side until the deflection is nullified. Ex.: Beam balance.
2. Analog and Digital Instruments
3. Active and Passive Type Instruments
 In active instruments, the quantity being measured modulates the magnitude of an external power input source, which in turn produces the measurement. In this type of instrument, an external energy source is present apart from the quantity to be measured.
 In passive instruments, the output is produced entirely by the quantity being measured.
4. Manual and Automatic Instruments
 Manual instruments require the services of a
human operator.
 When the process of null balance is automated, the instrument is termed an automatic instrument.
5. Absolute and Secondary Instruments
 Absolute instruments are those which give the value of the quantity to be measured in terms of the constants of the instrument and their deflection only.
 Secondary Instrument shows deflection
directly in terms of electrical quantity like
voltage, current, power and frequency.
 These instruments are calibrated by
comparison with an absolute instrument.
6. Contacting and Non-Contacting Instruments
7. Intelligent Instrument
 Microprocessors are incorporated into the measuring instrument.
ACCURACY AND PRECISION
Accuracy is the ability of the instrument to measure close to the true value (conformity).
Precision refers to how closely individual measurements agree
with each other (Repeatability).
FACTORS AFFECTING ACCURACYAND PRECISION
OF A MEASURING SYSTEM
• A measuring system is made up of five basic elements (SWIPE). These are:
1. Standard
2. Workpiece
3. Instrument
4. Person
5. Environment
FACTORS AFFECTING ACCURACY OF A MEASURING SYSTEM
1. Standard
Coefficient of thermal expansion
Stability with time
Elastic properties
Position, etc.
2. Workpiece
Cleanliness
Surface finish
Surface defects
Hidden geometry, etc.
3. Instrument
Inadequate amplification
Scale error
Deformation while handling heavy workpieces
Calibration error
Repeatability & readability
FACTORS AFFECTING ACCURACY OF A MEASURING SYSTEM
4. Person
Training and skill
Sense of precision appreciation
Ability to select the measuring instrument and standard
Attitude towards personal accuracy achievement
Planning the measurement technique to achieve minimum cost consistent with precision
5. Environment
Temperature, pressure and humidity
Clean surroundings and minimum vibration
Adequate illumination
Temperature equalization between standard, workpiece and instrument
ERRORS IN MEASUREMENT
What is Error in Measurement?
• Measurement error (observational error) is the difference between a measured (actual) value and its true value.
• True size – the theoretical size of a dimension, which is free from errors.
• Actual size – the size obtained through measurement, with permissible error.
Errors in Measurements
• Error in Measurement = Measured Value − True Value
• The error can be expressed or evaluated either as an absolute or
as a relative error.
Absolute error
• True absolute error - It is the algebraic difference between the
result of measurement and the conventional true value of the
quantity measured.
• Apparent absolute error - If the series of measurement are made
then the algebraic difference between one of the results of
measurement and the arithmetical mean is known as apparent
absolute error.
Relative Error
• It is the quotient of the absolute error and the value of
comparison (may be true value or the arithmetic mean of a series
of measurements) used for calculation of the absolute error.
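A minimal numerical sketch of the error definitions above (not from the slides; the readings and the 25.000 mm reference value are invented for illustration), showing how absolute and relative error are computed in Python:

    # Absolute error = measured value - conventional true value.
    # Relative error = absolute error / value of comparison.

    def absolute_error(measured: float, true_value: float) -> float:
        """Algebraic difference between the measured and the true value."""
        return measured - true_value

    def relative_error(measured: float, true_value: float) -> float:
        """Quotient of the absolute error and the value of comparison."""
        return (measured - true_value) / true_value

    if __name__ == "__main__":
        measured = 25.012     # mm, hypothetical micrometer reading
        true_value = 25.000   # mm, conventional true value (e.g. a gauge block)
        print(f"Absolute error: {absolute_error(measured, true_value):+.3f} mm")
        print(f"Relative error: {relative_error(measured, true_value):+.4%}")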
TYPES OF ERRORS IN MEASUREMENT
Types of Errors in Measurements
Measurement errors are broadly classified as:
1. Gross / blunder errors
2. Systematic errors – instrumental, environmental, observational, and theoretical errors
3. Random errors
TYPES OF ERRORS IN MEASUREMENT
1) Gross or Blunder Errors:
This category of errors includes all human mistakes while reading and recording the readings. The best example of these errors is a person or operator reading a pressure gauge showing 1.01 N/m2 as 1.10 N/m2.
2) Measurement Error:
 The measurement error is the result of the variation of a measurement from the true value.
 Usually, measurement error consists of a random error and a systematic error.
TYPES OF ERRORS IN MEASUREMENT
a. Systematic Error (Controllable Error)
• A systematic error is a constant error that occurs under the same operating conditions.
• Systematic error is caused by any factor that systematically affects the measurement.
• Classification of systematic errors:
i. Instrumental Errors – calibration error
ii. Environmental Errors – temperature, pressure, humidity
iii. Observational Errors – parallax
iv. Theoretical Errors – Percentage
Types of Errors
I. Controllable or Systematic errors
• A type of error that deviates by a fixed amount from the true value of measurement.
• These types of errors are controllable in both their magnitude and their direction, and can be assessed and minimized if efforts are made to analyze them.
• In order to assess them, it is important to know all the sources of such errors.
• Examples of such errors include measurement of length using a metre scale, measurement of current with inaccurately calibrated ammeters, etc.
• Minimization of systematic errors increases the accuracy of measurement.
Types of Errors
1. Calibration errors
These types of errors are caused due to variation in the calibrated scale from its nominal
value.
2. Ambient conditions
• It is essential to maintain the ambient conditions at internationally accepted values of
standard temperature (20 ºC) and pressure (760 mmHg) conditions. A small difference
of 10 mmHg can cause errors in the measured size of the component.
• The most significant ambient condition affecting the accuracy of measurement is
temperature. An increase in temperature of 1 ºC results in an increase in the length of
C25 steel by 0.3 μm, and this is substantial when precision measurement is required.
• Humidity, thermal gradients, vibrations, and CO2 content of the air affect the
refractive index.
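The temperature effect quoted above can be checked with the linear-expansion relation ΔL = α · L · ΔT. The sketch below is illustrative only: the coefficient α ≈ 11.5 × 10⁻⁶ /°C for steel and the 25 mm nominal length are assumed typical values, not data from the slides.

    # Thermal-expansion error: Delta_L = alpha * L * Delta_T.
    # alpha and the 25 mm length are assumed, not slide data.

    ALPHA_STEEL = 11.5e-6   # 1/degC, assumed coefficient of linear expansion

    def thermal_expansion_error(length_mm: float, delta_t_c: float,
                                alpha: float = ALPHA_STEEL) -> float:
        """Return the change in length, in micrometres, for a temperature rise."""
        return alpha * length_mm * delta_t_c * 1000.0   # mm -> um

    if __name__ == "__main__":
        # A 25 mm steel workpiece measured 1 degC above the 20 degC standard:
        print(f"{thermal_expansion_error(25.0, 1.0):.2f} um")  # ~0.29 um, close to the 0.3 um quoted above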
Errors in measurements
3.Deformation of workpiece
• Any elastic body, when subjected to a load, undergoes elastic deformation.
The stylus pressure applied during measurement affects the accuracy of
measurement.
• Due to a definite stylus pressure, elastic deformation of the workpiece and deflection of the workpiece may occur. The magnitude of deformation depends on the applied load, the area of contact, and the mechanical properties of the material of the given workpiece. Therefore, during comparative measurement, one has to ensure that the applied measuring loads are the same.
Elastic deflection due to Stylus pressure
Errors in measurements
4. Avoidable errors
• These errors mainly occur due to misalignment of the workpiece centres. The error due to misalignment is caused when the centre line of the workpiece is not perpendicular to the lines of graduation or divisions engraved on the scale.
• Datum errors – Difference between the true value and the indicated value
• Reading errors – The errors occur due to mistakes committed by the observer.
Error due to poor contact
Poor contact between the working gauge or the instrument and the workpiece causes an error, as shown in the figure. To avoid such an error, an instrument with a wide area of contact should not be used while measuring irregular or curved surfaces, and the correct contact pressure must be applied. Instrument loading error is the difference between the value of the measurand before and after the measuring system is connected or contacted for measurement.
Errors in measurements
4. Avoidable errors
Parallax error
• Apparent shift in the position of an object when it is viewed from various angles. Parallax error occurs when the pointer on a scale is not observed along a line normal to the scale.
Misalignment Error
[Figure: Effect of misalignment]
These errors occur due to the inherent inaccuracies present in the measuring instruments. They may also be due to improper use, handling, or selection of the instrument. Wear on the micrometer anvils, or anvil faces not being perpendicular to the axis, results in misalignment, leading to inaccurate measurements. If the alignment is not proper, sine and cosine errors also contribute to the inaccuracies of the measurement.
Zero error
• Error in the datum (zero) reading of the instrument.
TYPES OF ERRORS IN MEASUREMENT
Random Errors (Uncontrollable Error)
• Random (or indeterminate) errors are caused by uncontrollable fluctuations in variables that affect experimental results.
• Random errors are caused by sudden changes in experimental conditions, noise, and tiredness of the working persons. These errors are either positive or negative.
• These errors may be reduced by taking the average of a large number of readings.
Errors in measurements
II. Random errors – Uncontrollable errors
• Random errors are non-consistent. Random errors provide a measure of random deviations when measurements of a physical quantity are carried out repeatedly. When a series of repeated measurements is made on a component under similar conditions, the values or results of the measurements vary.
• Specific causes for these variations cannot be determined, since these variations are unpredictable and uncontrollable by the experimenter and are random in nature.
• They are of variable magnitude and may be either positive or negative, and can be evaluated statistically.
• Random errors can be minimized by calculating the average of a large number of observations (see the sketch below).
• Random errors limit the precision of the instrument.
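A minimal sketch of the statistical treatment suggested above, using invented repeated readings: the scatter of individual readings is summarised by the standard deviation, and the scatter of the mean falls as the number of readings grows, which is why averaging reduces the effect of random errors.

    # Repeated readings of the same dimension (hypothetical values, in mm).
    import statistics

    readings_mm = [25.003, 24.998, 25.001, 25.004, 24.997,
                   25.002, 25.000, 24.999, 25.003, 25.001]

    mean = statistics.mean(readings_mm)        # best estimate of the value
    s = statistics.stdev(readings_mm)          # scatter of individual readings
    s_mean = s / len(readings_mm) ** 0.5       # scatter of the averaged result

    print(f"mean = {mean:.4f} mm, s = {s:.4f} mm, s_mean = {s_mean:.4f} mm")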
Errors in measurements
• Sources of Random errors:
1. Presence of transient fluctuations in friction in the measuring instrument.
2. Play in the linkages of the measuring instrument.
3. Error in the operator’s judgment in reading the fractional part of engraved scale divisions.
4. Operator’s inability to note the readings because of fluctuations during measurement.
5. Positional errors associated with the measured object and standard, arising due to small variations in setting.
Distinction between Systematic and Random Error
Metrological terminologies
True value (of a quantity) - The true value of a quantity is defined as the
value consistent with its definition. This implies that there are no
measurement errors in the realization of the definition.
• Example - Density of a substance is defined as mass per unit volume. If mass
and volume of the substance could be determined without making
measurement errors, then the true value of the density can be obtained.
• Unfortunately in practice both these quantities cannot be determined without
experimental error. Therefore the true value of a quantity cannot be
determined experimentally.
Nominal value and conventional true value - The nominal value is the approximate or rounded-off value of a material measure or characteristic of a measuring instrument.
• For example, when we refer to a resistor as 100 ohms or a weight as 1 kg, we are using their nominal values. Their exact values, known as conventional true values, may be 99.98 ohms and 1.0001 kg respectively.
Error - The difference between the result of a measurement and its true value is known as the error of the measurement. Since a true value cannot be determined, the error as defined cannot be determined either. A conventional true value is therefore used in practice to determine an error.
Metrological terminologies
• Accuracy of a measurement is the degree of its closeness to the true value.
• The precision of a measurement is the degree of scatter of the measurement result when the measurement is repeated a number of times under specified conditions.
• Trueness is the closeness of the mean of a set of measurement results to the actual (true) value.
• The field of statistics prefers to use the terms bias and variability instead of accuracy and precision: bias is the amount of inaccuracy and variability is the amount of imprecision.
As per ISO 5725-1
Metrological terminologies
• The vertical axis represents the frequency of the measurement result and the horizontal axis represents the values of the results (X).
• The central vertical line represents the mean value of all the measurement results.
• The vertical line marked T represents the true value of the measurand.
• The difference between the mean value and the T line is the accuracy of the measurement.
• The standard deviation (σx) of all the measurement results about the mean value is a quantitative measure of the precision of the measurement.
• The line (S) represents the mean value obtained using a higher-level measurement standard.
Metrological Terminologies
• Accuracy of an instrument is always assessed in terms of error. The
difference between the true value and the mean value of the set of
readings on the same component is termed as an error. Error can
also be defined as the difference between the indicated value and
the true value of the quantity measured.
E = Vm − Vt where E is the error, Vm the measured value, and Vt the true value.
Accuracy of an instrument can also be expressed as % error. If an instrument measures Vm instead of Vt, then
% error = ((Vm − Vt) / Vt) × 100
• Two terms are associated with accuracy: sensitivity and
consistency. Sensitivity is the ability of the measuring equipment to
detect small variations in the quantity being measured.
• When successive readings of the measured quantity obtained from
the measuring instrument are same all the time, the equipment is
said to be consistent.
• A highly accurate instrument possesses both sensitivity and consistency.
Metrological terminologies
• In engineering, precision is often taken as three times the standard deviation (SD) of the measurements taken, representing the range within which 99.73% of measurements can occur. Precision includes repeatability and reproducibility (see the sketch below).
• Repeatability is the closeness of agreement (usually expressed as a standard deviation) between the results of successive measurements of the same measurand carried out under identical measurement conditions: the same process, same operator, same apparatus (instrument) used under the same conditions, the same place, the same quantitative expression of the results, and within a short time interval.
• Reproducibility is the closeness of agreement (usually expressed as SD) between the results of successive measurements of the same measurand carried out under different measurement conditions (to be specific: principle, method, operator, device, apparatus, reference material (standard), place, conditions, time (date), and quantitative expression of the result) over longer time periods.
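A simplified, illustrative sketch (invented readings, two hypothetical operators) of expressing repeatability and reproducibility as standard deviations and precision as 3 × SD; a full ISO 5725 treatment pools variances more carefully than this.

    # Repeatability: scatter under identical conditions (one operator, one setup).
    # Reproducibility: scatter when conditions change (here, a second operator).
    import statistics

    operator_a = [10.02, 10.01, 10.03, 10.02, 10.00]   # mm, same conditions
    operator_b = [10.05, 10.04, 10.06, 10.03, 10.05]   # mm, different operator/day

    repeatability_sd = statistics.stdev(operator_a)
    reproducibility_sd = statistics.stdev(operator_a + operator_b)

    print(f"repeatability SD   = {repeatability_sd:.4f} mm")
    print(f"reproducibility SD = {reproducibility_sd:.4f} mm")
    print(f"precision (3*SD)   = {3 * repeatability_sd:.4f} mm")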
Metrological Terminologies
Accuracy, Precision and Reliability
Metrological Terminologies
Accuracy, Precision and Reliability
• Accuracy is a comparison: the desired result is compared with the actual measurement; it is thus also called “quality of conformity”.
• Precision reports the dispersion of results, or the degree of repeatability within the manufacturing and measurement systems; it is thus called “quality of refinement”.
• Reliability shows the relationship between the predicted results and the actual results.
Metrological Terminologies
Accuracy and Cost
• The cost increases exponentially as the requirement of accuracy increases.
• If the tolerance of a component is to be measured, then the accuracy requirement will normally be 10% of the tolerance value.
• Demanding high accuracy unless it is absolutely required is not viable, as it increases the cost of the measuring equipment and hence the inspection cost.
• It makes the measuring equipment unreliable, because higher accuracy increases sensitivity.
Relationship of accuracy with cost
Metrological terminologies
• Sensitivity
• The ratio of the change of instrument indication to the change of
quantity being measured is termed as sensitivity. In other words, it
is the ability of the measuring equipment to detect small
variations in the quantity being measured.
• The permitted degree of sensitivity determines the accuracy of the instrument. An instrument cannot be more accurate than the permitted degree of sensitivity.
• Stability refers to the ability
of a measuring instrument to
constantly maintain its
metrological characteristics
with time.
Metrological Terminologies
Uncertainty
• Uncertainty is the quantification of the level of doubt we have about any measurement.
• Uncertainty means how certain or uncertain we are about the measured readings.
• No reading is 100% accurate; it has some level of uncertainty present.
• Uncertainty includes many factors such as the skill of the personnel, the resolution of the instrument, and the accuracy of the instrument.
• We should combine uncertainty and error to find whether a given instrument is within accuracy or within tolerance, or out of accuracy or tolerance.
• Most of the time we compare only ‘error and accuracy’ or ‘error and tolerance’, and the uncertainty factor is not considered. This is a wrong practice.
Metrological terminologies
• Drift is an undesired gradual departure of the instrument output over a period of time that is unrelated to changes in input, operating conditions or load.
• Inspection is defined as a procedure in which a part or product characteristic, such as a dimension, is examined to determine whether it conforms to the design specification. Basically, inspection is carried out to isolate and evaluate a specific design or quality attribute of a component or product.
• Linearity is an indicator of the consistency of measurements over the entire range of measurements. A system is said to be linear if proportional changes in input measurements produce proportional changes in output measurements.
• Magnification – In order to measure small differences in dimensions, the movement of the measuring tip in contact with the work must be magnified; therefore, the output signal from a measuring instrument is magnified many times to make it more readable.
• Range – It is defined as the difference between the lower and higher values that an instrument is able to measure. If an instrument has a scale reading of 0.01–100 mm, then the range of the instrument is 0.01–100 mm.
Metrological terminologies
• Readability refers to the ease with which the readings of a
measuring instrument can be read.
• Fine and widely spaced graduation lines improve the
readability.
• To make the micrometers more readable they are provided
with vernier scale or magnifying devices.
• Response time is the time required for an instrument to respond to
a change in input signal.
• Resolution is the fineness to which an instrument can be read.
• It is also called discrimination and is defined as the smallest increment of the input signal that a measuring system is capable of displaying.
• Smallest change in a quantity being measured that causes a
perceptible change in the corresponding indication.
Calibration
What is calibration?
Calibration is an activity of checking / verifying the accuracy of a
measuring instrument by comparison with a standard of higher
accuracy (gauge block). It may also include adjustment of the
instrument to bring it into alignment with the standard.
Purpose is to detect, correct and document the instrument
performance and uncertainty.
It also determines the traceability of the measurement.
Calibration
Why calibration is important?
• Instrument accuracy degrades due to normal wear and tear, mechanical and thermal shock, and hazardous manufacturing environments (e.g. oils, metal chips, etc.).
• Depending on the type of instrument and the level of environmental hazard, accuracy may degrade very quickly or over a long period of time.
• Four main reasons for having an instrument calibrated:
1. To establish and demonstrate traceability.
2. To ensure readings from the instrument are consistent with other measurements.
3. To determine the accuracy of the instrument readings.
4. To establish the reliability of the instrument, i.e. that it can be trusted.
Calibration of instruments
Calibration of External micrometer
Parameters to be verified:
1. Scale – deviation of reading
2. Zero error in the fully closed position
3. Flatness of the measuring surfaces, i.e. of the anvil and the spindle
4. Parallelism between the measuring surfaces
Calibration of instruments
Calibration of External micrometer
• Deviation of reading
• The accuracy of a micrometer over its working range is determined by measuring a set of gauge blocks of appropriate class, usually grade 1 or 2, using the micrometer under calibration.
• The anvil faces are cleaned and the micrometer is wound forward until the anvils touch; the reading should then be zero. If not, a spanner is used to rotate the barrel to line up the datum line with zero.
• The micrometer is firmly held in a rigid stand and the gauge block is introduced between the anvils. The movable anvil is rotated until it comes into contact with the datum surface of the gauge block.
• The micrometer reading is taken and compared with the actual length of the gauge block, obtained from its calibration certificate.
• The procedure is repeated at least three times at each major graduation and the mean value of the readings is computed (see the sketch below).
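A minimal sketch of the deviation-of-reading check described above, with hypothetical gauge-block sizes, certified lengths and micrometer readings; the mean of the repeated readings at each point is compared with the certified value.

    # Certified gauge-block length (mm) -> three micrometer readings (mm).
    # All numbers are invented for illustration.
    import statistics

    calibration_points = {
        5.0:  [5.001, 5.000, 5.002],
        10.0: [10.002, 10.001, 10.002],
        15.0: [15.003, 15.002, 15.004],
        25.0: [25.004, 25.005, 25.004],
    }

    for certified, readings in calibration_points.items():
        mean_reading = statistics.mean(readings)
        deviation_um = (mean_reading - certified) * 1000.0   # mm -> um
        print(f"{certified:6.3f} mm: mean {mean_reading:.4f} mm, "
              f"deviation {deviation_um:+.1f} um")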
Calibration of External micrometer
Flatness of measuring faces
• The flatness of the measuring faces is tested by using an optical flat (see figure). An optical flat of diameter 45 mm or 60 mm is generally used. After cleaning the measuring face of the anvil thoroughly, one surface of the optical flat is brought into contact with the measuring face. Unless the faces are perfectly flat, a number of coloured interference bands will be seen on their surfaces. The shape and the number of these bands indicate the degree of flatness of the face. A band represents a distance of 0.32 μm.
Testing of flatness of a micrometer measuring face
Calibration of External micrometer
Parallelism of measuring faces
• Parallelism of the measuring faces is determined by using either optical parallels or gauge blocks.
• The optical parallel is first placed on the measuring face of the anvil and carefully moved until the bands visible on the face are reduced to a minimum.
• The measuring face of the spindle is then brought into contact with the optical parallel. The number of bands visible on both faces gives the parallelism of the measuring faces.
• For micrometers of range 50 mm and larger it is more convenient to use gauge blocks. A gauge block is placed at five positions between the two measuring faces. The maximum difference of the five readings is taken as the parallelism of the measuring faces. Testing is preferably done at two angular positions of the anvil, by using two gauge blocks differing in length by 0.25 mm.
Calibration of External micrometer
Requirements of a micrometer as generally specified in international standards
Calibration of Vernier calipers
Deviation of reading - The deviation of reading is determined by the use of
gauge blocks or end bars or by the use of a measuring machine. The deviations
of reading at not less than five positions, equally spaced within the measuring
range of the main scale and the vernier scale, are determined.
Flatness of the measuring faces - The flatness of the measuring faces for both
external and internal measurement is determined by using either a dial test
indicator or an optical flat. When a dial test indicator is used, the test instrument
is laid horizontally on a surface plate and the tip of the dial test indicator is
traversed along the surface of the measuring face. The maximum deviation of
the indicator is taken as a measure of flatness.
Parallelism of the measuring faces - Parallelism of the measuring faces is
determined by inserting gauge blocks at different points on the jaws or by using
a measuring machine. Generally parallelism is determined at two measured
lengths, mid range and close to full range.
Squareness of the fixed face - The squareness of the fixed face for external
measurement with the guiding edge of the beam is determined by holding a
gauge block of comparable length against the edge of the beam and fixed
measuring face.
Calibration of instruments
Calibration report
• It provides a set of results that show the performance of the
instrument being calibrated in comparison to a reference
standard.
Calibration certificate
• It states the compliance of an instrument with the manufacturer’s
specification.
Calibration Certificate/ Report
The calibration report/certificate shall include the following additional information:
a) Date of receipt of the item and date of completion of the calibration work
b) Environmental conditions maintained during the measurements
c) Signature and title of the authorized person (authorized signatory) accepting responsibility for the report, and the date of issue
d) A statement of the accredited measurement capability relevant to the job under calibration
e) A symbol of NABL, or a statement, or both, clarifying the status of accreditation of the laboratory
f) The uncertainty of measurement
g) Evidence that the measurements are traceable to National/International Standards through an unbroken chain of accredited laboratories
Uncertainty
• Uncertainty – As per ISO 15189, uncertainty is defined as “a parameter associated with the result of a measurement that characterises the dispersion of the values” that could reasonably be attributed to the measured quantity.
• Together with a measurement result, the uncertainty makes it possible to provide a quantitative indication of the quality of the result. This helps to assess its reliability.
• When measurement thinking moves from the true-value approach toward the uncertainty approach, the measurement concepts have to be reconsidered.
• We know that neither the instruments nor the measurements provide “the” true value. It is therefore possible to differentiate two categories of errors, which should be treated differently. However, as no justified rule underlies the combination of systematic and random errors, they are combined into a total error, which characterizes the measurement result. The estimated upper limit of the total error is termed the uncertainty.
Uncertainty
• The estimated quality of a result is expressed as the uncertainty.
• Therefore, the uncertainty becomes an essential part of a measurement result.
• Measurement result = (estimate ± uncertainty), i.e., measurement of a measurand = a numerical value + a unit ± an uncertainty.
Example: length = (50 ± 1) m, i.e., 49 m to 51 m; or length = (50.000 ± 0.001) m, i.e., 49.999 m to 50.001 m.
• The uncertainty gives the limits of the range in which the “true”
value of the measurand is estimated to be at a given probability.
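A minimal sketch, with invented readings, of reporting a result as estimate ± uncertainty: the standard uncertainty of the mean is estimated from repeated readings (a Type A evaluation, introduced on the next slide) and expanded with an assumed coverage factor k = 2 (roughly 95% coverage for a normal distribution).

    # Report a measurement as (estimate +/- expanded uncertainty).
    import statistics

    readings_m = [49.999, 50.001, 50.000, 50.002, 49.998, 50.001]

    estimate = statistics.mean(readings_m)
    u = statistics.stdev(readings_m) / len(readings_m) ** 0.5   # standard uncertainty of the mean
    U = 2 * u                                                   # expanded uncertainty, assumed k = 2

    print(f"length = ({estimate:.4f} +/- {U:.4f}) m")
    print(f"i.e. between {estimate - U:.4f} m and {estimate + U:.4f} m")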
Uncertainty
The components of measurement uncertainty are grouped into two categories:
• Type A evaluation – evaluation of a component of measurement uncertainty by a statistical analysis of measured quantity values obtained under defined measurement conditions – random effects.
• Type B evaluation – evaluation of a component of measurement uncertainty determined by means other than a Type A evaluation – systematic effects.
Examples of Type B evaluation – evaluation based on information:
• associated with authoritative published quantity values;
• associated with the quantity value of a certified reference material;
• obtained from a calibration certificate;
• obtained from the accuracy class of a verified instrument;
• obtained from limits deduced through personal experience.
A sketch combining a Type A and a Type B component follows.
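A minimal sketch (assumed numbers) of combining a Type A component with one Type B component, here the instrument resolution treated as a rectangular distribution, by root-sum-of-squares in the usual GUM manner.

    # Combined standard uncertainty u_c = sqrt(u_A^2 + u_B^2), expanded with k = 2.
    import math
    import statistics

    readings_mm = [12.001, 12.003, 12.002, 12.004, 12.002]
    resolution_mm = 0.001                        # assumed instrument resolution

    u_a = statistics.stdev(readings_mm) / math.sqrt(len(readings_mm))   # Type A
    u_b = (resolution_mm / 2) / math.sqrt(3)     # Type B: rectangular distribution half-width

    u_combined = math.sqrt(u_a**2 + u_b**2)
    U_expanded = 2 * u_combined                  # coverage factor k = 2 assumed

    print(f"u_A = {u_a:.5f} mm, u_B = {u_b:.5f} mm")
    print(f"combined u = {u_combined:.5f} mm, expanded U (k=2) = {U_expanded:.5f} mm")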
Sources of uncertainty in a measurement
Human, Technical and Environment are the three major sources of uncertainties.
The list of the sources is:
a) incomplete definition of the measurand;
b) imperfect realization of the definition of the measurand;
c) non-representative sampling — the sample measured may not represent the defined measurand;
d) inadequate knowledge of the effects of environmental conditions on the measurement, or imperfect measurement of environmental conditions;
e) personal bias in reading analogue instruments;
f) finite instrument resolution or discrimination threshold;
g) inexact values of measurement standards and reference materials;
h) inexact values of constants and other parameters obtained from external sources and used in the data-reduction algorithm;
i) approximations and assumptions incorporated in the measurement method and procedure;
j) variations in repeated observations of the measurand under apparently identical conditions.

Contenu connexe

Similaire à UNIT-I BASICS OF METROLOGY.pptx

Similaire à UNIT-I BASICS OF METROLOGY.pptx (20)

Measurement and Metrology
Measurement and MetrologyMeasurement and Metrology
Measurement and Metrology
 
Guidelines to Understanding Design of Experiment and Reliability Prediction
Guidelines to Understanding Design of Experiment and Reliability PredictionGuidelines to Understanding Design of Experiment and Reliability Prediction
Guidelines to Understanding Design of Experiment and Reliability Prediction
 
ppt_mm.pptx
ppt_mm.pptxppt_mm.pptx
ppt_mm.pptx
 
Unit 1 CONCEPT OF MEASUREMENT
Unit 1 CONCEPT OF MEASUREMENTUnit 1 CONCEPT OF MEASUREMENT
Unit 1 CONCEPT OF MEASUREMENT
 
292741121 gauge rr_for_an_optical_micrometer_industrial_type_machine
292741121 gauge rr_for_an_optical_micrometer_industrial_type_machine292741121 gauge rr_for_an_optical_micrometer_industrial_type_machine
292741121 gauge rr_for_an_optical_micrometer_industrial_type_machine
 
3. ICS lab manual
3. ICS lab manual 3. ICS lab manual
3. ICS lab manual
 
DEFINITIONS- CALIBRATION.pptx
DEFINITIONS- CALIBRATION.pptxDEFINITIONS- CALIBRATION.pptx
DEFINITIONS- CALIBRATION.pptx
 
Basic of metrology
Basic of metrology Basic of metrology
Basic of metrology
 
Basics of metrology
Basics of metrology Basics of metrology
Basics of metrology
 
Syllabus
SyllabusSyllabus
Syllabus
 
Syllabus fgf
Syllabus fgfSyllabus fgf
Syllabus fgf
 
Syllabus
SyllabusSyllabus
Syllabus
 
Syllabus
SyllabusSyllabus
Syllabus
 
Syllabus
SyllabusSyllabus
Syllabus
 
Syllabus
SyllabusSyllabus
Syllabus
 
Syllabus
SyllabusSyllabus
Syllabus
 
Syllabus
SyllabusSyllabus
Syllabus
 
Syllabus
SyllabusSyllabus
Syllabus
 
Syllabus
SyllabusSyllabus
Syllabus
 
Syllabus
SyllabusSyllabus
Syllabus
 

Dernier

ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTiammrhaywood
 
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxPOINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxSayali Powar
 
Presiding Officer Training module 2024 lok sabha elections
Presiding Officer Training module 2024 lok sabha electionsPresiding Officer Training module 2024 lok sabha elections
Presiding Officer Training module 2024 lok sabha electionsanshu789521
 
Painted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of IndiaPainted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of IndiaVirag Sontakke
 
Mastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionMastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionSafetyChain Software
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentInMediaRes1
 
A Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformA Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformChameera Dedduwage
 
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdfBASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdfSoniaTolstoy
 
भारत-रोम व्यापार.pptx, Indo-Roman Trade,
भारत-रोम व्यापार.pptx, Indo-Roman Trade,भारत-रोम व्यापार.pptx, Indo-Roman Trade,
भारत-रोम व्यापार.pptx, Indo-Roman Trade,Virag Sontakke
 
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptxVS Mahajan Coaching Centre
 
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Sapana Sha
 
Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxNirmalaLoungPoorunde1
 
History Class XII Ch. 3 Kinship, Caste and Class (1).pptx
History Class XII Ch. 3 Kinship, Caste and Class (1).pptxHistory Class XII Ch. 3 Kinship, Caste and Class (1).pptx
History Class XII Ch. 3 Kinship, Caste and Class (1).pptxsocialsciencegdgrohi
 
CARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxCARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxGaneshChakor2
 
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxSOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxiammrhaywood
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxpboyjonauth
 
internship ppt on smartinternz platform as salesforce developer
internship ppt on smartinternz platform as salesforce developerinternship ppt on smartinternz platform as salesforce developer
internship ppt on smartinternz platform as salesforce developerunnathinaik
 
Blooming Together_ Growing a Community Garden Worksheet.docx
Blooming Together_ Growing a Community Garden Worksheet.docxBlooming Together_ Growing a Community Garden Worksheet.docx
Blooming Together_ Growing a Community Garden Worksheet.docxUnboundStockton
 
Science lesson Moon for 4th quarter lesson
Science lesson Moon for 4th quarter lessonScience lesson Moon for 4th quarter lesson
Science lesson Moon for 4th quarter lessonJericReyAuditor
 
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdfEnzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdfSumit Tiwari
 

Dernier (20)

ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
 
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxPOINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
 
Presiding Officer Training module 2024 lok sabha elections
Presiding Officer Training module 2024 lok sabha electionsPresiding Officer Training module 2024 lok sabha elections
Presiding Officer Training module 2024 lok sabha elections
 
Painted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of IndiaPainted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of India
 
Mastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionMastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory Inspection
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media Component
 
A Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformA Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy Reform
 
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdfBASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdf
 
भारत-रोम व्यापार.pptx, Indo-Roman Trade,
भारत-रोम व्यापार.pptx, Indo-Roman Trade,भारत-रोम व्यापार.pptx, Indo-Roman Trade,
भारत-रोम व्यापार.pptx, Indo-Roman Trade,
 
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
 
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
 
Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptx
 
History Class XII Ch. 3 Kinship, Caste and Class (1).pptx
History Class XII Ch. 3 Kinship, Caste and Class (1).pptxHistory Class XII Ch. 3 Kinship, Caste and Class (1).pptx
History Class XII Ch. 3 Kinship, Caste and Class (1).pptx
 
CARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxCARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptx
 
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxSOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptx
 
internship ppt on smartinternz platform as salesforce developer
internship ppt on smartinternz platform as salesforce developerinternship ppt on smartinternz platform as salesforce developer
internship ppt on smartinternz platform as salesforce developer
 
Blooming Together_ Growing a Community Garden Worksheet.docx
Blooming Together_ Growing a Community Garden Worksheet.docxBlooming Together_ Growing a Community Garden Worksheet.docx
Blooming Together_ Growing a Community Garden Worksheet.docx
 
Science lesson Moon for 4th quarter lesson
Science lesson Moon for 4th quarter lessonScience lesson Moon for 4th quarter lesson
Science lesson Moon for 4th quarter lesson
 
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdfEnzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
 

UNIT-I BASICS OF METROLOGY.pptx

  • 1. ME5552 - METROLOGY AND MEASUREMENTS R.VEERAPANDIAN, Teaching Fellow, CEG Campus, Anna University Chennai-25
  • 2. SYLLABUS ME5552 METROLOGY AND MEASUREMENTS L T P C 3 0 0 3 COURSE OBJECTIVES: The main learning objective of this course is to prepare the students for: 1. Explaining the importance of measurements in engineering and the factors affecting measurements and to estimate measurement uncertainty. 2. Applying the working principle and applications of various linear and angular measuring instruments and basic concepts of measurement of assembly and transmission elements. 3. Interpreting the various tolerance symbols given in engineering drawings to choose the appropriate manufacturing process. 4. Applying the principles and methods of form and surface metrology. 5. Applying the advances in measurements for quality control in manufacturing Industries
  • 3. SYLLABUS UNIT I BASICS OF METROLOGY 9 Measurement – Need, Process, Role in quality control; Factors affecting measurement - SWIPE; Errors in Measurements – Types – Control – Measurement uncertainty – Types, Estimation, Problems on Estimation of Uncertainty, Statistical analysis of measurement data, Measurement system analysis, Calibration of measuring instruments, ISO standards UNIT II MEASUREMENT OF LINEAR, ANGULAR DIMENSIONS AND ASSEMBLY AND TRANSMISSION ELEMENTS 9 Linear Measuring Instruments – Vernier caliper, Micrometer, Vernier height gauge, Depth Micrometer, Bore gauge, Telescoping gauge; Gauge blocks – Use and precautions, Comparators – Working and advantages; Opto-mechanical measurements using measuring microscope and Profile projector - Angular measuring instruments – Bevel protractor, Clinometer, Angle gauges, Precision level, Sine bar, Autocollimator, Angle dekkor, Alignment telescope. Measurement of Screw threads - Single element measurements – Pitch Diameter, Lead, Pitch. Measurement of Gears – purpose – Analytical measurement – Runout, Pitch variation, Tooth profile, Tooth thickness, Lead – Functional checking – Rolling gear test.
  • 4. SYLLABUS UNIT III TOLERANCE ANALYSIS 9 Tolerancing – Interchangeability, Selective assembly, Tolerance representation, Terminology, Limits and Fits, Problems (using tables); Design of Limit gauges, Problems. Tolerance analysis in manufacturing, Process capability, tolerance stackup, tolerance charting. UNIT IV METROLOGY OF SURFACES 9 Fundamentals of GD & T- Conventional vs Geometric tolerance, Datums, Inspection of geometric deviations like straightness, flatness, roundness deviations, etc. Simple problems – Measurement of Surface finish – Functionality of surfaces, Parameters, Comparative, Stylus based and Optical Measurement techniques, Filters, Introduction to 3D surface metrology- Parameters. UNIT V ADVANCES IN METROLOGY 9 Lasers in metrology - Advantages of lasers – Laser scan micrometers; Laser interferometers – Applications – Straightness, Alignment; Ball bar tests, Computer Aided Metrology - Basic concept of CMM – Types of CMM – Constructional features – Probes – Accessories – Software – Applications – Multisensor CMMs. Machine Vision - Basic concepts of Machine Vision System – Elements – Applications - On-line and in-process monitoring in production - Computed tomography – White light Scanners. TOTAL = 45 PERIODS
  • 5. Upon completion of this course, the students will be able to: 1. Explain the importance of measurements in engineering and the factors affecting measurements and to estimate measurement uncertainty. 2. Apply the working principle and applications of various linear and angular measuring instruments and basic concepts of measurement of assembly and transmission elements. 3. Interpret the various tolerance symbols given in engineering drawings to choose the appropriate manufacturing process. 4. Apply the principles and methods of form and surface metrology. 5. Apply the advances in measurements for quality control in manufacturing Industries. TEXT BOOKS: 1. Dotson Connie, “Dimensional Metrology”, Cengage Learning, First edition, 2012. 2. Mark Curtis, Francis T. Farago, “Handbook of Dimensional Measurement”, Industrial Press, Fifth edition, 2013. REFERENCES: 1. AmmarGrous, J “Applied Metrology for Manufacturing Engineering”, Wiley-ISTE, 2011. 2. Galyer, J.F.W. Charles Reginald Shotbolt, “Metrology for Engineers”, Cengage Learning EMEA; 5th revised edition, 1990. 3. National Physical Laboratory Guide No. 40, No. 41, No. 42, No. 43, No. 80, No. 118, No. 130, No. 131. http://www.npl.co.uk. (relevant to syllabus). 4. Raghavendra N.V. and Krishnamurthy. L., Engineering Metrology and Measurements, Oxford University Press, 2013. 5. Venkateshan, S. P., “Mechanical Measurements”, Second edition, John Wiley & Sons, 2015.
  • 6. UNIT I – BASICS OF METROLOGY Measurement – Need, Process, Role in quality control; Factors affecting measurement - SWIPE; Errors in Measurements – Types – Control – Measurement uncertainty – Types, Estimation, Problems on Estimation of Uncertainty, Statistical analysis of measurement data, Measurement system analysis, Calibration of measuring instruments, ISO standards
  • 7. MEASUREMENTS - Introduction • Measurement is the process of comparing an input with a pre-defined standard and giving the output. • Metrology is the science of measurement. • Metrology is also concerned with industrial inspection and its various techniques. • For every kind of quantity measured, there must be a unit to measure it.
  • 8. MEASUREMENTS - Process 1. Measurand, the physical quantity such as length, weight, or angle to be measured. 2. Reference, the known standard with which the measurand (physical quantity) is compared for evaluation. 3. Standard/Reference, the physical quantity or property to which quantitative comparisons are made, and which is internationally accepted. [Block diagram: MEASURAND (Unknown Quantity) and STANDARD (Known Quantity) enter the COMPARISON PROCESS, which gives the RESULT (Numerical Value).]
  • 9. NEED FOR MEASUREMENT 1. To convert physical parameters into meaningful numbers. 2. To determine the true dimensions of a part. 3. To increase our knowledge and understanding of the world. 4. To ensure public health and human safety. 5. To test if the elements that constitute the system function as per the design. 6. To evaluate the performance of a system. 7. To ensure interchangeability with a view to promoting mass production. 8. To establish the validity of design and for finding new data and new designs.
  • 10. COMPONENTS OF GENERALIZED MEASUREMENT SYSTEM A generalized measurement system consists of the following components: 1. Primary Sensing Element 2. Variable Conversion Element 3. Variable Manipulation Element 4. Data Processing Element 5. Data Transmission System 6. Data Presentation Element
  • 12. COMPONENTS OF GENERALIZED MEASUREMENT SYSTEM 1. Primary Sensing Element: The primary sensing element receives the signal of the physical quantity to be measured as input. It converts the signal to a suitable form (electrical, mechanical or other form), so that it becomes easier for the other elements of the measurement system to either convert or manipulate it. 2. Variable Conversion Element: The variable conversion element converts the output of the primary sensing element to a more suitable form. It is used only if necessary. 3. Variable Manipulation Element: The variable manipulation element manipulates and amplifies the output of the variable conversion element. It also removes noise (if present) in the signal.
  • 13. COMPONENTS OF GENERALIZED MEASUREMENT SYSTEM 4. Data Processing Element: It processes the data signal received from the variable manipulation element and produces a suitable output. 5. Data Transmission System: The data transmission system is simply used for transmitting data from one element to another. It acts as a communication link between different elements of the measurement system. 6. Data Presentation Element: It is used to present the measured physical quantity in a human readable form to the observer. LED displays are most commonly used as data presentation elements in many measurement systems.
  • 14. COMPONENTS OF GENERALIZED MEASUREMENT SYSTEM [Figure: the element chain illustrated with a temperature measurement — Physical Quantity: Temperature; Sensing Element: Bulb; Conversion Element: Pressure; followed by the Manipulation, Transmission and Presentation Elements.]
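The element chain on slides 12–14 can be mimicked in a few lines of code. The following Python sketch is purely illustrative: the bulb/pressure thermometer example and all numeric constants are assumptions made up for demonstration, not values from the slides.

    # Illustrative sketch of a generalized measurement system (all values hypothetical).
    def primary_sensing(temperature_c):
        # Bulb senses temperature and converts it to a fluid pressure (kPa) - assumed linear law.
        return 100.0 + 2.5 * temperature_c

    def variable_conversion(pressure_kpa):
        # Bourdon tube converts pressure into a mechanical displacement (mm).
        return 0.04 * pressure_kpa

    def variable_manipulation(displacement_mm):
        # Linkage and gearing amplify the displacement into pointer rotation (degrees).
        return 12.0 * displacement_mm

    def data_processing(angle_deg):
        # Scale calibration maps pointer rotation back to a temperature reading.
        return (angle_deg / 12.0 / 0.04 - 100.0) / 2.5

    def data_presentation(value, unit="degC"):
        # Presentation element shows the result in human-readable form.
        print(f"Measured value: {value:.2f} {unit}")

    # Data transmission is represented simply by passing values between the elements.
    signal = primary_sensing(25.0)
    signal = variable_conversion(signal)
    signal = variable_manipulation(signal)
    reading = data_processing(signal)
    data_presentation(reading)   # -> Measured value: 25.00 degC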
  • 15.
  • 16. STANDARDS  In metrology (the science of measurement), a standard is an object or system that bears a defined relationship to a unit of measurement of a physical quantity.  Depending on functions and applications, standards of measurement are classified as follows: (i) International Standards (ii) Primary Standards (iii) Secondary Standards (iv) Working Standards
  • 17. i. International Standards  Defined by international agreement  Periodically evaluated and checked by absolute measurements in terms of the fundamental units of physics  Represent certain units of measurement to the closest possible accuracy attainable by the science and technology of measurement  These standards are not available to ordinary users for measurement and calibration.
  • 18. ii. Primary Standards  Main function is the calibration and verification of secondary standards  These are maintained at the National Standards Laboratories in different countries. For India, it is the National Physical Laboratory at New Delhi.  The primary standards are not available for use outside the National Laboratory.  These primary standards are absolute standards of high accuracy that can be used as ultimate reference standards to check, calibrate and certify the secondary standards.
  • 19. iii. Secondary Standards  Basic reference standards used by the measurement and calibration laboratories in industries  These standards are maintained by the particular industry to which they belong  Each industry has its own secondary standard  Each laboratory periodically sends its secondary standard to the national standards laboratory for calibration and comparison against the primary standard  After comparison and calibration, the National Standards Laboratory returns the secondary standards to the particular industrial laboratory with a certification of measuring accuracy in terms of primary standards
  • 20. iv. Working Standards  Main tools of a measuring laboratory, used to check and calibrate laboratory instruments for accuracy and performance.  For example, in the manufacture of mechanical components such as shafts, bearings and gears, a standard called the working standard is used for checking the component dimensions. Example: a plug gauge is used for checking the bore diameter of bearings.
  • 21. TYPES OF MEASUREMENTS / METHODS OF MEASUREMENTS 1. Direct Comparison 2. Indirect Comparison 3. Comparative Method 4. Coincidence Method 5. Fundamental Method 6. Contact Method 7. Transposition Method 8. Complementary Method 9. Deflection Method 10. Contactless Method
  • 22. 1. Direct Method  Measurements are directly obtained. Ex.: Vernier Caliper, Scales. 2. Indirect Method  Obtained by measuring other quantities. Ex: Measurement of strain induced in a bar due to the applied force.
  • 23. 3. Comparative Method: The value is compared with another known value. Ex: Comparators. 4. Coincidence Method: Measurements coincide with certain lines and signals. Ex: Comparators. 5. Fundamental Method: Measuring a quantity directly in relation with the definition of that quantity.
  • 24. 6. Transposition Method: The quantity to be measured is first balanced by a known value and then balanced by another new value. Ex: Determination of mass by balancing methods. 7. Complementary Method: The value of the quantity to be measured is combined with a known value of the same quantity. Ex: Determination of the volume of a solid by liquid displacement.
  • 25. 8. Deflection Method: The value to be measured is directly indicated by the deflection of a pointer. Ex: Pressure Measurement. 9. Contact Method: The sensor/measuring tip touches the surface. Ex: Vernier Caliper. 10. Contactless Method: There is no direct contact with the surface to be measured. Ex: Measurement by optical instruments.
  • 26. TYPES OF MEASURING INSTRUMENTS 1. Deflection and Null type instruments 2. Analog and Digital instruments 3. Active and passive type instruments 4. Automatic and manually operated instruments 5. Absolute and secondary instruments 6. Contacting and non-contacting instruments 7. Intelligent instruments
  • 27. 1. Deflection and Null type instruments  In a deflection type instrument, the weight of the object is indicated by the deflection or movement of a pointer on a graduated scale. Ex. Spring Balance.  In a null type instrument, the effect caused by the quantity to be measured is nullified. For example, consider the measurement of weight by an ordinary beam balance as shown in fig. The unknown weight placed on one side causes the beam and the pointer to deflect. Ex. Beam Balance
  • 28. 2. Analog and Digital Instruments
  • 29. 3. Active and Passive Type Instruments  In active instruments, the quantity being measured simply modulates the magnitude of some external power source, which in turn produces the measurement.  In this type of instrument, an external energy input source is present apart from the quantity to be measured.  In passive type instruments, the output is produced entirely by the quantity being measured.
  • 30. 4. Manual and Automatic Instruments  Manual instruments require the services of a human operator.  When the process of null balance is automated, the instrument is termed an automatic instrument.
  • 31. 5. Absolute and Secondary Instruments  Absolute instruments are those which give the value of the quantity to be measured in terms of the constants of the instrument and their deflection only.  A secondary instrument shows the deflection directly in terms of the electrical quantity, such as voltage, current, power or frequency.  These instruments are calibrated by comparison with an absolute instrument.
  • 32. 6. Contacting and Non-Contacting Instruments
  • 33. 7. Intelligent Instruments  A microprocessor is incorporated into the measuring instrument.
  • 34. ACCURACY AND PRECISION Accuracy is the closeness of a measured value to the true value, i.e., the ability of the instrument to indicate the true value (conformity). Precision refers to how closely individual measurements agree with each other (repeatability).
  • 35. FACTORS AFFECTING ACCURACY AND PRECISION OF A MEASURING SYSTEM • A measuring system is made of five basic elements (SWIPE). These are: 1. Standard 2. Workpiece 3. Instrument 4. Person 5. Environment
  • 36. FACTORS AFFECTING ACCURACY OF A MEASURING SYSTEM 1. Standard: Coefficient of thermal expansion, stability with time, elastic properties, position, etc. 2. Workpiece: Cleanliness, surface finish, surface defects, hidden geometry, etc. 3. Instrument: Inadequate amplification, scale error, deformation while handling heavy workpieces, calibration error, repeatability and readability.
  • 37. FACTORS AFFECTING ACCURACY OF A MEASURING SYSTEM 4. Person: Training and skill; sense of precision appreciation; ability to select the measuring instrument and standard; attitude towards personal accuracy achievement; planning of the measurement technique to obtain consistent precision. 5. Environment: Temperature, pressure and humidity; clean surroundings and minimum vibration; adequate illumination; temperature equalization between standard, workpiece and instrument.
  • 38. ERRORS IN MEASUREMENT What is Error in Measurement? • Measurement Error (Observational Error) is the difference between a measured (actual) value and its true value. • True size  Theoretical size of a dimension which is free from errors. • Actual size  Size obtained through measurement with permissible error.
  • 39. Errors in Measurements • Error in Measurement = Measured Value − True Value • The error can be expressed or evaluated either as an absolute error or as a relative error. Absolute error • True absolute error - It is the algebraic difference between the result of measurement and the conventional true value of the quantity measured. • Apparent absolute error - If a series of measurements is made, then the algebraic difference between one of the results of measurement and the arithmetical mean is known as the apparent absolute error. Relative Error • It is the quotient of the absolute error and the value of comparison (may be the true value or the arithmetic mean of a series of measurements) used for calculation of the absolute error.
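The error definitions above translate directly into arithmetic. The short Python sketch below applies them to made-up readings; the numbers are assumptions for illustration only.

    # Error definitions from this slide, applied to illustrative numbers.
    true_value = 25.000        # conventional true value (mm), e.g. a gauge block
    measured_value = 25.008    # value indicated by the instrument (mm)

    absolute_error = measured_value - true_value      # true absolute error
    relative_error = absolute_error / true_value      # quotient of absolute error and true value

    # Apparent absolute error: deviation of one reading from the mean of a series.
    readings = [25.008, 25.006, 25.009, 25.007]
    mean_reading = sum(readings) / len(readings)
    apparent_absolute_error = readings[0] - mean_reading

    print(f"Absolute error : {absolute_error:+.3f} mm")
    print(f"Relative error : {relative_error:.2%}")
    print(f"Apparent absolute error of first reading: {apparent_absolute_error:+.4f} mm")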
  • 40. TYPES OF ERRORS IN MEASUREMENT [Chart: Measurement Errors are classified into Gross/Blunder Errors, Systematic Errors and Random Errors; Systematic Errors are further divided into Theoretical, Instrumental, Observational and Environmental Errors.]
  • 41. TYPES OF ERRORS IN MEASUREMENT 1) Gross or Blunder Errors: This category of errors includes all the human mistakes made while reading and recording the readings. The best example of these errors is a person or operator reading a pressure gauge showing 1.01 N/m2 as 1.10 N/m2. 2) Measurement Error:  The measurement error is the result of the variation of a measurement from the true value.  Usually, measurement error consists of a random error and a systematic error.
  • 42. TYPES OF ERRORS IN MEASUREMENT a. Systematic Error (Controllable Error) • A systematic error is a constant error that occurs under the same operating conditions. • Systematic error is caused by any factor that systematically affects the measurement. • Classification of systematic errors: i. Instrumental Errors – Calibration Error ii. Environmental Errors – Temperature, Pressure, Humidity iii. Observational Errors – Parallax iv. Theoretical Errors – Percentage
  • 43. Types of Errors I. Controllable or Systematic errors • A type of error that deviates by a fixed amount from the true value of measurement. • These types of errors are controllable in both their magnitude and their direction, and can be assessed and minimized if efforts are made to analyze them. • In order to assess them, it is important to know all the sources of such errors. • Examples of such errors include measurement of length using a metre scale, measurement of current with inaccurately calibrated ammeters, etc. • Minimization of systematic errors increases the accuracy of measurement.
  • 44. Types of Errors 1. Calibration errors These types of errors are caused by variation of the calibrated scale from its nominal value. 2. Ambient conditions • It is essential to maintain the ambient conditions at the internationally accepted values of standard temperature (20 ºC) and pressure (760 mmHg). A small difference of 10 mmHg can cause errors in the measured size of the component. • The most significant ambient condition affecting the accuracy of measurement is temperature. An increase in temperature of 1 ºC results in an increase in the length of C25 steel by 0.3 μm, and this is substantial when precision measurement is required. • Humidity, thermal gradients, vibrations, and the CO2 content of the air affect the refractive index.
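The temperature effect mentioned above follows the linear expansion relation ΔL = α · L · ΔT. The sketch below evaluates it for an assumed 100 mm carbon-steel workpiece; the expansion coefficient is a typical handbook value, not one taken from the slide.

    # Thermal expansion error: delta_L = alpha * L * delta_T (illustrative values).
    alpha_steel = 11.5e-6      # per degC, typical coefficient for carbon steel (assumed)
    length_mm = 100.0          # nominal length of the workpiece at 20 degC
    delta_T = 1.0              # deviation from the standard temperature of 20 degC

    delta_L_mm = alpha_steel * length_mm * delta_T
    print(f"Length change for {delta_T} degC deviation: {delta_L_mm * 1000:.2f} um")  # ~1.15 um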
  • 45. Errors in measurements 3. Deformation of workpiece • Any elastic body, when subjected to a load, undergoes elastic deformation. The stylus pressure applied during measurement affects the accuracy of measurement. • Due to a definite stylus pressure, elastic deformation of the workpiece and deflection of the workpiece shape may occur. The magnitude of deformation depends on the applied load, the area of contact, and the mechanical properties of the material of the given workpiece. Therefore, during comparative measurement, one has to ensure that the applied measuring loads are the same. Elastic deflection due to stylus pressure
  • 46. Errors in measurements 4. Avoidable errors • These errors mainly occur due to misalignment of the workpiece centres. The error due to misalignment is caused when the centre line of the workpiece is not perpendicular to the lines of graduations or divisions engraved on the scale. • Datum errors – Difference between the true value and the indicated value. • Reading errors – Errors that occur due to mistakes committed by the observer. • Error due to poor contact – Poor contact between the working gauge or the instrument and the workpiece causes an error, as shown in Fig. To avoid such an error, an instrument with a wide area of contact should not be used while measuring irregular or curved surfaces, and the correct contact pressure must be applied. Therefore, instrument loading error is the difference between the value of the measurand before and after the measuring system is connected or contacted for measurement.
  • 47. Errors in measurements 4. Avoidable errors Parallax error • Apparent shift in the position of an object when it is viewed from various angles. Parallax error occurs when the pointer on a scale is not observed along a line normal to the scale.
  • 48. Misalignment Error Effect of misalignment • These errors occur due to the inherent inaccuracies present in the measuring instruments. They may also be due to improper use, handling, or selection of the instrument. Wear on the micrometer anvils, or anvil faces not being perpendicular to the axis, results in misalignment, leading to inaccurate measurements. If the alignment is not proper, sine and cosine errors also contribute to the inaccuracies of the measurement. Zero error • Datum measurement error.
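The cosine error mentioned above can be estimated quickly: if the measuring axis is tilted by an angle θ to the dimension being measured, the indicated length is roughly L / cos θ. The values in this sketch are assumptions for illustration.

    import math

    # Cosine error: a length L measured with the instrument axis tilted by theta
    # reads approximately L / cos(theta). Values below are illustrative assumptions.
    true_length_mm = 100.0
    theta_deg = 1.0                       # misalignment angle

    theta = math.radians(theta_deg)
    indicated_mm = true_length_mm / math.cos(theta)
    cosine_error_um = (indicated_mm - true_length_mm) * 1000.0
    print(f"Indicated length : {indicated_mm:.4f} mm")
    print(f"Cosine error     : {cosine_error_um:.1f} um")   # ~15 um for 1 degree over 100 mm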
  • 49.
  • 50. TYPES OF ERRORS IN MEASUREMENT Random Errors (Uncontrollable Error) • Random (or indeterminate) errors are caused by uncontrollable fluctuations in variables that affect experimental results. • Random errors are caused by sudden changes in experimental conditions, noise, and tiredness of the working persons. These errors are either positive or negative. • These errors may be reduced by taking the average of a large number of readings.
  • 51. Errors in measurements II. Random errors – Uncontrollable errors • Random errors are non-consistent. Random errors provide a measure of random deviations when measurements of a physical quantity are carried out repeatedly. • When a series of repeated measurements is made on a component under similar conditions, the values or results of measurements vary. Specific causes for these variations cannot be determined, since these variations are unpredictable and uncontrollable by the experimenter and are random in nature. • They are of variable magnitude and may be either positive or negative. • They can be evaluated statistically. Random errors can be minimized by calculating the average of a large number of observations. • Random errors limit the precision of the instrument.
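The statement that averaging many readings reduces random error can be checked numerically: the scatter of the mean falls roughly as 1/√n. The simulated readings below are purely illustrative.

    import random
    import statistics

    random.seed(1)
    true_value = 50.000                        # mm, illustrative
    sigma = 0.002                              # mm, spread of the random error (assumed)

    # Simulate repeated readings corrupted only by random error.
    readings = [random.gauss(true_value, sigma) for _ in range(30)]

    mean_reading = statistics.mean(readings)
    std_single = statistics.stdev(readings)               # scatter of individual readings
    std_of_mean = std_single / len(readings) ** 0.5       # scatter of the average (~sigma/sqrt(n))

    print(f"Mean of 30 readings    : {mean_reading:.4f} mm")
    print(f"Std dev of one reading : {std_single:.4f} mm")
    print(f"Std dev of the mean    : {std_of_mean:.4f} mm  (smaller by ~1/sqrt(30))")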
  • 52. Errors in measurements • Sources of Random errors: 1. Presence of transient fluctuations in friction in the measuring instrument. 2. Play in the linkages of the measuring instruments. 3. Error in operator's judgment in reading the fractional part of engraved scale divisions. 4. Operator's inability to note the readings because of fluctuations during measurement. 5. Positional errors associated with the measured object and standard, arising due to small variations in setting. Distinction between Systematic and Random Error
  • 53.
  • 54. Metrological terminologies • True value (of a quantity) - The true value of a quantity is defined as the value consistent with its definition. This implies that there are no measurement errors in the realization of the definition. • Example - Density of a substance is defined as mass per unit volume. If the mass and volume of the substance could be determined without making measurement errors, then the true value of the density could be obtained. Unfortunately, in practice both these quantities cannot be determined without experimental error. Therefore the true value of a quantity cannot be determined experimentally. • Nominal value and conventional true value - The nominal value is the approximate or rounded-off value of a material measure or characteristic of a measuring instrument. For example, when we refer to a resistor as 100 ohms or a weight as 1 kg, we are using their nominal values. Their exact values, known as conventional true values, may be 99.98 ohms and 1.0001 kg respectively. • Error - The difference between the result of a measurement and its true value is known as the error of the measurement. Since a true value cannot be determined, the error as defined cannot be determined either. A conventional true value is therefore used in practice to determine an error.
  • 55. Metrological terminologies • Accuracy of a measurement is the degree of its closeness to the true value. • Precision of a measurement is the degree of scatter of the measurement result when the measurement is repeated a number of times under specified conditions. • Trueness is the closeness of the mean of a set of measurement results to the actual (true) value. • The field of statistics prefers to use the terms bias and variability instead of accuracy and precision: bias is the amount of inaccuracy and variability is the amount of imprecision (as per ISO 5725-1).
  • 56. Metrological terminologies • The vertical axis represents the frequency of the measurement result and the horizontal axis represents the values of the results (X). • The central vertical line represents the mean value of all the measurement results. • The vertical line marked T represents the true value of the measurand. • The difference between the mean value and the T line is the accuracy of the measurement. • The standard deviation (σx) of all the measurement results about the mean value is a quantitative measure of the precision of the measurement. • The line (S) represents the mean value obtained using a higher level measurement standard.
  • 57. Metrological Terminologies • Accuracy of an instrument is always assessed in terms of error. The difference between the true value and the mean value of the set of readings on the same component is termed the error. Error can also be defined as the difference between the indicated value and the true value of the quantity measured. E = Vm − Vt, where E is the error, Vm the measured value, and Vt the true value. Accuracy of an instrument can also be expressed as % error. If an instrument measures Vm instead of Vt, then % error = ((Vm − Vt) / Vt) x 100. • Two terms are associated with accuracy: sensitivity and consistency. Sensitivity is the ability of the measuring equipment to detect small variations in the quantity being measured. • When successive readings of the measured quantity obtained from the measuring instrument are the same all the time, the equipment is said to be consistent. • A highly accurate instrument possesses both sensitivity and consistency.
  • 58. Metrological terminologies • In engineering, precision is often taken as three times the standard deviation (SD) of the measurements taken, representing the range within which 99.73% of measurements can be expected to occur. Precision includes: • Repeatability - the closeness of agreement (usually expressed as a standard deviation) between the results of successive measurements of the same measurand carried out under identical measurement conditions: the same process, same operator, same apparatus (instrument) used under the same conditions, the same place, the same time, and the same quantitative expression of the results, using the same equipment and within a short time interval. • Reproducibility - the closeness of agreement (usually expressed as SD) between the results of successive measurements of the same measurand carried out under different measurement conditions (to be specific: principle, method, operator, device, apparatus, reference material (standard), place, conditions, time (date), and quantitative expression of the result) over longer time periods.
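Repeatability and the "3 × standard deviation" precision convention described above can be computed directly from a set of repeat readings. The readings in this sketch are assumed example data.

    import statistics

    # Ten repeat readings of the same measurand under identical conditions (illustrative values, mm).
    repeats = [24.998, 25.001, 25.000, 24.999, 25.002,
               25.001, 25.000, 24.998, 25.001, 25.000]

    repeatability_sd = statistics.stdev(repeats)   # repeatability expressed as a standard deviation
    precision_band = 3 * repeatability_sd          # ~99.73 % of readings expected within +/- this value

    print(f"Repeatability (1 SD): {repeatability_sd * 1000:.2f} um")
    print(f"Precision (3 SD)    : +/- {precision_band * 1000:.2f} um")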
  • 60. Metrological Terminologies Accuracy, Precision and Reliability • Accuracy is a comparison: the desired result is compared with the actual measurement; it is thus also called the "quality of conformity". • Precision reports the dispersion of results, or the degree of repeatability within the manufacturing and measurement systems; it is thus called the "quality of refinement". • Reliability shows the relationship between the predicted results and the actual results.
  • 61. Metrological Terminologies Accuracy and Cost • The cost increases exponentially as the requirement of accuracy increases. If the tolerance of a component is to be measured, then the accuracy requirement will normally be 10% of the tolerance values. Demanding high accuracy unless it is absolutely required is not viable, as it increases the cost of the measuring equipment and hence the inspection cost. It makes the measuring equipment unreliable, because higher accuracy increases sensitivity. • • • Relationship of accuracy with cost
  • 62. Metrological terminologies • Sensitivity • The ratio of the change of instrument indication to the change of the quantity being measured is termed sensitivity. In other words, it is the ability of the measuring equipment to detect small variations in the quantity being measured. • The permitted degree of sensitivity determines the accuracy of the instrument. An instrument cannot be more accurate than the permitted degree of sensitivity. • Stability refers to the ability of a measuring instrument to constantly maintain its metrological characteristics with time.
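Sensitivity, as defined above, is simply the ratio of the change in indication to the change in the measured quantity. A short illustration with assumed dial-indicator figures:

    # Sensitivity = change in indication / change in measured quantity (illustrative values).
    input_change_mm = 0.010          # change in the quantity being measured
    indication_change_div = 10.0     # corresponding change in scale divisions of the instrument

    sensitivity = indication_change_div / input_change_mm
    print(f"Sensitivity: {sensitivity:.0f} divisions per mm")   # 1000 div/mm -> 1 div = 1 um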
  • 63. Metrological Terminologies Uncertainty • Uncertainty is the quantification of the level of doubt we have about any measurement. • Uncertainty expresses how certain or uncertain we are about the measured readings. No reading is 100% accurate; it has some level of uncertainty present. • Uncertainty includes many factors such as personnel skill, resolution of the instrument, accuracy of the instrument, etc. • We should combine uncertainty and error to find whether a given instrument is within accuracy or within tolerance, or out of accuracy or tolerance. • Most of the time we compare only 'Error and Accuracy' or 'Error and Tolerance', and the uncertainty factor is not considered. This is a wrong practice.
  • 64. Metrological terminologies • Drift is an undesired gradual departure of the instrument output over a period of time that is unrelated to changes in input, operating conditions or load. • Inspection is defined as a procedure in which a part or product characteristic, such as a dimension, is examined to determine whether it conforms to the design specification. Basically, inspection is carried out to isolate and evaluate a specific design or quality attribute of a component or product. • Linearity is an indicator of the consistency of measurements over the entire range of measurements. A system is said to be linear if proportional changes in input measurements produce proportional changes in output measurements. • Magnification - In order to measure small differences in dimensions, the movement of the measuring tip in contact with the work must be magnified; therefore, the output signal from a measuring instrument is magnified many times to make it more readable. • Range - It is defined as the difference between the lower and higher values that an instrument is able to measure. If an instrument has a scale reading of 0.01–100 mm, then the range of the instrument is 0.01–100 mm.
  • 65. Metrological terminologies • Readability refers to the ease with which the readings of a measuring instrument can be read. Fine and widely spaced graduation lines improve the readability. To make micrometers more readable they are provided with a vernier scale or magnifying devices. • Response time is the time required for an instrument to respond to a change in the input signal. • Resolution - It is the fineness to which an instrument can be read. It is also called discrimination, and is defined as the smallest increment of the input signal that a measuring system is capable of displaying: the smallest change in a quantity being measured that causes a perceptible change in the corresponding indication.
  • 66. Calibration What is calibration? Calibration is an activity of checking / verifying the accuracy of a measuring instrument by comparison with a standard of higher accuracy (gauge block). It may also include adjustment of the instrument to bring it into alignment with the standard. Purpose is to detect, correct and document the instrument performance and uncertainty. It also determines the traceability of the measurement.
  • 67. Calibration Why is calibration important? • Instrument accuracy degrades due to normal wear and tear, mechanical and thermal shock, and hazardous manufacturing environments (e.g. oils, metal chips, etc.). • Depending on the type of instrument and the environment in which it is used, the accuracy may degrade very quickly or over a long period of time. • Four main reasons for having an instrument calibrated: 1. To establish and demonstrate traceability. 2. To ensure readings from the instrument are consistent with other measurements. 3. To determine the accuracy of the instrument readings. 4. To establish the reliability of the instrument, i.e. that it can be trusted.
  • 68. Calibration of instruments Calibration of External micrometer Parameters to be verified: 1. Scale – deviation of reading 2. Zero error in the fully closed position 3. Flatness of the measuring surfaces, i.e. of the anvil and the spindle 4. Parallelism between the measuring surfaces.
  • 69. Calibration of instruments Calibration of External micrometer • Deviation of reading • The accuracy of a micrometer over its working range is determined by measuring a set of gauge blocks of appropriate class, usually grade 1 or 2, using the micrometer under calibration. • The anvil faces are cleaned and the micrometer is wound forward until the anvils touch; check that it reads zero, and if not, use the spanner to rotate the barrel so that the datum line lines up with zero. • The micrometer is firmly held in a rigid stand and the gauge block is introduced between the anvils. The movable anvil is rotated until it comes into contact with the datum surface of the gauge block. The micrometer reading is taken and compared with the actual length of the gauge block, obtained from its calibration certificate. • The procedure is repeated at least three times at each major graduation and the mean value of the reading is computed.
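The deviation-of-reading check described above reduces to subtracting the certified gauge-block length from the mean micrometer reading at each calibration point. A minimal sketch with assumed gauge-block sizes and readings:

    # Deviation of reading: mean micrometer reading minus certified gauge-block length.
    # Gauge-block sizes and readings below are assumed example data (mm).
    calibration_points = {
        5.1:  [5.102, 5.101, 5.102],
        10.3: [10.301, 10.302, 10.301],
        15.0: [15.002, 15.001, 15.002],
        25.0: [25.003, 25.002, 25.003],
    }

    for certified, readings in calibration_points.items():
        mean_reading = sum(readings) / len(readings)
        deviation_um = (mean_reading - certified) * 1000.0
        print(f"Gauge block {certified:6.3f} mm -> deviation {deviation_um:+.1f} um")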
  • 70. Calibration of External micrometer Flatness of measuring faces • The flatness of the measuring faces is tested by using an optical flat (Fig.). An optical flat of diameter 45 mm or 60 mm is generally used. After cleaning the measuring face of the anvil thoroughly, one surface of the optical flat is brought into contact with the measuring face. Unless the faces are perfectly flat, a number of coloured interference bands will be seen on their surfaces. The shape and the number of these bands indicate the degree of flatness of the face. A band represents a distance of 0.32 μm. Testing of flatness of a micrometer measuring face
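Using the figure quoted above of about 0.32 μm per interference band, the flatness error indicated by an optical flat follows directly from the band count. The band count in this sketch is an assumed example.

    # Flatness from optical-flat interference bands: each band ~ half a wavelength (~0.32 um).
    half_wavelength_um = 0.32     # value quoted on the slide for the light source used
    bands_observed = 2            # number of curved bands seen on the anvil face (assumed example)

    flatness_error_um = bands_observed * half_wavelength_um
    print(f"Indicated flatness error: {flatness_error_um:.2f} um")   # 0.64 um for 2 bands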
  • 71. Calibration of External micrometer Parallelism of measuring faces • Parallelism of the measuring faces is determined by using either optical parallels or gauge blocks. • The optical parallel is first placed on the measuring face of the anvil and carefully moved until the bands visible on the face are reduced to a minimum. The measuring face of the spindle is then brought into contact with the optical parallel. The number of bands visible on both faces gives the parallelism of the measuring faces. • For micrometers of range 50 mm and larger it is more convenient to use gauge blocks. A gauge block is placed at five positions between the two measuring faces. The maximum difference of the five readings is taken as the parallelism of the measuring faces. • Testing is preferably done at two angular positions of the anvil, by using two gauge blocks differing in length by 0.25 mm.
  • 72. Calibration of External micrometer Requirement of a micrometer generally specified in international standards
  • 73. Calibration of Vernier calipers • Deviation of reading - The deviation of reading is determined by the use of gauge blocks or end bars, or by the use of a measuring machine. The deviations of reading at not less than five positions, equally spaced within the measuring range of the main scale and the vernier scale, are determined. • Flatness of the measuring faces - The flatness of the measuring faces for both external and internal measurement is determined by using either a dial test indicator or an optical flat. When a dial test indicator is used, the test instrument is laid horizontally on a surface plate and the tip of the dial test indicator is traversed along the surface of the measuring face. The maximum deviation of the indicator is taken as a measure of flatness. • Parallelism of the measuring faces - Parallelism of the measuring faces is determined by inserting gauge blocks at different points on the jaws or by using a measuring machine. Generally parallelism is determined at two measured lengths, mid range and close to full range. • Squareness of the fixed face - The squareness of the fixed face for external measurement with the guiding edge of the beam is determined by holding a gauge block of comparable length against the edge of the beam and the fixed measuring face.
  • 74. Calibration of instruments Calibration report • It provides a set of results that show the performance of the instrument being calibrated in comparison to a reference standard. Calibration certificate • It states the compliance of an instrument with the manufacturer’s specification.
  • 75. Calibration Certificate / Report The calibration report/certificate shall include the following additional information: a) Date of receipt of the item and date of completion of the calibration work b) Environmental conditions maintained during the measurements c) Signature and title of the authorized person (authorized signatory) accepting responsibility for the report, and the date of issue d) A statement of the accredited measurement capability relevant to the job under calibration e) A symbol of NABL, or a statement, or both, clarifying the status of accreditation of the laboratory f) The uncertainty of measurement g) Evidence that the measurements are traceable to National/International Standards through an unbroken chain of accredited laboratories.
  • 76. Uncertainty • Uncertainty is the quantification of the level of doubt we have about any measurement. • Uncertainty expresses how certain or uncertain we are about the measured readings. No reading is 100% accurate; it has some level of uncertainty present. • Uncertainty includes many factors such as personnel skill, resolution of the instrument, accuracy of the instrument, etc. • We should combine uncertainty and error to find whether a given instrument is within accuracy or within tolerance, or out of accuracy or tolerance. • Most of the time we compare only 'Error and Accuracy' or 'Error and Tolerance', and the uncertainty factor is not considered. This is a wrong practice.
  • 77. Uncertainty • Uncertainty - As per ISO 15189, uncertainty is defined as "a parameter associated with the result of a measurement that characterises the dispersion of the values" that could reasonably be attributed to the measured quantity. • Together with a measurement result, the uncertainty makes it possible to provide a quantitative indication of the quality of the result. This helps to assess its reliability. • When measurement thinking moves from the true-value approach toward the uncertainty approach, it leads to a reconsideration of the measurement concepts. We know that neither the instruments nor the measurements provide "the" true value. Therefore, it is possible to differentiate two categories of errors, and they should be considered differently. • However, as no justified rule underlies the combination of systematic and random errors, they result in a total error which characterizes the measurement result. The estimated upper limit of the total error is named the uncertainty.
  • 78. Uncertainty • The estimated quality of a result is expressed as the uncertainty. • Therefore, the uncertainty becomes an essential part of a measurement result: Measurement result = estimate ± uncertainty, i.e., measurement of a measurand = a numerical value + a unit + an uncertainty. Example: length = 50 m ± 1 m, i.e. 49 m – 51 m; length = (50.000 ± 0.001) m, i.e. 49.999 m – 50.001 m. • The uncertainty gives the limits of the range in which the "true" value of the measurand is estimated to lie at a given probability.
  • 79. Uncertainty The components of measurement uncertainty are grouped into two categories: • Type A evaluation – Evaluation of a component of measurement uncertainty by a statistical analysis of measured quantity values obtained under defined measurement conditions – Random effects. • Type B evaluation – Evaluation of a component of measurement uncertainty determined by means other than a Type A evaluation – Systematic effects. Examples – evaluation based on information: • Associated with authoritative published quantity values. • Associated with the quantity value of a certified reference material. • Obtained from a calibration certificate. • Obtained from the accuracy class of a verified instrument. • Obtained from limits deduced through personal experience.
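A minimal sketch of how a Type A component (statistics of repeat readings) and a Type B component (here the instrument resolution, treated as a rectangular distribution) might be combined into a combined and expanded uncertainty. The readings, the resolution and the coverage factor k = 2 are illustrative assumptions, not values from the slides.

    import statistics

    # Type A: standard uncertainty from n repeat readings (standard deviation of the mean).
    readings = [10.002, 10.001, 10.003, 10.002, 10.001, 10.002]   # mm, assumed repeat data
    n = len(readings)
    u_typeA = statistics.stdev(readings) / n ** 0.5

    # Type B: e.g. an instrument resolution of 0.001 mm treated as a rectangular distribution.
    resolution = 0.001
    u_typeB = (resolution / 2) / 3 ** 0.5

    # Combined standard uncertainty (root sum of squares) and expanded uncertainty (k = 2, ~95 %).
    u_combined = (u_typeA ** 2 + u_typeB ** 2) ** 0.5
    U_expanded = 2 * u_combined

    mean_value = statistics.mean(readings)
    print(f"Result: {mean_value:.4f} mm +/- {U_expanded:.4f} mm (k = 2)")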
  • 80. Sources of uncertainty in a measurement Human, technical and environmental factors are the three major sources of uncertainty. The list of the sources is: a) incomplete definition of the measurand; b) imperfect realization of the definition of the measurand; c) nonrepresentative sampling — the sample measured may not represent the defined measurand; d) inadequate knowledge of the effects of environmental conditions on the measurement, or imperfect measurement of environmental conditions; e) personal bias in reading analogue instruments; f) finite instrument resolution or discrimination threshold; g) inexact values of measurement standards and reference materials; h) inexact values of constants and other parameters obtained from external sources and used in the data-reduction algorithm; i) approximations and assumptions incorporated in the measurement method and procedure; j) variations in repeated observations of the measurand under apparently identical conditions.