Micrometer

A micrometer, sometimes known as a micrometer screw gauge, is a device incorporating a calibrated screw used widely for precise measurement of small distances in mechanical engineering and machining, as well as most mechanical trades, along with other metrological instruments such as dial, vernier, and digital calipers. Micrometers are often, but not always, in the form of calipers.

Gascoigne's Micrometer as drawn by Robert Hooke

The word micrometer is a neoclassical coinage from Greek micros, "small", and metron, "measure". The Merriam-Webster Collegiate Dictionary[1] says that English got it from French and that its first known appearance in English writing was in 1670. Neither the metre nor the micrometre nor the micrometer (device) as we know them today existed at that time. However, humans of that time did have much need for, and interest in, the ability to measure small things, and small differences; the word no doubt was coined in reference to this endeavor, even if it did not refer specifically to its present-day senses.

The first ever micrometric screw was invented by William Gascoigne in the 17th century, as an enhancement of the vernier; it was used in a telescope to measure angular distances between stars and the relative sizes of celestial objects.

Basic types

Large micrometer caliper.

Another large micrometer in use.

The topmost image shows the three most common types of micrometer; the names are based on their application:

Outside micrometer (aka micrometer caliper), typically used to measure wires, spheres, shafts and blocks.

Inside micrometer, used to measure the diameter of holes.

Depth micrometer, used to measure depths of slots and steps.

Specialized types

Each type of micrometer caliper can be fitted with specialized anvils and spindle tips for particular measuring tasks. For example, the anvil may be shaped in the form of a segment of screw thread, in the form of a v-block, or in the form of a large disc.

Universal micrometer sets come with interchangeable anvils, such as flat, spherical, spline, disk, blade, point, and knife-edge. The term universal micrometer may also refer to a type of micrometer whose frame has modular components, allowing one micrometer to function as outside mic, depth mic, step mic, etc. (often known by the brand names Mul-T-Anvil and Uni-Mike).

Blade micrometers have a matching set of narrow tips (blades). They allow, for example, the measuring of a narrow o-ring groove.

Pitch-diameter micrometers (aka thread mics) have a matching set of thread-shaped tips for measuring the pitch diameter of screw threads.

Limit mics have two anvils and two spindles, and are used like a snap gauge. The part being checked must pass through the first gap and must stop at the second gap in order to be within specification. The two gaps accurately reflect the top and bottom of the tolerance range.

Bore micrometers typically have a three-anvil head on a micrometer base and are used to accurately measure inside diameters.

Tube micrometers have a cylindrical anvil positioned perpendicularly to a spindle and are used to measure the thickness of tubes.

Micrometer stops are micrometer heads that are mounted on the table of a manual milling machine, bedways of a lathe, or other machine tool, in place of simple stops. They help the operator to position the table or carriage precisely. Stops can also be used to actuate kickout mechanisms or limit switches to halt an automatic feed system.

Ball micrometers have ball-shaped (spherical) anvils. They may have one flat and one ball anvil, in which case they are used for measuring tube wall thickness, distance of a hole to an edge, and other distances where one anvil must be placed against a rounded surface. They differ in application from tube micrometers in that they may be used to measure against rounded surfaces which are not tubes, but the ball anvil may also not be able to fit into smaller tubes as easily as a tube micrometer. Ball micrometers with a pair of balls can be used when single-tangential-point contact is desired on both sides. The most common example is in measuring the pitch diameter of screw threads (which is also done with conical anvils or the 3-wire method, the latter of which uses similar geometry to the pair-of-balls approach).

Bench micrometers are tools for inspection use whose accuracy and precision are around half a micrometre (20 millionths of an inch, "a fifth of a tenth" in machinist jargon) and whose repeatability is around a quarter micrometre ("a tenth of a tenth"). An example is the Pratt & Whitney Supermicrometer brand.

Digit mics are the type with mechanical digits that roll over.

Digital mics are the type that uses an encoder to detect the distance and displays the result on a digital screen.

V mics are outside mics with a small V-block for an anvil. They are useful for measuring the diameter of a circle from three points evenly spaced around it (versus the two points of a standard outside micrometer). An example of when this is necessary is measuring the diameter of 3-flute endmills and twist drills.

Operating principles

Animation of a micrometer used to measure an object (black) of length 4.14 mm

Micrometers use the principle of a screw to amplify small distances (that are too small to measure directly) into large rotations of the screw that are big enough to read from a scale. The accuracy of a micrometer derives from the accuracy of the thread-form that is at its heart. The basic operating principles of a micrometer are as follows:

1. The amount of rotation of an accurately made screw can be directly and precisely correlated to a certain amount of axial movement (and vice versa), through the constant known as the screw's lead (/liːd/). A screw's lead is the distance it moves forward axially with one complete turn (360°). (In most threads [that is, in all single-start threads], lead and pitch refer to essentially the same concept.)

2. With an appropriate lead and major diameter of the screw, a given amount of axial movement will be amplified in the resulting circumferential movement. For example, if the lead of a screw is 1 mm, but the major diameter (here, outer diameter) is 10 mm, then the circumference of the screw is 10π, or about 31.4 mm. Therefore, an axial movement of 1 mm is amplified (magnified) to a circumferential movement of 31.4 mm. This amplification allows a small difference in the sizes of two similar measured objects to correlate to a larger difference in the position of a micrometer's thimble.

In classic-style analog micrometers, the position of the thimble is read directly from scale markings on the thimble and shaft. A vernier scale is often included, which allows the position to be read to a fraction of the smallest scale mark. In digital micrometers, an electronic readout displays the length digitally on an LCD display on the instrument. There also exist mechanical-digit versions, like the style of car odometers where the numbers "roll over".
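To make the amplification concrete, here is a minimal Python sketch of the calculation described above; the function name is arbitrary, and the 1 mm lead and 10 mm diameter simply restate the example from the text.

    import math

    def circumferential_amplification(lead_mm, major_diameter_mm):
        """Ratio of thimble (circumferential) travel to axial travel per turn."""
        circumference = math.pi * major_diameter_mm  # ~31.4 mm for a 10 mm diameter
        return circumference / lead_mm

    # Example from the text: 1 mm lead on a 10 mm major-diameter screw.
    print(round(circumferential_amplification(1.0, 10.0), 1))  # 31.4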

The parts of a micrometer caliper, labeled. (Notice also that there is a handy decimal-fraction equivalents chart printed right on the frame of this inch-reading micrometer.)

A micrometer is composed of:

Frame: The C-shaped body that holds the anvil and barrel in constant relation to each other. It is thick because it needs to minimize flexion, expansion, and contraction, which would distort the measurement. The frame is heavy and consequently has a high thermal mass, to prevent substantial heating up by the holding hand/fingers. It is often covered by insulating plastic plates which further reduce heat transference. Explanation: if you hold the frame long enough so that it heats up by 10 °C, then the increase in length of any 10 cm linear piece of steel is of magnitude 1/100 mm, which is the typical accuracy range of micrometers (see the sketch after this list). Micrometers typically have a specified temperature at which the measurement is correct (often 20 °C [68 °F], which is generally considered "room temperature" in a room with HVAC). Toolrooms are generally kept at 20 °C [68 °F].

Anvil: The shiny part that the spindle moves toward, and that the sample rests against.

Sleeve / barrel / stock: The stationary round part with the linear scale on it, sometimes with vernier markings.

Lock nut / lock-ring / thimble lock: The knurled part (or lever) that one can tighten to hold the spindle stationary, such as when momentarily holding a measurement.

Screw: (not seen) The heart of the micrometer, as explained under "Operating principles". It is inside the barrel. (No wonder that the usual name for the device in German is Messschraube, literally "measuring screw".)

Spindle: The shiny cylindrical part that the thimble causes to move toward the anvil.

Thimble: The part that one's thumb turns, carrying graduated markings.

Ratchet stop: (not shown in illustration) Device on the end of the handle that limits applied pressure by slipping at a calibrated torque.
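As a rough check of the 1/100 mm figure given in the Frame entry, a small sketch using the standard linear-expansion relation; the expansion coefficient of about 12e-6 per kelvin is a typical handbook value for steel and is an assumption, not a number from the text.

    def thermal_growth_mm(length_mm, delta_t_c, alpha_per_k=12e-6):
        """Linear expansion: dL = alpha * L * dT (alpha ~12e-6 /K is typical for steel)."""
        return alpha_per_k * length_mm * delta_t_c

    # A 10 cm (100 mm) piece of steel warmed by 10 degrees C grows by roughly
    # 0.012 mm, i.e. on the order of 1/100 mm, as stated above.
    print(round(thermal_growth_mm(100.0, 10.0), 3))  # 0.012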

Inch system

Micrometer thimble showing 0.276 inch

The spindle of an inch-system micrometer has 40 threads per inch, so that one turn moves the spindle axially 0.025 inch (1/40 = 0.025), equal to the distance between two graduations on the frame. The 25 graduations on the thimble allow the 0.025 inch to be further divided, so that turning the thimble through one division moves the spindle axially 0.001 inch (0.025/25 = 0.001). Thus, the reading is given by the number of whole divisions that are visible on the scale of the frame, multiplied by 25 (the number of thousandths of an inch that each division represents), plus the number of that division on the thimble which coincides with the axial zero line on the frame. The result will be the diameter expressed in thousandths of an inch. As the numbers 1, 2, 3, etc., appear below every fourth sub-division on the frame, indicating hundreds of thousandths, the reading can easily be taken mentally.

Suppose the thimble were screwed out so that graduation 2, and three additional sub-divisions, were visible (as shown in the image), and that graduation 1 on the thimble coincided with the axial line on the frame. The reading then would be 0.200 + 0.075 + 0.001, or 0.276 inch.
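A minimal sketch of the inch-system arithmetic just described; the helper name is arbitrary, and the values reproduce the 0.276 inch example worked through above.

    def inch_reading(sleeve_divisions, thimble_division):
        """sleeve_divisions: whole 0.025-inch divisions visible on the sleeve;
        thimble_division: the 0.001-inch thimble graduation on the axial line."""
        return sleeve_divisions * 0.025 + thimble_division * 0.001

    # Example from the text: graduation 2 (eight divisions) plus three more
    # sub-divisions visible = 11 divisions, with thimble line 1 on the axial line.
    print(round(inch_reading(11, 1), 3))  # 0.276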

Metric system

Micrometer thimble reading 5.78 mm

The spindle of an ordinary metric micrometer has 2 threads per millimetre, and thus one complete revolution moves the spindle through a distance of 0.5 millimetre. The longitudinal line on the frame is graduated with 1 millimetre divisions and 0.5 millimetre subdivisions. The thimble has 50 graduations, each being 0.01 millimetre (one-hundredth of a millimetre). Thus, the reading is given by the number of millimetre divisions visible on the scale of the sleeve plus the particular division on the thimble which coincides with the axial line on the sleeve.

Suppose that the thimble were screwed out so that graduation 5, and one additional 0.5 subdivision, were visible (as shown in the image), and that graduation 28 on the thimble coincided with the axial line on the sleeve. The reading then would be 5.00 + 0.5 + 0.28 = 5.78 mm.
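The same arithmetic for a metric micrometer, again as a small illustrative sketch reproducing the 5.78 mm example above; the names are illustrative only.

    def metric_reading(sleeve_mm, thimble_division):
        """sleeve_mm: millimetres (and any 0.5 mm subdivision) read on the sleeve;
        thimble_division: 0-49, each graduation worth 0.01 mm."""
        return sleeve_mm + thimble_division * 0.01

    # Example from the text: 5.0 mm plus one 0.5 mm subdivision on the sleeve,
    # with thimble graduation 28 on the axial line.
    print(round(metric_reading(5.5, 28), 2))  # 5.78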

Vernier

Micrometer sleeve (with vernier) reading 5.783 mm

Some micrometers are provided with a vernier scale on the sleeve in addition to the regular graduations. These permit measurements within 0.001 millimetre to be made on metric micrometers, or 0.0001 inch on inch-system micrometers. The additional digit of these micrometers is obtained by finding the line on the sleeve vernier scale which exactly coincides with one on the thimble. The number of this coinciding vernier line represents the additional digit.

Thus, the reading for metric micrometers of this type is the number of whole millimetres (if any) and the number of hundredths of a millimetre, as with an ordinary micrometer, plus the number of thousandths of a millimetre given by the coinciding vernier line on the sleeve vernier scale. For example, a measurement of 5.783 millimetres would be obtained by reading 5.5 millimetres on the sleeve, and then adding 0.28 millimetre as determined by the thimble. The vernier would then be used to read the 0.003 (as shown in the image). Inch micrometers are read in a similar fashion.

Note: 0.01 millimetre = 0.000393 inch, and 0.002 millimetre = 0.000078 inch (78 millionths); alternatively, 0.0001 inch = 0.00254 millimetre. Therefore, metric micrometers provide smaller measuring increments than comparable inch-unit micrometers: the smallest graduation of an ordinary inch-reading micrometer is 0.001 inch, while the vernier type has graduations down to 0.0001 inch (0.00254 mm). When using either a metric or an inch micrometer without a vernier, smaller readings than those graduated may of course be obtained by visual interpolation between graduations.
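With the vernier digit added, a sketch reproducing the 5.783 mm example above; the function and argument names are illustrative only.

    def vernier_metric_reading(sleeve_mm, thimble_division, vernier_line):
        """Ordinary metric reading plus 0.001 mm per coinciding vernier line."""
        return sleeve_mm + thimble_division * 0.01 + vernier_line * 0.001

    # Example from the text: 5.5 mm on the sleeve, 0.28 mm on the thimble,
    # and vernier line 3 coinciding with a thimble graduation.
    print(round(vernier_metric_reading(5.5, 28, 3), 3))  # 5.783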

Torque repeatability via torque-limiting ratchets or sleeves

A micrometer reading is not accurate if the thimble is overtorqued. A useful feature of many micrometers is the inclusion of a torque-limiting device on the thimble, either a spring-loaded ratchet or a friction sleeve. Without this device, workers may overtighten the micrometer on the work, causing the mechanical advantage of the screw to squeeze the material or tighten the screw threads, giving an inaccurate measurement. However, with a thimble that will ratchet or friction-slip at a certain torque, the micrometer will not continue to advance once sufficient resistance is encountered. This results in greater accuracy and repeatability of measurements, most especially for low-skilled or semi-skilled workers, who may not have developed the light, consistent touch of a skilled user.

A micrometer closing on the workpiece presents an exaggerated analog to an interference fit (press fit). Just as it is very easy to press a fit on an extremely small diameter allowance (say, a shaft just one tenth larger than its hole [0.0001"]), it is very easy when measuring with a micrometer to squeeze the workpiece smaller (or expand the micrometer frame larger) by just one tenth, albeit in completely elastic fashion, with only a small amount of overtorquing of the thimble. This concept can seem counterintuitive to beginning students of machining and metrology, because our everyday concept of the rigidity (stiffness) of materials such as steel or aluminum is trained since childhood by the fact that, in everyday experience, the deformability of such rigid materials is entirely negligible, so close to zero that our brains are used to thinking of it as equal to zero. But in the context of micrometer measurements, its nonzero nature becomes noticeable (via amplification), in a way that students sometimes find surprisingly odd (counterintuitive).

Calibration: testing and adjusting

Testing

A standard ordinary one-inch micrometer has readout divisions of 0.001 inch and a rated accuracy of +/- 0.0001 inch[4] ("one tenth", in machinist parlance). Both the measuring instrument and the object being measured should be at room temperature for an accurate measurement; dirt, abuse, and operator skill are the main sources of error.[5] The accuracy of micrometers is checked by using them to measure gauge blocks, rods, or similar standards whose lengths are precisely and accurately known. If the gauge block is known to be 0.7500" +/- 0.00005" ("seven-fifty plus or minus fifty millionths", that is, "seven hundred fifty thou plus or minus half a tenth"), then the micrometer should measure it as 0.7500". If the micrometer measures 0.7503", then it is out of calibration. Cleanliness and low torque are especially important when calibrating: each tenth (that is, ten-thousandth of an inch), or hundredth of a millimetre, "counts". A mere speck of dirt, or a mere bit too much squeeze, obscures the truth of whether the instrument is able to read correctly. The solution is simply conscientiousness: cleaning, patience, due care and attention, and repeated measurements (good repeatability assures the calibrator that his/her technique is working correctly).

Calibration would record the error at approximately 5 points along the range. Only one can be adjusted to zero. If the micrometer is in good condition, then they are all so near to zero that the instrument seems to read essentially "dead-on" all along its range; no noticeable error is seen at any locale.
In contrast, on a worn-out micrometer (or one that was poorly made to begin with), one can "chase the error up and down the range", that is, move it up or down to any of various locales along the range by adjusting the barrel, but one cannot eliminate it from all locales at once.

Calibration can also include the condition of the tips (flat and parallel), any ratchet, and linearity of the scale.[6] Flatness and parallelism are typically measured with a gauge called an optical flat, a disc of glass or plastic ground with extreme accuracy to have flat, parallel faces, which allows light bands to be counted when the micrometer's anvil and spindle are against it, revealing their amount of geometric inaccuracy.

Commercial machine shops, especially those that do certain categories of work (military or commercial aerospace, nuclear power industry, and others), are required by various standards organizations (such as ISO, ANSI, ASME, ASTM, SAE, AIA, the U.S. military, and others) to calibrate micrometers and other gauges on a schedule (often annually), to affix a label to each gauge that gives it an ID number and a calibration expiration date, to keep a record of all the gauges by ID number, and to specify in inspection reports which gauge was used for a particular measurement.

Not all calibration is an affair for metrology labs. A micrometer can be calibrated on-site anytime, at least in the most basic and important way (if not comprehensively), by measuring a high-grade gauge block and adjusting to match. Even gauges that are calibrated annually and within their expiration timeframe should be checked this way every month or two, if they are used daily. They usually will check out OK as needing no adjustment.

The accuracy of the gauge blocks themselves is traceable through a chain of comparisons back to a master standard such as the international prototype metre. This bar of metal, like the international prototype kilogram, is maintained under controlled conditions at the International Bureau of Weights and Measures headquarters in France, which is one of the principal measurement standards laboratories of the world. These master standards have extreme-accuracy regional copies (kept in the national laboratories of various countries, such as NIST), and metrological equipment makes the chain of comparisons. Because the definition of the metre is now based on a light wavelength, the international prototype metre is not quite as indispensable as it once was. But such master gauges are still important for calibrating and certifying metrological equipment. Equipment described as "NIST traceable" means that its comparison against master gauges, and their comparison against others, can be traced back through a chain of documentation to equipment in the NIST labs. Maintaining this degree of traceability requires some expense, which is why NIST-traceable equipment is more expensive than non-NIST-traceable equipment. But applications needing the highest degree of quality control mandate the cost.

Adjustment

A micrometer that has been tested as described above, and has been found to be "off" (wrong, out of calibration), should be adjusted (recalibrated) so that it once again reads accurately. If the error originates from the parts of the micrometer being worn out of shape and size, then adjustment back to accuracy is not possible; rather, repair (grinding, lapping, or replacing of parts) is required in order to return to full accuracy.
However, it is more often the case that the parts are still OK (not yet worn out), but that they have been "knocked out" of the zeroed (calibrated) position. In this case, adjustment is possible, and is usually not difficult. On most micrometers, a small pin spanner is used to tweak the barrel a small amount (typically 1° to 5° or thereabouts), rotating it a bit relative to the frame, so that the zero line is moved slightly relative to the screw and thimble. (There is usually a small hole on the barrel to accept the spanner's pin.) This simple maneuver is all that is needed in most cases. The barrel moves tightly enough that it stays in position once shifted. After a few months or years (the interval depending on the amount, or carefulness, of usage), it should be checked again, as it may have been knocked, or gradually squeezed, out of position again, in which case it is once again adjusted back to zero.

It is good for all micrometer users to understand that although repair of micrometers (grinding, lapping, and so forth) requires toolmaking and instrument-making skills that take substantial training and experience to learn, simple adjustment back to zero is not difficult or complicated. Novice machinists, upon dropping their micrometer or banging it against the table too hard, might feel tempted to keep using it, because they assume that checking it and adjusting it is a bothersome or technically sophisticated operation. But understanding how to do accurate machining work requires understanding calibration as a simple and accessible operation that may need to be done frequently and is not a big deal. In this respect, machining is similar to chemistry and physics lab work: calibration of instruments needs the mystery removed from it, if usable results are to be assured.

Zero error

The answer is: main scale + dial scale - (zero error) = 4.00 + 0.29 - (+0.15) = 4.14 mm

the answer is =main scale + dial scale - (zero error) =4.00 + 0.05 - ( -0.09) = 4.14 mmZero erroris any nonzero reading when the jaws are closed. It is the calibration error of the deviceoften caused by knocks or overstrains.The way to use a micrometer with zero error is to use the formula 'actual reading = main scale + micro scale (zero error)'. One way to appreciate this concept, and to remember the "formula", is to compare this idea totare weight. If you are measuring the weight of some water, and you have a cup containing the water on aweighing scale, you must subtract the weight of the cup itself from your measurement, if you want to know the true weight of only the water itself. Similarly, if your length measurement on your micrometer "starts at one" instead of "starting at zero", then you must remember to subtract that one from your total value to yield the true length of the object by itself.The zero-error compensation formula described above is appropriate for anyone, such as a school student, who doesn't know how, or is not allowed, to adjust (recalibrate) the micrometer to eliminate the error. In contrast, machinists in a commercial manufacturing environment, upon discovering a zero error in their micrometer, would generally not continue using it and applying the "formula". Instead, they would take a moment to adjust the micrometer back to zero (that is, recalibrate it). But everyone, whether student or machinist, shouldunderstandthe compensation concept, even if they avoidusingit."Positive zero error" refers to the fact that when the jaws of the micrometer are just closed, the reading is a positive reading away from the actual reading of 0.00mm. If the reading is 0.15mm, the zero error is referred to as +0.15mm."Negative zero error" refers to the fact that when the jaws of the micrometer are just closed, the reading is a negative reading away from the actual reading of 0.00mm. If the reading is -0.09mm, the zero error is referred to as -0.09mm.