How are earthquakes measured?
A seismometer is an instrument that senses the earth's motion; a seismograph combines a seismometer with recording equipment to obtain a permanent record of the motion. From this record scientists can calculate how much energy an earthquake released, which is one way to determine its magnitude. Magnitude calculations draw on several seismograms recorded both near and far from the earthquake's source; readings from different stations and instruments should converge on the same value, so each earthquake is assigned a single magnitude.
To determine the
strength and location of earthquakes, scientists use a recording instrument
known as a seismograph. A seismograph is equipped with sensors called
seismometers that can detect ground motions caused by seismic waves from both
near and distant earthquakes. Some seismometers are capable of detecting ground
motion as small as one billionth of a meter, or about 40 billionths of an inch.
A seismograph produces wavy lines that reflect the size of the seismic waves passing beneath it. This record of the waves, called a seismogram, is imprinted on paper, film, or recording tape, or is stored and displayed by computers.
The Richter scale is a standard scale used to compare earthquakes. It is a logarithmic scale, meaning that each whole-number step on the scale corresponds to a factor of 10. So, for example, an earthquake that measures 4.0 on the Richter scale produces seismic waves with 10 times the amplitude of one that measures 3.0, and releases roughly 32 times more energy. On the Richter scale, anything below 2.0 is generally not felt by people and is called a microquake. Microquakes occur constantly. Moderate earthquakes measure less than 6.0 or so on the Richter scale. Earthquakes measuring more than 6.0 can cause significant damage. The largest earthquake ever recorded, the 1960 Chile earthquake, measured about 9.5.
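The tenfold relationship described above can be sketched in a short Python snippet. This is an illustrative example, not part of any seismology library; the 1.5 exponent in the energy formula is the standard Gutenberg-Richter energy relation.

```python
def amplitude_ratio(m1, m2):
    """Ratio of measured wave amplitudes between two magnitudes.

    Each whole-number step on the Richter scale is a tenfold
    increase in amplitude, so the ratio is 10^(m2 - m1).
    """
    return 10 ** (m2 - m1)

def energy_ratio(m1, m2):
    """Approximate ratio of energy released.

    Radiated energy grows as about 10^(1.5 * magnitude), so one
    whole-number step releases roughly 32 times more energy.
    """
    return 10 ** (1.5 * (m2 - m1))

# Comparing a magnitude 4.0 quake to a magnitude 3.0 quake:
print(amplitude_ratio(3.0, 4.0))  # 10.0 -- ten times the amplitude
print(round(energy_ratio(3.0, 4.0), 1))  # 31.6 -- about 32 times the energy
```

The same functions show why large quakes dwarf moderate ones: a magnitude 8.0 event releases about a million times the energy of a magnitude 4.0 event.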
The Modified Mercalli Intensity Scale, which uses Roman numerals from I to XII to describe the effects an earthquake produces, is also commonly used.