The Physics of Sound
Sound is a vibration that propagates as an acoustic wave through a medium by means of compression and rarefaction of particles in the medium.
Speed of Sound
The wavelength, frequency, and speed of a sonic wave are governed by the relationship c = fλ, where:
- c is the speed of sound in the medium defined as the distance traveled by the sonic wave in a medium per unit time
- λ is the wavelength which is defined as the distance traveled during one full oscillatory cycle
- f is the frequency of oscillation which is defined as the number of cycles completed each second
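As a quick numeric sketch of this relationship (the function name and example values are illustrative, not from the source):

```python
# Wavelength from the relation c = f * λ, rearranged to λ = c / f
def wavelength(c_m_per_s: float, f_hz: float) -> float:
    """Return wavelength in metres for wave speed c and frequency f."""
    return c_m_per_s / f_hz

# Example: a 5 MHz beam in soft tissue (c = 1540 m/s)
lam = wavelength(1540.0, 5e6)
print(f"{lam * 1000:.3f} mm")  # ≈ 0.308 mm
```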
Importantly, the speed of sound varies widely depending on the properties of the medium. The speed of sound in a given medium is determined by the bulk modulus, B, a measure of the material's resistance to compression, and by the density of the medium, ρ: c = √(B/ρ), where:
- B is the bulk modulus of the medium
- ρ is the density of the medium
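A minimal sketch of the same relation, assuming common textbook values for water (B ≈ 2.2 GPa, ρ ≈ 1000 kg/m³):

```python
import math

# Speed of sound from bulk modulus and density: c = sqrt(B / ρ)
def speed_of_sound(bulk_modulus_pa: float, density_kg_m3: float) -> float:
    return math.sqrt(bulk_modulus_pa / density_kg_m3)

# Water: B ≈ 2.2 GPa, ρ ≈ 1000 kg/m³ gives c ≈ 1483 m/s
print(round(speed_of_sound(2.2e9, 1000.0)))
```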
(Table: density, speed of sound, and acoustic impedance of common materials.)
Key Point: Speed of sound is greatest in materials that are stiff (do not compress easily) and have a low density.
Acoustic Pressure and Intensity
Acoustic rarefaction and compression cause changes in the local pressure of the medium. The pressure amplitude is defined as the difference between the maximum or minimum pressure of the wave and the average pressure of the medium in absence of the wave.
The SI unit of pressure is the Pascal (Pa), where 1 Pa = 1 N/m² = 1 kg/(m·s²). In ultrasound, the pressure amplitude is typically around 1 MPa, which is approximately 10 times atmospheric pressure.
Acoustic intensity, perceived as loudness, is a measure of power per unit area and, in the ultrasound setting, is usually expressed in units of mW/cm². Intensity is proportional to the square of pressure (I ∝ P²).
Key Point: In ultrasound applications, the compression pressure amplitude is significantly greater than the rarefaction pressure amplitude.
Relative Intensity (Decibel (dB) Scale)
The decibel (dB) is a measure of relative intensity on a base-10 logarithmic scale: dB = 10 log₁₀(I/I₀), where I₀ is the reference intensity.
Key Point: Human perception of loudness is approximately logarithmic, so the logarithmic decibel scale is roughly proportional to perceived loudness.
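The decibel definition can be sketched as follows (the function name is illustrative):

```python
import math

# Relative intensity in decibels: dB = 10 * log10(I / I0)
def relative_intensity_db(i: float, i0: float) -> float:
    return 10.0 * math.log10(i / i0)

print(relative_intensity_db(100.0, 1.0))  # 100x the reference intensity -> 20 dB
print(relative_intensity_db(0.5, 1.0))    # half the reference intensity -> ≈ -3 dB
```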
Acoustic Wave Interactions
Modern ultrasound equipment uses multiple sound emitters that create sound beams independently. These sound waves will interact with each other based on their phase (position in the periodic waveform), amplitude, and frequency.
As ultrasound propagates through a medium, it interacts with it. These interactions include reflection, refraction, and attenuation. The type of interaction is governed by differences in the acoustic impedance of different types of media.
Acoustic impedance (Z) is analogous to the stiffness of a spring and is defined as the product of the material density (ρ) and the speed of sound in the material (c): Z = ρc. Acoustic impedance has the SI unit of the Rayl, where 1 Rayl = 1 kg/(m²·s).
Acoustic impedance is a useful quantity because differences in acoustic impedance govern the flow of acoustic energy between connected media. When two connected materials have similar acoustic impedance, energy will flow from one material to another without much loss at the transition. However, when the materials have very different acoustic impedance values (e.g. soft tissue/lung interface) a great deal of acoustic energy will be reflected at the interface. This reflected acoustic energy is used to produce images in ultrasound imaging.
Reflection is the redirection of acoustic energy propagation which occurs at the interface of materials with different acoustic impedance. The angle of reflection, measured relative to the normal of the interface, is equal to the angle of incidence.
The fraction of acoustic energy reflected depends on the magnitude of the difference in acoustic impedance. For normal incidence, the intensity reflection coefficient is R = ((Z2 − Z1)/(Z2 + Z1))².
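The normal-incidence reflection coefficient can be sketched numerically (the impedance values below are typical textbook figures, assumed for illustration):

```python
# Fraction of incident intensity reflected at a normal-incidence interface:
# R = ((Z2 - Z1) / (Z2 + Z1))^2
def reflection_fraction(z1: float, z2: float) -> float:
    return ((z2 - z1) / (z2 + z1)) ** 2

# Illustrative impedances in MRayl (approximate textbook values)
print(reflection_fraction(1.63, 1.70))    # similar tissues: tiny reflection
print(reflection_fraction(1.63, 0.0004))  # soft tissue/air: nearly total reflection
```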
Refraction is the change in direction of the transmitted portion of an acoustic wave incident upon an interface. The angle of the transmitted energy (θt) depends on both the speed of sound in each material (c1, c2) and the angle of incidence (θi): sinθt / sinθi = c2 / c1.
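A small sketch of this refraction relationship (Snell's law for sound); the media speeds below are assumed example values:

```python
import math

# Refraction at an interface (Snell's law for sound): sin(θt)/sin(θi) = c2/c1
def transmitted_angle_deg(theta_i_deg: float, c1: float, c2: float) -> float:
    s = math.sin(math.radians(theta_i_deg)) * c2 / c1
    if abs(s) > 1.0:
        raise ValueError("total internal reflection: no transmitted wave")
    return math.degrees(math.asin(s))

# 20° incidence, soft tissue (1540 m/s) into a slower medium (1450 m/s, assumed):
# the transmitted beam bends toward the normal (angle < 20°)
print(round(transmitted_angle_deg(20.0, 1540.0, 1450.0), 1))
```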
Attenuation is a loss of acoustic wave intensity due to interactions between the acoustic wave and the medium. Attenuation is measured by a change in decibel levels and is caused by the scattering and absorption of acoustic energy.
Scattering occurs either because of small non-uniformities within a medium, or because a reflective interface has a non-specular (rough) surface.
Absorption and Attenuation
Some acoustic energy is lost via absorption, in which the material converts acoustic energy into heat.
Key Point: Attenuation rate in soft tissue = 0.5 dB/cm/MHz.
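The rule of thumb above can be applied numerically (the function name and example values are illustrative):

```python
# Attenuation using the soft-tissue rule of thumb: 0.5 dB per cm per MHz
def attenuation_db(path_cm: float, freq_mhz: float, rate: float = 0.5) -> float:
    return rate * path_cm * freq_mhz

# 5 MHz beam, reflector at 5 cm depth -> 10 cm round-trip path -> 25 dB loss
print(attenuation_db(path_cm=10.0, freq_mhz=5.0))
```

Note the round trip: echoes are attenuated on the way in and on the way back, so the path length is twice the reflector depth.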
Doppler Frequency Shift
A Doppler frequency shift is the change in frequency that occurs when an ultrasound wave travels through, and is reflected by, objects moving axially relative to the transducer. A Doppler shift in ultrasound is the same phenomenon that makes the sound of a car appear to change from a higher to a lower pitch as the car approaches, passes, and moves away from an observer. In the case of ultrasound, it is usually the medium (blood) that is moving while the source and receiver (the transducer) are stationary.
The observed frequency can be found from the general Doppler relation fr = fs(c + vr)/(c + vs), where c is the speed of sound in the medium and:
- fr is the frequency at the receiver
- fs is the frequency at the source
- vr is the velocity of the medium relative to the receiver along the direction of wave propagation
- vs is the velocity of the medium relative to the source along the direction of wave propagation
Key Point: Doppler shifts only occur for motion along the direction of the wave propagation.
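As a minimal illustration of the one-way case (a receiver moving through the medium toward a stationary source), with assumed names and values:

```python
# One-way Doppler shift: a receiver moving toward a stationary source
# observes f_r = f_s * (c + v_receiver) / c
def received_frequency(f_source_hz: float, v_receiver: float,
                       c: float = 1540.0) -> float:
    return f_source_hz * (c + v_receiver) / c

# A reflector moving toward the transducer at 0.5 m/s "hears" a slightly
# higher frequency; motion purely perpendicular to the beam gives no shift.
print(received_frequency(5e6, 0.5))
```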
Ultrasound System Design and Operation
- dB = 10log(I/I0)
- Speed of sound in soft tissue = 1540 m/s
- Speed of ultrasound in bone = 4080 m/s
- Attenuation rate in tissue = 0.5 dB/cm/MHz
- Higher frequency => higher attenuation
- Typical ultrasound frequencies are around 1-20 MHz
- Typical ultrasound pressure is about 1 MPa (about 10 times the atmospheric pressure)
Ultrasound imaging uses high frequency sound waves to produce images of anatomical structures. Image formation is a four-step process:
- The transducer emits an ultrasound beam which travels through the body.
- The ultrasound beam encounters an interface between tissues of different acoustic impedance, which causes some of the ultrasound energy to be reflected back to the transducer.
- The transducer detects the reflected ultrasound energy.
- A computer uses the time between beam emission and reflection detection to determine the distance to the reflecting object. It produces an image based on the distance to reflection and the intensity of the reflection.
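Step 4 relies on pulse-echo ranging, which can be sketched as follows (the function name and timing value are illustrative):

```python
# Pulse-echo ranging: depth = c * t / 2 (the factor of 2 accounts for the
# round trip of the pulse to the reflector and back)
def echo_depth_cm(echo_time_s: float, c: float = 1540.0) -> float:
    return 100.0 * c * echo_time_s / 2.0

# An echo returning 65 µs after emission comes from about 5 cm deep
print(round(echo_depth_cm(65e-6), 2))
```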
Ultrasound transducers are responsible for the production and detection of ultrasound waves. There are a variety of transducer designs optimized for a variety of imaging tasks. All designs include the basic components of a piezoelectric element, a damping block and a matching layer.
The piezoelectric element (often referred to as the element or crystal) is responsible for both generating and detecting ultrasound waves. When an electrical signal is applied to the piezoelectric material, the crystal mechanically deforms. This deformation is used to create the ultrasound wave. Conversely when a sonic wave returns to the piezoelectric material, it mechanically deforms the crystal again and generates a detectable electronic signal.
Lead-zirconate-titanate (PZT) is the most common piezoelectric material used in ultrasound transducers.
Transducer Resonance and Thickness
The effectiveness of a piezoelectric material at producing and detecting ultrasound waves is greatly enhanced when the material thickness equals ½ of the wavelength. At this thickness, the material resonates at its natural frequency, which amplifies the signal. Resonant medical transducers are designed to resonate at a target ultrasound frequency.
Ring down is the vibration in the piezoelectric element which continues after the electronic signal has ended. A long ring down time leads to lower bandwidth (a smaller range of produced frequencies), but also increases the spatial pulse length (SPL), which negatively impacts axial resolution.
The damping block absorbs the backward-directed ultrasound energy and reduces ring down time. As previously noted, a reduced ring down time shortens the spatial pulse length (SPL), improving axial resolution, at the cost of increased bandwidth.
The Q-factor is the ratio of the center frequency (f0) to the bandwidth.
A high Q-factor transducer indicates a narrow bandwidth and a long SPL. “High Q” transducers are commonly used in Doppler ultrasound applications, where a narrow bandwidth is needed to accurately quantify flow rate.
A low Q-factor transducer indicates a broad bandwidth and a short SPL. Most imaging applications use a “low Q” transducer, which has the short SPL needed for high spatial resolution along the beam axis.
Key Point: A “Low Q” transducer does not indicate poor imaging quality! In fact, many imaging applications use “low Q” transducers to optimize spatial resolution along the beam axis.
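The Q-factor definition can be sketched numerically (the bandwidth values below are assumed for illustration):

```python
# Q-factor: ratio of centre frequency to bandwidth, Q = f0 / BW
def q_factor(f0_mhz: float, bandwidth_mhz: float) -> float:
    return f0_mhz / bandwidth_mhz

print(q_factor(5.0, 0.5))  # narrow bandwidth -> "high Q" (Doppler-style)
print(q_factor(5.0, 4.0))  # broad bandwidth -> "low Q" (imaging-style)
```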
The matching layer provides the interface between the piezoelectric element and the surface of the patient. Matching layers have an acoustic impedance that is between the acoustic impedance of the element and that of the patient’s surface. This bridging of the impedance gap improves the efficiency of acoustic energy transmission from the transducer to the patient and back again by reducing the amount of energy reflected at the interface. Matching layers typically have a thickness of ¼ the operating wavelength of the transducer.
Resonant and Multi-Frequency Transducers
Resonant frequency transducers are very efficient at producing and detecting a narrow bandwidth of ultrasound frequencies. Resonant frequency transducers use a piezoelectric element with a thickness of ½ the desired ultrasound wavelength.
Multi-frequency transducers are designed to efficiently produce and detect a large bandwidth (range of frequencies). They consist of a piezoelectric material which is manufactured in such a way as to produce small rods on one surface. This surface is then coated with epoxy to create a flat face. An additional benefit of multi-frequency elements is that their acoustic properties are closer to that of soft tissue than standard PZT (lead-zirconate-titanate) elements, which improves acoustic transmission.
Most ultrasound systems use transducers consisting of 128 to 512 rectangular elements organized in an array. These arrays allow imaging over a large area while maintaining high lateral spatial resolution. There are two common types of arrays which are defined by their mode of operation; linear and phased.
In linear arrays, only a small group of elements are used to produce ultrasound waves, but the whole array is used to detect returning echoes. By varying the number of elements producing sound, the user can change the width of the ultrasound beam. Linear arrays may be either flat or curved (also referred to as “curvilinear”.)
Phased arrays fire all elements of the transducer in rapid succession which allows these arrays to be steered electronically. Electronic beam steering allows a transducer to scan over an area without the user having to physically move the transducer.
The Ultrasound Beam
After leaving the transducer, the ultrasound beam first converges then diverges. The converging portion of the ultrasound beam is known as the near field, while the diverging portion is called the far field.
The near field, also called the Fresnel zone, converges because of wave interference at the periphery of the beam. This interference also causes complex variations in acoustic pressure amplitude within this region. The length of the near field depends on the radius of the transducer, r, and the ultrasound wavelength, λ: near field length = r²/λ.
The far field, also called the Fraunhofer zone, is the diverging region of the acoustic beam. In the far field, the acoustic pressure intensity decreases continually. The angle of beam divergence, θ, is again determined by the radius of the transducer and the ultrasound wavelength: sinθ = 0.61λ/r.
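These standard textbook relations for an unfocused circular element can be sketched as follows (the element radius and frequency are assumed example values):

```python
import math

# For an unfocused circular element of radius r at wavelength λ:
#   near-field length:  NFL = r^2 / λ
#   far-field divergence: sin(θ) = 0.61 * λ / r
def near_field_length(radius_m: float, wavelength_m: float) -> float:
    return radius_m ** 2 / wavelength_m

def divergence_angle_deg(radius_m: float, wavelength_m: float) -> float:
    return math.degrees(math.asin(0.61 * wavelength_m / radius_m))

# 5 MHz in soft tissue: λ = 1540 / 5e6 ≈ 0.308 mm; assume a 5 mm element radius
lam = 1540.0 / 5e6
print(round(near_field_length(5e-3, lam) * 100, 1), "cm")   # ≈ 8.1 cm
print(round(divergence_angle_deg(5e-3, lam), 1), "deg")     # ≈ 2.2 deg
```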
Side and Grating Lobes
Side lobes and grating lobes are unwanted ultrasound energy emitted laterally to the main beam. Side lobes are caused by the radial expansion of the transducer and grating lobes are a consequence of a non-continuous transducer surface composed of discrete elements. The misdirected energy of side and grating lobes can cause false echoes to be recorded, resulting in imaging artifacts.
Ultrasound Imaging Modes
Echo Display Modes
A-mode displays echo amplitude as a function of distance. It was historically used to determine the distance to objects of different intensities, such as brain tumors. Today, A-mode has limited use because it does not produce a readily understood image. It is primarily used in ophthalmology distance measurements.
M-mode uses the grey scale information produced in B-mode to produce a graph of object motion over time. M-mode is commonly used in cardiac imaging to assess the function of the myocardium and valve leaflets. M-mode imaging in cardiology is often combined with Doppler imaging to assess blood flow within the heart.
Doppler display mode images fluid motion using the Doppler shift induced by moving reflectors, such as blood cells. The velocity of flow can be determined as follows: vflow = (fd × c) / (2 × fi × cosθ), where c is the speed of sound and:
- vflow is the fluid flow velocity
- fi is the emitted frequency
- fd is the Doppler shifted frequency
- θ is the angle between the direction of wave propagation and fluid flow
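The flow-velocity relation can be sketched as follows (the example shift and insonation angle are assumed values):

```python
import math

# Doppler flow velocity: v = (f_d * c) / (2 * f_i * cos θ)
# (factor of 2: the wave is Doppler-shifted on both transmit and receive)
def flow_velocity(f_doppler_hz: float, f_emitted_hz: float,
                  angle_deg: float, c: float = 1540.0) -> float:
    return f_doppler_hz * c / (2.0 * f_emitted_hz
                               * math.cos(math.radians(angle_deg)))

# 5 MHz beam, 1.62 kHz Doppler shift, 60° insonation angle -> ≈ 0.5 m/s
print(round(flow_velocity(1.62e3, 5e6, 60.0), 3))
```

Note that as θ approaches 90°, cosθ approaches zero and the measured shift vanishes, which is why Doppler only detects motion along the beam axis.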
Ultrasound Imaging Quality
Image Quality Phantom
A phantom is a device used to assess the quality of ultrasound images. Most ultrasound phantoms consist of a water-like substance with high- and low-density objects embedded at known locations.
Axial resolution refers to the ability to distinguish closely spaced objects along the direction of beam propagation. The minimum object spacing that can be resolved is ½ of the spatial pulse length (SPL). SPL is determined by the number of cycles per pulse, the wavelength, and the ring down time.
Typical axial resolution ≈ 0.5 mm for a 5 MHz beam.
Highest axial resolution is achieved with:
- Low Q-factor (heavy damping)
- Short wavelength/high frequency
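Putting the axial-resolution relation into a short sketch (the cycle count and frequency are assumed example values):

```python
# Axial resolution ≈ SPL / 2, with SPL = cycles per pulse × wavelength
def axial_resolution_mm(cycles_per_pulse: int, freq_mhz: float,
                        c: float = 1540.0) -> float:
    wavelength_mm = c / (freq_mhz * 1e6) * 1000.0
    return cycles_per_pulse * wavelength_mm / 2.0

# A 3-cycle pulse at 5 MHz -> ≈ 0.46 mm, close to the quoted 0.5 mm figure
print(round(axial_resolution_mm(3, 5.0), 2))
```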
Lateral spatial resolution, sometimes called azimuthal resolution, is the ability to discern two closely spaced objects oriented perpendicular to the beam. As the beam converges and then diverges, lateral spatial resolution is dependent upon the distance to the object. Lateral spatial resolution of linear and curvilinear arrays can be varied by changing the number of elements firing as a group.
Highest lateral resolution is achieved with:
- Objects located at the end of the near field
- Small groups of elements firing
Slice thickness, sometimes called elevational resolution, is the width of the imaging plane. Slice thickness depends on the height of the transducer element and is usually the lowest resolution axis. This low spatial resolution leads to partial volume averaging, in which the intensity of a pixel is the average of all reflection intensities within a slice.
Some arrays, called multi-linear arrays, have multiple rows of elements which allow the transducer to adjust slice thickness by adjusting the number of rows firing.
Contrast resolution refers to the ability to discern objects of differing intensities. This is dependent upon the strength of reflection (returning signal) and the image noise. The concepts of return signal and noise are often combined into a single value called the signal-to-noise ratio (SNR).
Contrast resolution is highest for:
- Variables which maximize the strength of an acoustic echo:
- Objects with very different acoustic impedance to their surroundings
- Variables which reduce partial volume averaging:
- Higher spatial resolution
- Higher temporal resolution - minimizing impact of motion
- Variables which maximize signal-to-noise ratio:
- Use of higher ultrasound power requiring less electronic signal amplification
- Superficial objects subject to less attenuation and returning scatter
Key point: The Rose criterion states that an SNR of 5 or more is needed to identify image features with certainty. An SNR of less than 5 indicates that low-contrast image features will be difficult to identify with certainty.
Temporal Resolution (Frame rate)
The temporal resolution, also referred to as frame rate, of a 2D image is set by the time required to acquire all A-lines used in the B-mode image: frame rate = c / (2D × N(A-lines)), where:
- c is the speed of sound in tissue
- N(A-lines) are the number of A-lines required to produce a B-mode 2D image
- D is the depth of scan; 2D is used because the ultrasound wave must travel to the depth of scan and back to the transducer
Highest temporal resolution is achieved with:
- A low number of A-lines
- A low depth of scan
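The frame-rate relation above can be sketched as follows (the depth and line count are assumed example values):

```python
# Maximum frame rate: FR = c / (2 * D * N), where D is the scan depth and N
# the number of A-lines per frame (2D accounts for the round trip per line)
def max_frame_rate(depth_m: float, n_lines: int, c: float = 1540.0) -> float:
    return c / (2.0 * depth_m * n_lines)

# 15 cm depth, 128 A-lines -> ≈ 40 frames per second
print(round(max_frame_rate(0.15, 128)))
```

Halving the depth or the number of A-lines doubles the achievable frame rate, which is the trade-off behind the two bullets above.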
Common Ultrasound Artifacts
Refraction
Appearance: The image displays an object at an incorrect location.
Cause: Refraction, due to differences in the speed of sound at tissue interfaces, displaces the returning echoes.
Side-lobe energy emission
Appearance: Often manifests itself as an apparent signal which disappears when the transducer orientation is rotated.
Cause: Anatomy outside of the beam is mapped into the main beam.