Purpose: To develop a fluoroscopy-based imaging approach that determines the magnitude and phase variation of respiratory motion relative to the treatment-planning images, in order to quantify the online patient-setup deviation in thoracic cancer IGRT for real-time adaptive patient-position adjustment.

Methods: A numerical phantom was generated to test the robustness of the approach. The 2D phantom consists of a static outer wall and an inner target moving in a temporal pattern similar to respiratory motion. Four motion parameters were varied to generate phantom image sets with different motions: frequency (0.5 Hz to 2 Hz), amplitude (40 to 60 mm), position offset (5 to 10 mm), and phase shift (starting phase of 0, 45, and 90 degrees). White noise at a level of SNR = 5 was added to all phantom images to simulate the impact of clinical image quality. A manifold-based machine learning technique was used to construct the respiratory motion model under the standard condition (frequency 1 Hz, amplitude 50 mm, and no position offset or phase shift). The phase and position shift in the other phantom image series were then quantified by finding the maximum a posteriori (MAP) solution that fits the embedded motion to the standard motion model.

Results: The proposed approach can detect variations in motion patterns between two image sets. The method is insensitive to frequency changes and to image noise up to SNR = 5, but is very effective at capturing and quantifying changes in motion amplitude, position shift, and phase shift.

Conclusions: This work proposes an effective mathematical approach to quantify the difference in motion between the pre-treatment images and the treatment-planning images by extracting the motion model from fluoroscopy images using a machine learning technique. By applying the approach during online patient setup, the positioning deviation can be separated from the respiratory motion and adjusted to minimize normal-tissue toxicity in gated IGRT.
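As an illustrative sketch only (not the authors' implementation), the phantom generation and manifold embedding described in the Methods might look like the following in Python. The sinusoidal target trajectory, the 64x64 pixel grid, the geometry of the wall and target, and the choice of scikit-learn's Isomap are all assumptions: the abstract does not name the specific manifold technique or the phantom dimensions.

```python
import numpy as np
from sklearn.manifold import Isomap

def make_phantom_frames(n_frames=60, size=64, freq_hz=1.0, amp_px=10,
                        offset_px=0, phase_deg=0.0, fps=30, snr=5.0, seed=0):
    """Toy 2D phantom: static outer wall ring plus a target moving sinusoidally.

    amp_px/offset_px are in pixels here; the abstract's mm values would map to
    pixels through an assumed detector geometry.
    """
    rng = np.random.default_rng(seed)
    yy, xx = np.mgrid[0:size, 0:size]
    cx = size / 2
    t = np.arange(n_frames) / fps
    # Respiratory-like vertical trajectory of the inner target center.
    cy = size / 2 + offset_px + amp_px * np.sin(
        2 * np.pi * freq_hz * t + np.deg2rad(phase_deg))
    # Static outer wall: a thin annulus around the field of view.
    wall = ((np.hypot(xx - cx, yy - size / 2) > size * 0.45) &
            (np.hypot(xx - cx, yy - size / 2) < size * 0.48)).astype(float)
    frames = np.empty((n_frames, size, size))
    for i in range(n_frames):
        target = (np.hypot(xx - cx, yy - cy[i]) < size * 0.1).astype(float)
        img = wall + target
        # Additive white noise scaled to give the stated SNR = 5.
        frames[i] = img + rng.normal(0.0, img.std() / snr, img.shape)
    return frames

# Build the "standard condition" series and embed it on a 1D manifold,
# recovering a respiratory-phase-like coordinate from raw pixel data.
frames = make_phantom_frames()
X = frames.reshape(len(frames), -1)          # each frame as a feature vector
embedding = Isomap(n_neighbors=8, n_components=1).fit_transform(X)
print(frames.shape, embedding.shape)
```

A test series with a different amplitude or phase would be embedded the same way, and its trajectory compared against this standard-condition model; the MAP fitting step itself is omitted here since the abstract does not specify the prior or likelihood used.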