Patient's Motion During Acquisition

We use the default scanner for the simulations in this tutorial:

sys = Scanner()
Scanner
  B0: Float64 1.5
  B1: Float64 1.0e-5
  Gmax: Float64 0.06
  Smax: Int64 500
  ADC_Δt: Float64 2.0e-6
  seq_Δt: Float64 1.0e-5
  GR_Δt: Float64 1.0e-5
  RF_Δt: Float64 1.0e-6
  RF_ring_down_T: Float64 2.0e-5
  RF_dead_time_T: Float64 0.0001
  ADC_dead_time_T: Float64 1.0e-5

It can also be interesting to see the effect of the patient's motion during an MRI scan. For this, Koma provides the ability to attach a motion model (motion <: MotionModel) to the phantom. In this tutorial, we will show how to add a SimpleMotion model to a 2D brain phantom.

First, let's load the 2D brain phantom used in the previous tutorials:

obj = brain_phantom2D()
(Output: a 2D brain Phantom with 6506 spins)

The SimpleMotion model includes a list of SimpleMotionType's, enabling simple motions to be mixed and matched. These are Translation, Rotation, HeartBeat, and their periodic versions PeriodicTranslation, PeriodicRotation, and PeriodicHeartBeat.
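
For instance, several motion types can in principle be combined in a single SimpleMotion. The following is only a sketch: the keyword names for Rotation and the multi-type SimpleMotion call are assumed by analogy with the Translation example in the next section, so check the KomaMRI documentation for your version before relying on them.

# Sketch (assumed API): combine a translation with a rotation.
# The Rotation keywords (t_start, t_end, pitch, roll, yaw) and the
# multi-argument SimpleMotion constructor are assumptions, not taken from this tutorial.
mixed_motion = SimpleMotion(
    Translation(t_start=0.0, t_end=200e-3, dx=1e-2, dy=0.0, dz=0.0),
    Rotation(t_start=0.0, t_end=200e-3, pitch=0.0, roll=0.0, yaw=15.0)
)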

Head Translation

In this example, we will add a Translation of 2 cm in x, with a duration of 200 ms (v = 0.1 m/s):

obj.motion = SimpleMotion(
    Translation(t_start=0.0, t_end=200e-3, dx=2e-2, dy=0.0, dz=0.0)
)
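
To inspect the moving phantom, we can plot one of its property maps. This is a minimal sketch assuming KomaMRIPlots' plot_phantom_map (re-exported by KomaMRI); additional keywords for animating the motion over time may be available depending on your version:

# Plot the T2 map of the phantom; with a motion model attached, the rendered
# page shows this as an interactive (animated) plot.
p1 = plot_phantom_map(obj, :T2)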
"../assets/5-phantom1.html"

If we simulate an EPI sequence with an acquisition duration (183.989 ms) comparable to the motion's duration (200 ms), we will observe motion-induced artifacts in the reconstructed image.
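
The sequence seq1 and the acquisition data acq1 used below are created in this tutorial's hidden setup code. As a rough sketch of such a pipeline, assuming PulseDesigner.EPI for the sequence (the FOV and matrix size are illustrative guesses) and MRIReco's direct reconstruction:

# Sketch only: the exact sequence and parameters of this tutorial may differ.
# Assumes `using KomaMRI` (hidden setup) and MRIReco for `reconstruction`.
FOV, N = 23e-2, 100                         # assumed field of view and matrix size
seq1 = PulseDesigner.EPI(FOV, N, sys)       # single-shot EPI sequence
raw1 = simulate(obj, seq1, sys)             # simulate the moving phantom
acq1 = AcquisitionData(raw1)                # convert the raw signal for MRIReco
acq1.traj[1].circular = false               # do not apply a circular k-space mask
Nx, Ny = raw1.params["reconSize"][1:2]
reconParams = Dict{Symbol,Any}(:reco => "direct", :reconSize => (Nx, Ny))
image1 = reconstruction(acq1, reconParams)  # motion-corrupted reconstruction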

"../assets/5-recon1.html"

The severity of the artifacts can vary depending on the acquisition duration and $k$-space trajectory.

Motion-Corrected Reconstruction

To correct for the motion-induced artifacts, we can perform a motion-corrected reconstruction. This can be achieved by multiplying each sample of the acquired signal $S(t)$ by a phase shift $\Delta\phi_{\mathrm{corr}}$ proportional to the displacement $\boldsymbol{u}(t)$ [Godenschweger, 2016]:

\[S_{\mathrm{MC}}\left(t\right)=S\left(t\right)\cdot\mathrm{e}^{\mathrm{i}\Delta\phi_{\mathrm{corr}}}=S\left(t\right)\cdot\mathrm{e}^{\mathrm{i}2\pi\boldsymbol{k}\left(t\right)\cdot\boldsymbol{u}\left(t\right)}\]

In practice, we would need to estimate or measure the motion before performing a motion-corrected reconstruction, but for this example, we will directly use the displacement functions $\boldsymbol{u}(\boldsymbol{x}, t)$ defined by obj.motion::SimpleMotion. Since translations are rigid motions ($\boldsymbol{u}(\boldsymbol{x}, t)=\boldsymbol{u}(t)$, i.e., there is no position dependence), we can obtain the required displacements by calculating $\boldsymbol{u}(\boldsymbol{x}=\boldsymbol{0},\ t=t_{\mathrm{adc}})$.

sample_times = get_adc_sampling_times(seq1)  # ADC sampling times of the sequence
displacements = hcat(get_spin_coords(obj.motion, [0.0], [0.0], [0.0], sample_times)...)  # u(x=0, t) at those times
"../assets/5-displacements.html"

We can now get the necessary phase shift for each sample:

_, kspace = get_kspace(seq1)                  # k-space trajectory at the ADC samples
ΔΦ = 2π*sum(kspace .* displacements, dims=2)  # ΔΦ = 2π k(t)⋅u(t), one value per sample
10000×1 Matrix{Float64}:
  -2.2169475934261254
  -2.1734857088856887
  -2.1299691879523697
  -2.0863980306261674
  -2.042772236907082
  -1.9990918067951147
  -1.9553567402902488
  -1.9115670373925155
  -1.8677226981018997
  -1.8238237224184013
   ⋮
 -20.817605789660078
 -21.320395829342214
 -21.823240505415836
 -22.32613981788374
 -22.829093766744524
 -23.33210235199679
 -23.835165573643348
 -24.338283431681386
 -24.841455926113696

And apply it to the acquired signal to correct its phase:

acq1.kdata[1] .*= exp.(im*ΔΦ)
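
With the corrected k-space data in place, the image can be reconstructed again. Assuming the same reconParams as in the sketch above:

# Re-run the same direct reconstruction on the phase-corrected data
image2 = reconstruction(acq1, reconParams)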
"../assets/5-recon2.html"

Finally, we compare the original image and the motion-corrected reconstruction:
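
In the rendered page this comparison is an interactive toggle. As a static alternative, and assuming the image1 and image2 variables from the sketches above, the two magnitude images could be plotted with KomaMRIPlots' plot_image:

# Side-by-side comparison of the magnitude images (sketch)
p_corrupted = plot_image(abs.(image1[:, :, 1]); title="Motion-corrupted")
p_corrected = plot_image(abs.(image2[:, :, 1]); title="Motion-corrected")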


This page was generated using Literate.jl.