XdVAR

This module defines methods for classical variational data assimilation, such as 3D-VAR and 4D-VAR. Primal cost functions are defined, and their derivatives are computed with automatic differentiation using JuliaDiff methods. Development of gradient-based optimization schemes using automatic differentiation is ongoing, with future work planned to integrate variational benchmark experiments.

The basic 3D-VAR cost function API is defined as follows:

    D3_var_cost(x::VecA(T), obs::VecA(T), x_bkg::VecA(T), state_cov::CovM(T),
        H_obs::Function, obs_cov::CovM(T), kwargs::StepKwargs) where T <: Real

where the control variable x is optimized, while the remaining hyperparameters are held fixed in a wrapping function that is passed to automatic differentiation.

Methods

DataAssimilationBenchmarks.XdVAR.D3_var_NewtonOpMethod
D3_var_NewtonOp(x_bkg::VecA(T), obs::VecA(T), state_cov::CovM(T), H_obs::Function,
    obs_cov::CovM(T), kwargs::StepKwargs) where T <: Float64

Computes a local minimum of the three-dimensional variational cost function with a static background covariance, using a simple Newton optimization method.

x_bkg is the initial state proposal vector, obs is the observation vector, state_cov is the background error covariance matrix, H_obs is a model mapping operator for observations, and obs_cov is the observation error covariance matrix. kwargs refers to any additional arguments needed for the computation.

return x
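For the linear-Gaussian case, the Newton update above can be sketched as follows. This is an illustrative NumPy stand-in for the idea, not the package's Julia API; the names d3_var_newton and H_jac (the Jacobian of a linear observation operator) are hypothetical:

```python
import numpy as np

def d3_var_newton(x_bkg, obs, state_cov, H_jac, obs_cov, n_iter=10):
    """Newton iteration x <- x - Hess^{-1} grad on the 3D-VAR cost,
    assuming a linear observation operator given by its Jacobian H_jac."""
    x = x_bkg.astype(float).copy()
    B_inv = np.linalg.inv(state_cov)
    R_inv = np.linalg.inv(obs_cov)
    # for a linear operator the Hessian is constant in x
    hess = B_inv + H_jac.T @ R_inv @ H_jac
    for _ in range(n_iter):
        grad = B_inv @ (x - x_bkg) - H_jac.T @ R_inv @ (obs - H_jac @ x)
        x = x - np.linalg.solve(hess, grad)
    return x
```

Because the cost is quadratic when H is linear, a single Newton step already lands on the classical 3D-VAR analysis x_bkg + K(obs - H x_bkg) with gain K = B Hᵀ(H B Hᵀ + R)⁻¹; the iteration loop only matters for nonlinear operators.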
DataAssimilationBenchmarks.XdVAR.D3_var_costMethod
D3_var_cost(x::VecA(T), obs::VecA(T), x_bkg::VecA(T), state_cov::CovM(T),
    H_obs::Function, obs_cov::CovM(T), kwargs::StepKwargs) where T <: Real

Computes the cost of the three-dimensional variational analysis increment from an initial state proposal with a static background covariance.

x is a free argument used to evaluate the cost of the given state proposal versus other proposal states, obs is the observation vector, x_bkg is the initial state proposal vector, state_cov is the background error covariance matrix, H_obs is a model mapping operator for observations, and obs_cov is the observation error covariance matrix. kwargs refers to any additional arguments needed for the computation.

return 0.5*back_component + 0.5*obs_component
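The two penalty terms in the return value can be sketched directly from their definitions. This is a minimal NumPy illustration of the same mathematics, not the package API; d3_var_cost here takes a generic observation operator H as a callable:

```python
import numpy as np

def d3_var_cost(x, obs, x_bkg, state_cov, H, obs_cov):
    """3D-VAR cost: background misfit plus observation misfit, each
    weighted by the inverse of its error covariance."""
    d_bkg = x - x_bkg
    d_obs = obs - H(x)
    # solve(C, d) computes C^{-1} d without forming the explicit inverse
    back_component = d_bkg @ np.linalg.solve(state_cov, d_bkg)
    obs_component = d_obs @ np.linalg.solve(obs_cov, d_obs)
    return 0.5 * back_component + 0.5 * obs_component
```

When x equals x_bkg and H(x_bkg) matches the observations exactly, both components vanish and the cost is zero; any other proposal is penalized according to the covariance weights.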
DataAssimilationBenchmarks.XdVAR.D3_var_gradMethod
D3_var_grad(x::VecA(T), obs::VecA(T), x_bkg::VecA(T), state_cov::CovM(T),
    H_obs::Function, obs_cov::CovM(T), kwargs::StepKwargs) where T <: Float64

Computes the gradient of the three-dimensional variational analysis increment from an initial state proposal with a static background covariance, using a wrapper function for automatic differentiation.

x is a free argument used to evaluate the cost of the given state proposal versus other proposal states, obs is the observation vector, x_bkg is the initial state proposal vector, state_cov is the background error covariance matrix, H_obs is a model mapping operator for observations, and obs_cov is the observation error covariance matrix. kwargs refers to any additional arguments needed for the computation.

wrap_cost is a function that allows differentiation with respect to the free argument x while treating all other hyperparameters of the cost function as constant.

return ForwardDiff.gradient(wrap_cost, x)
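For a linear observation operator, the gradient that automatic differentiation returns has the closed form B⁻¹(x − x_bkg) − Hᵀ R⁻¹(obs − H x). The following NumPy sketch (illustrative, not the package API) writes out that expression and checks it against a central finite difference of the cost, playing the role that ForwardDiff plays in the Julia code:

```python
import numpy as np

def d3_var_cost(x, obs, x_bkg, B, H_jac, R):
    """Quadratic 3D-VAR cost for a linear observation operator H_jac."""
    d_b = x - x_bkg
    d_o = obs - H_jac @ x
    return 0.5 * d_b @ np.linalg.solve(B, d_b) + 0.5 * d_o @ np.linalg.solve(R, d_o)

def d3_var_grad(x, obs, x_bkg, B, H_jac, R):
    """Analytic gradient: B^{-1}(x - x_bkg) - H^T R^{-1}(obs - H x)."""
    return (np.linalg.solve(B, x - x_bkg)
            - H_jac.T @ np.linalg.solve(R, obs - H_jac @ x))
```

Fixing all arguments but x in a closure mirrors the wrap_cost pattern described above: the differentiation target sees only the free variable, while the hyperparameters are captured as constants.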
DataAssimilationBenchmarks.XdVAR.D3_var_hessianMethod
D3_var_hessian(x::VecA(T), obs::VecA(T), x_bkg::VecA(T), state_cov::CovM(T),
    H_obs::Function, obs_cov::CovM(T), kwargs::StepKwargs) where T <: Float64

Computes the Hessian of the three-dimensional variational analysis increment from an initial state proposal with a static background covariance, using a wrapper function for automatic differentiation.

x is a free argument used to evaluate the cost of the given state proposal versus other proposal states, obs is the observation vector, x_bkg is the initial state proposal vector, state_cov is the background error covariance matrix, H_obs is a model mapping operator for observations, and obs_cov is the observation error covariance matrix. kwargs refers to any additional arguments needed for the computation.

wrap_cost is a function that allows differentiation with respect to the free argument x while treating all other hyperparameters of the cost function as constant.

return ForwardDiff.hessian(wrap_cost, x)
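For a linear observation operator the Hessian of the quadratic cost is constant in x: B⁻¹ + Hᵀ R⁻¹ H, a symmetric positive definite matrix, which is why the Newton method above is well posed. A minimal NumPy sketch of that closed form (illustrative only, not the package API):

```python
import numpy as np

def d3_var_hessian(state_cov, H_jac, obs_cov):
    """Hessian of the 3D-VAR cost for a linear observation operator:
    B^{-1} + H^T R^{-1} H, symmetric positive definite, constant in x."""
    return np.linalg.inv(state_cov) + H_jac.T @ np.linalg.solve(obs_cov, H_jac)
```

Positive definiteness guarantees a unique minimizer of the cost; in the Julia code the same matrix is produced numerically by ForwardDiff.hessian applied to wrap_cost, which also handles nonlinear operators where the Hessian varies with x.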