This function is a wrapper around glmnet::glmnet() that uses a column from x as an offset.
Usage

glmnet_offset(
  x,
  y,
  family,
  offset_col = "offset",
  weights = NULL,
  lambda = NULL,
  alpha = 1
)
Arguments

- x: Input matrix.
- y: Response variable.
- family: A function or character string describing the link function and error distribution.
- offset_col: Character string. The name of a column in x containing offsets.
- weights: Optional weights to use in the fitting process.
- lambda: A numeric vector of regularization penalty values.
- alpha: A number between zero and one denoting the proportion of L1 (lasso) versus L2 (ridge) regularization.
  - alpha = 1: pure lasso model
  - alpha = 0: pure ridge model
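To illustrate the mixing parameter, here is a minimal sketch using glmnet::glmnet() directly (which glmnet_offset() wraps), with a toy predictor matrix and response invented for the example:

```r
library(glmnet)

set.seed(1)
x <- matrix(rnorm(100 * 5), ncol = 5)  # toy predictor matrix (100 rows, 5 columns)
y <- rnorm(100)                        # toy numeric response

fit_lasso   <- glmnet(x, y, alpha = 1)    # pure lasso: L1 penalty only
fit_ridge   <- glmnet(x, y, alpha = 0)    # pure ridge: L2 penalty only
fit_elastic <- glmnet(x, y, alpha = 0.5)  # elastic net: equal L1/L2 mix
```

Intermediate values of alpha blend the two penalties (the elastic net); glmnet_offset() passes alpha through to glmnet::glmnet() unchanged.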
Value

A glmnet object. See glmnet::glmnet() for full details.
Details

Outside of the tidymodels ecosystem, glmnet_offset() has no advantages over glmnet::glmnet(), since that function allows offsets to be specified in its offset argument.

Within tidymodels, glmnet_offset() provides an advantage because it will ensure that offsets are included in the data whenever resamples are created.

The x, y, family, lambda, alpha, and weights arguments have the same meanings as in glmnet::glmnet(). See that function's documentation for full details.
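For comparison, the same Poisson fit can be obtained outside tidymodels by passing the offset to glmnet's own offset argument rather than carrying it as a column of x. A minimal sketch, assuming the us_deaths data from the Examples section (with age_group, gender, population, and deaths columns):

```r
library(glmnet)

# Offset on the log scale: log(population) acts as the exposure term
# in a Poisson rate model.
off <- log(us_deaths$population)

# Build the design matrix WITHOUT the offset column; drop the intercept.
x <- model.matrix(~ age_group + gender, us_deaths)[, -1]

# Pass the offset directly via glmnet's `offset` argument.
fit <- glmnet(x, us_deaths$deaths, family = "poisson", offset = off)
```

This is the call that glmnet_offset() constructs internally after extracting offset_col from x.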
Examples

# us_deaths: example data with age_group, gender, population, and deaths columns
us_deaths$off <- log(us_deaths$population)  # offset on the log scale
x <- model.matrix(~ age_group + gender + off, us_deaths)[, -1]
glmnet_offset(x, us_deaths$deaths, family = "poisson", offset_col = "off")
#>
#> Call: glmnet::glmnet(x = x, y = y, family = family, weights = weights, offset = offsets, alpha = alpha, lambda = lambda)
#>
#> Df %Dev Lambda
#> 1 0 0.00 159700
#> 2 1 19.30 145600
#> 3 1 29.40 132600
#> 4 1 35.78 120800
#> 5 1 40.18 110100
#> 6 2 44.76 100300
#> 7 2 52.46 91410
#> 8 2 58.11 83290
#> 9 2 62.41 75890
#> 10 2 65.75 69150
#> 11 2 68.40 63010
#> 12 3 70.59 57410
#> 13 4 73.19 52310
#> 14 4 76.34 47660
#> 15 4 78.94 43430
#> 16 4 81.10 39570
#> 17 4 82.91 36050
#> 18 4 84.42 32850
#> 19 4 85.70 29930
#> 20 4 86.77 27270
#> 21 5 87.80 24850
#> 22 5 88.77 22640
#> 23 6 89.91 20630
#> 24 6 91.14 18800
#> 25 6 92.19 17130
#> 26 6 93.08 15610
#> 27 6 93.85 14220
#> 28 6 94.50 12960
#> 29 6 95.05 11810
#> 30 6 95.52 10760
#> 31 6 95.92 9802
#> 32 6 96.27 8931
#> 33 6 96.56 8138
#> 34 6 96.80 7415
#> 35 6 97.01 6756
#> 36 6 97.19 6156
#> 37 6 97.34 5609
#> 38 7 97.48 5111
#> 39 7 97.75 4657
#> 40 7 98.00 4243
#> 41 7 98.21 3866
#> 42 7 98.39 3523
#> 43 7 98.55 3210
#> 44 7 98.69 2925
#> 45 7 98.81 2665
#> 46 7 98.91 2428
#> 47 7 99.00 2212
#> 48 6 99.08 2016
#> 49 6 99.13 1837
#> 50 6 99.17 1674
#> 51 6 99.21 1525
#> 52 6 99.24 1389
#> 53 6 99.26 1266
#> 54 6 99.29 1153
#> 55 6 99.30 1051
#> 56 6 99.32 958
#> 57 6 99.33 873
#> 58 7 99.36 795
#> 59 7 99.38 724
#> 60 7 99.41 660
#> 61 7 99.42 601
#> 62 7 99.44 548
#> 63 7 99.45 499
#> 64 7 99.46 455
#> 65 7 99.47 414
#> 66 7 99.48 378
#> 67 7 99.49 344
#> 68 7 99.49 314
#> 69 7 99.50 286
#> 70 7 99.50 260
#> 71 7 99.51 237
#> 72 7 99.51 216
#> 73 7 99.51 197
#> 74 7 99.51 179
#> 75 7 99.52 164