C, The knots (ie, the points corresponding to 5%, 10%, 20%, 30%, 40%, and 50% vaccination levels) were plotted based on the sensitivity for detecting death at each vaccination level as shown in Table 3. For example, given that the model with CAN scores identifies 54% of deaths at the top 10% of scores, it means that when 10% of the population has been vaccinated, 54% of subsequent deaths would be prevented assuming that the vaccine is 100% effective. Assuming the vaccine is only 90% effective, then 0.54 × 0.90 (or 48.6%) of subsequent deaths would be prevented. Therefore, at that point, the mortality rate would be 1 − 0.486 (or 51.4%) of the baseline rate, which is the point plotted corresponding to 10% vaccination. An approximately constant rate of vaccination was assumed, with the assumption that persons within each category would be vaccinated in no particular order, rather than in the strict order of the scores within each category. Therefore, a straight line joins the knots rather than a fitted curve. The area above each curve represents the proportion of deaths prevented by vaccination with each allocation strategy compared with no vaccination, as different levels of population vaccination are reached. The area between the curves is the proportion of deaths prevented by 1 allocation strategy vs another. The actual number of deaths prevented can be calculated by using the actual number of deaths per day at the beginning (before vaccination) in a given health care system or population and the time it would take to reach a certain level of population vaccination. For example, if a system/population such as the VA has 20 deaths per day and would take 150 days (5 months) to vaccinate 50% of the population, then 3000 deaths would occur by day 150 without any vaccination, of which 63.5% (n = 1905) would be prevented by vaccination using the COVIDVax model vs only 1233 (41.1%) by the CDC-ACIP phased allocation and 1368 (45.6%) by age-based allocation.
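The arithmetic in the legend above reduces to one step: the remaining mortality rate at a given vaccination level is 1 minus (sensitivity × vaccine effectiveness). A minimal sketch of that calculation, using the 54% sensitivity figure from the legend (illustrative only; the published analysis used Stata):

```python
def mortality_fraction(sensitivity, effectiveness=0.90):
    """Remaining mortality rate as a fraction of the baseline rate,
    given the strategy's sensitivity at this vaccination level."""
    return 1.0 - sensitivity * effectiveness

# Legend example: 54% of deaths identified at the top 10% of scores,
# vaccine 90% effective -> mortality falls to 51.4% of baseline.
print(round(mortality_fraction(0.54), 3))
```

With a 100% effective vaccine the same call with `effectiveness=1.0` gives 0.46, matching the legend's first scenario.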
eMethods. Imputation of Missing BMI Values
eTable 1. Coefficients of the Logistic Regression Models (With and Without CAN Score) Used to Estimate the Risk of SARS-CoV-2–Related Death During the Period From May 21 to September 30, 2020, Among All VA Enrollees Who Were at Risk as of May 21, 2020
eTable 2. Comparison of the Area Under the Receiver Operating Characteristic Curve (AUROC) of the Model to the AUROC of Using Age-Only, Charlson Comorbidity Index–Only, CAN Score–Only, or VACO Index for Predicting SARS-CoV-2–Related Death
eTable 3. Sensitivity of Model-Based Allocation at Different Observation Periods at the Time of Vaccination of 5%, 10%, 20%, 30%, 40% and 50% of the Population
eFigure. Screenshots of Data Integration Platform Executing the Model and Identifying High-Risk Persons for Vaccination Prioritization
Ioannou GN, Green P, Fan VS, et al. Development of COVIDVax Model to Estimate the Risk of SARS-CoV-2–Related Death Among 7.6 Million US Veterans for Use in Vaccination Prioritization. JAMA Netw Open. 2021;4(4):e214347. doi:10.1001/jamanetworkopen.2021.4347
How can the risk of SARS-CoV-2–related death be estimated in the general population to be used for vaccination prioritization?
In this prognostic study of more than 7.6 million individuals enrolled in the Veterans Affairs health care system, a logistic regression model (COVIDVax) was developed to estimate risk of SARS-CoV-2–related death using the following 10 characteristics: sex, age, race, ethnicity, body mass index, Charlson Comorbidity Index, diabetes, chronic kidney disease, congestive heart failure, and the Care Assessment Need score. The model was estimated to save more lives than prioritizing vaccination based on age or on the US Centers for Disease Control and Prevention vaccination allocation.
These findings suggest that prioritizing vaccination based on the model developed in this study could prevent a substantial number of SARS-CoV-2–related deaths during vaccine rollout.
A strategy that prioritizes individuals for SARS-CoV-2 vaccination according to their risk of SARS-CoV-2–related mortality would help minimize deaths during vaccine rollout.
To develop a model that estimates the risk of SARS-CoV-2–related mortality among all enrollees of the US Department of Veterans Affairs (VA) health care system.
Design, Setting, and Participants
This prognostic study used data from 7 635 064 individuals enrolled in the VA health care system as of May 21, 2020, to develop and internally validate a logistic regression model (COVIDVax) that predicted SARS-CoV-2–related death (n = 2422) during the observation period (May 21 to November 2, 2020) using baseline characteristics known to be associated with SARS-CoV-2–related mortality, extracted from the VA electronic health records (EHRs). The cohort was split into a training period (May 21 to September 30) and testing period (October 1 to November 2).
Main Outcomes and Measures
SARS-CoV-2–related death, defined as death within 30 days of testing positive for SARS-CoV-2. VA EHR data streams were imported on a data integration platform to demonstrate that the model could be executed in real-time to produce dashboards with risk scores for all current VA enrollees.
Of 7 635 064 individuals, the mean (SD) age was 66.2 (13.8) years, and most were men (7 051 912 [92.4%]) and White individuals (4 887 338 [64.0%]), with 1 116 435 (14.6%) Black individuals and 399 634 (5.2%) Hispanic individuals. From a starting pool of 16 potential predictors, 10 were included in the final COVIDVax model, as follows: sex, age, race, ethnicity, body mass index, Charlson Comorbidity Index, diabetes, chronic kidney disease, congestive heart failure, and Care Assessment Need score. The model exhibited excellent discrimination with area under the receiver operating characteristic curve (AUROC) of 85.3% (95% CI, 84.6%-86.1%), superior to the AUROC of using age alone to stratify risk (72.6%; 95% CI, 71.6%-73.6%). Assuming vaccination is 90% effective at preventing SARS-CoV-2–related death, using this model to prioritize vaccination was estimated to prevent 63.5% of deaths that would occur by the time 50% of VA enrollees are vaccinated, significantly higher than the estimate for prioritizing vaccination based on age (45.6%) or the US Centers for Disease Control and Prevention phases of vaccine allocation (41.1%).
Conclusions and Relevance
In this prognostic study of all VA enrollees, prioritizing vaccination based on the COVIDVax model was estimated to prevent a large proportion of deaths expected to occur during vaccine rollout before sufficient herd immunity is achieved.
Highly efficacious vaccines against SARS-CoV-2 have received emergency use authorization from the US Food and Drug Administration (FDA).1-3 Vaccine supply is expected to be limited initially. Logistical challenges (eg, cold storage and 2-dose requirements) may further prolong the time needed to vaccinate most of the US population. The US Centers for Disease Control and Prevention (CDC) Advisory Committee on Immunization Practices (ACIP) outlined ethical principles that should guide allocation given limited supply and recommended a phased approach to vaccine allocation: phase 1a, health care personnel and long-term care facility (LTCF) residents; phase 1b, frontline essential workers and persons aged 75 years or older; phase 1c, essential workers, persons aged 65 to 74 years, and persons aged 16 to 64 years with high-risk medical conditions; and phase 2, which includes the remaining population.4-6
Prioritizing persons for vaccination according to their risk of SARS-CoV-2–related death would minimize the number of SARS-CoV-2–related deaths that would occur in the time it takes to vaccinate a large enough proportion of the US population to achieve sufficient herd immunity.7 We aimed to develop a model that estimates the risk of SARS-CoV-2–related death in the general population (the COVIDVax model) and to estimate the number of SARS-CoV-2–related deaths prevented by prioritizing vaccination based on our model vs an approach based on age alone (ie, oldest first) or based on the ACIP-recommended phases of vaccination.
We identified all persons aged 18 years or older who were alive and enrolled in the Veterans Affairs (VA) health care system as of May 21, 2020 (n = 7 655 212). We excluded data from the early months of the pandemic to increase the relevance of our model to contemporary practice. We excluded 6596 veterans who had tested positive for SARS-CoV-2 more than 30 days before May 21, 2020 (ie, before April 21, 2020). We also excluded 13 552 persons who were residents in VA LTCFs or nursing homes during the study period, because these individuals would already be getting vaccinated in phase 1a, resulting in a study cohort of 7 635 064 persons.
We used data from the VA Corporate Data Warehouse (CDW), a relational database of VA enrollees’ electronic health records (EHRs), developed by the VA Informatics and Computing Infrastructure (VINCI) to support research and clinical operations. The study was approved by the VA Puget Sound institutional review board, which granted a waiver of informed consent because this was a database-derived study. This study followed the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) reporting guideline.
Cohort members who tested positive for SARS-CoV-2 based on approved polymerase chain reaction tests and died of any cause within 30 days of their earliest positive test date were defined as having a SARS-CoV-2–related death.8,9 Deaths occurring both within and outside the VA are comprehensively captured in CDW through a variety of sources including VA inpatient files, VA Beneficiary Identification and Records Locator System (BIRLS), Social Security Administration (SSA) death files, and the Department of Defense.10
Cohort members were followed up for SARS-CoV-2–related death for 165 days (May 21 to November 2, 2020). Deaths occurring during this period were confirmed with updated death data through December 15, 2020, to allow additional time for deaths to be electronically recorded in CDW.
We only considered characteristics that are readily available in the EHR, including the following previously reported risk factors for adverse outcomes related to SARS-CoV-28,9,11-13: age, sex, self-reported race and ethnicity, urban vs rural residence (based on zip codes), body mass index (BMI; calculated as weight in kilograms divided by height in meters squared), and Charlson Comorbidity Index (CCI), calculated using the Deyo14 modification of the CCI.15 We also considered the following 8 common preexisting medical conditions identified as high-risk conditions by the CDC,16 derived using diagnostic codes: chronic kidney disease (CKD), chronic obstructive pulmonary disease (COPD), cirrhosis, congestive heart failure (CHF), diabetes, hypertension, myocardial infarction (MI), and peripheral vascular disease (PVD). Finally, we considered the Care Assessment Need (CAN) score (version 2.5, 1-year mortality model), a validated measure of 1-year mortality in VA enrollees calculated using sociodemographic characteristics, clinical diagnoses, vital signs, medications, laboratory values, and health care utilization data from the VA national EHR.17 CAN scores range from 0 (lowest risk) to 99 (highest risk), corresponding to percentiles of risk among all VA enrollees. The CAN score was recently shown to be a predictor of COVID-19–related mortality.18
The value of each predictor was ascertained before the beginning of the observation period on May 21, 2020. We identified comorbid conditions recorded at any time before May 21, 2020. Values of the CAN score and BMI within 6 months before May 21, 2020, were used. For persons included in our analysis who tested positive for SARS-CoV-2 between April 21 and May 21, 2020, baseline characteristics were ascertained before the date of the earliest SARS-CoV-2–positive test.
Values were missing in source data for the CAN score (2 571 262 [33.7%]), urban/rural location (91 621 [1.2%]), and BMI (1 504 110 [19.7%]). Missing BMI values were deterministically imputed (eMethods in the Supplement). For urban/rural location and the CAN score, we used a missing category because we considered missingness to be potentially informative in an unbiased manner (eg, missing CAN score implies the VA enrollee was not assigned to a primary care team and was not hospitalized during the 6-month look-back period). We also developed a model that did not include the CAN score, which is only available in VA enrollees, so that our model could be subsequently validated in non-VA populations.
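The missing-category approach described above can be sketched as follows (a hypothetical illustration in pandas; the bin cut points and column names are invented for the example and are not the study's actual CAN categories):

```python
import pandas as pd

# Treat a missing CAN score as its own category so that "no CAN score"
# (eg, no primary care team, no recent hospitalization) can carry its
# own coefficient in the regression rather than being imputed.
df = pd.DataFrame({"can_score": [12.0, None, 87.0, None, 45.0]})
df["can_cat"] = pd.cut(df["can_score"], bins=[0, 50, 90, 99],
                       labels=["low", "medium", "high"])
df["can_cat"] = df["can_cat"].cat.add_categories("missing").fillna("missing")
print(df["can_cat"].tolist())
```

The "missing" level then enters the logistic model like any other category, so its association with the outcome is estimated directly rather than assumed.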
Multivariable logistic regression was used to develop the COVIDVax model to estimate the risk of SARS-CoV-2–related death during follow-up using baseline patient characteristics. We used the following strategy to determine which of the 16 candidate predictor variables to include in our model. All variables with P < .05 in unadjusted models were evaluated for inclusion in the adjusted model and were dropped if the adjusted P > .05. Each dropped variable was reinserted sequentially during model construction and retained if it was significant; thus, we iteratively reassessed how each candidate predictor affected all others included in the model. We considered interactions between age and comorbidity but did not find they improved the model.
We contemplated using least absolute shrinkage and selection operator (LASSO) regression to perform variable selection and regularization. However, because several candidate variables are listed by the CDC-ACIP as high-risk criteria, we wanted to test each one individually and be able to explain why any dropped variable was excluded.
We also contemplated using machine learning modeling approaches. However, these are harder to execute in practice and might be perceived as lacking transparency (ie, the black box), which would reduce acceptability. All continuous variables (age, CCI, BMI, CAN score) were categorized as shown in Table 1. Analyses were conducted using Stata/MP version 16.1 (64-bit) statistical software (StataCorp).
We split the cohort into an early training period (May 21 to September 30, 2020) and a subsequent testing period (October 1 to November 2, 2020) to examine model stability over time and identify potentially optimistic estimates of performance. We used area under the receiver operating characteristic curve (AUROC) to assess discrimination. We calculated the sensitivity of each of the prioritization strategies described in the next section (ie, the proportion of SARS-CoV-2–related deaths that would be correctly identified and potentially prevented by an effective vaccine) at the time of vaccination of 5%, 10%, 20%, 30%, 40%, and 50% of the population. Based on these numbers, we estimated the proportional reduction in SARS-CoV-2–related deaths per day and total number of deaths that would be achieved when vaccination reached these levels, assuming that vaccination is 90% effective at preventing SARS-CoV-2–related death by preventing fatal infections or converting fatal into nonfatal infections. This assumption is reasonable for the Pfizer and Moderna vaccines, which have reported efficacies of approximately 95% against symptomatic COVID-19 infection.2,3 To assess model calibration, we calculated the ratio of expected to observed events, calibration-in-the-large (CITL), and calibration slope.
We compared the performance characteristics of the following prioritization strategies: (1) our COVIDVax model-based allocation, in which individuals are vaccinated sequentially based on model scores starting with the top 5% model scores, followed by the scores greater than 5% to 10%, greater than 10% to 20%, greater than 20% to 30%, greater than 30% to 40%, greater than 40% to 50%, and greater than 50%; (2) an age-based allocation, in which individuals are vaccinated in age groups starting with those 90 years and older, those 85 to younger than 90 years, those 80 to younger than 85 years, those 75 to younger than 80 years, those 70 to younger than 75 years, those 65 to younger than 70 years, those 60 to younger than 65 years, those 50 to younger than 60 years, and those 18 to younger than 50 years; and (3) the CDC-ACIP phased allocation, in which individuals are vaccinated first in phase 1b (age ≥75 years) followed by phase 1c (age 65-74 years or 18-64 years with a high-risk medical condition as listed by CDC-ACIP), followed by everyone else (phase 2). The age-based allocation was compared because it does not require any special tools to implement, and age is the factor most strongly associated with SARS-CoV-2–related mortality.9 We also considered strategies based on CCI alone, the CAN score alone, or the VA COVID-19 (VACO) Index,8 a VA-based model that estimates mortality in persons who test positive for SARS-CoV-2.
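For any score-based strategy above, the sensitivity at a given coverage level is simply the share of deaths occurring among the highest-scored individuals. A sketch on simulated scores (names and data are illustrative, not the COVIDVax scores):

```python
import numpy as np

def allocation_sensitivity(scores, died, coverage):
    """Fraction of deaths captured if the top `coverage` fraction of
    scores is vaccinated first."""
    cutoff = np.quantile(scores, 1 - coverage)
    return died[scores >= cutoff].sum() / died.sum()

# Simulate a discriminating model: deaths concentrate at high scores
rng = np.random.default_rng(2)
scores = rng.normal(size=100_000)
p_death = 1 / (1 + np.exp(-(2 * scores - 6)))
died = rng.binomial(1, p_death)
for c in (0.05, 0.10, 0.50):
    print(c, round(allocation_sensitivity(scores, died, c), 3))
```

The better the model discriminates, the more the sensitivity at low coverage exceeds the coverage itself, which is exactly the comparison made in Table 3.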
Frontline and essential workers, who are included in CDC-ACIP phases 1b and 1c, respectively, cannot be easily identified in the EHR, and their inclusion would tend to reduce the association of the CDC-ACIP strategy with SARS-CoV-2–related mortality. We assumed that these individuals would be offered vaccination in parallel with the high-risk groups under all 3 strategies with similarly high priority, thus having no net effect on the comparisons among strategies.
We evaluated whether the Palantir data integration platform,19 which the VA is currently leasing, could incorporate all necessary data streams and execute our model to produce risk scores for all VA enrollees in real-time. All VA data on the platform remain owned and governed by the VA.
Of 7 635 064 included VA enrollees, the mean (SD) and median (interquartile range) age were 66.2 (13.8) years and 68 (56-75) years, respectively, with a substantial proportion aged 65 years or older (4 426 939 [58.0%]), 75 years or older (2 024 622 [26.5%]), or 85 years or older (755 283 [9.9%]). Most VA enrollees were men (7 051 912 [92.4%]) and White individuals (4 887 338 [64.0%]), with 1 116 435 (14.6%) Black individuals and 399 634 (5.2%) Hispanic individuals. Most cohort members (4 176 288 [54.6%]) had a CCI of 1 or greater.
During the 165-day follow-up period, there were 2422 SARS-CoV-2–related deaths among cohort members (mortality, 1.92 deaths per 1 million participants per day), including 1935 deaths in the training subset and 487 deaths in the testing subset. The model was developed in the training subset. A total of 6 candidate predictor variables (urban/rural location, cirrhosis, COPD, hypertension, MI, and PVD) were eliminated using the variable selection methods described previously. The remaining 10 candidate variables (ie, sex, age, race, ethnicity, BMI, CCI, diabetes, CKD, CHF, and CAN score) were statistically significantly associated with the outcome and included in the model (Table 1). Model coefficients appear in eTable 1 in the Supplement. Because the CAN score is not available outside the VA, we also developed a model that excluded the CAN score (Table 1).
The model exhibited excellent discrimination with an AUROC of 85.3% (95% CI, 84.6%-86.1%) in the training and 83.6% (95% CI, 82.0%-85.3%) in the testing subset (Figure and Table 2). Dropping the CAN score reduced the AUROC only minimally (from 83.6% to 83.4% in the testing subset). The AUROC of a model that used only age was significantly lower (72.6%; 95% CI, 71.6%-73.6%), as was the AUROC for models that used only the CCI score, only the CAN score, or the VACO Index (eTable 2 in the Supplement). An AUROC could not be calculated for the CDC-ACIP strategy due to its broad categories. The model also performed well in subgroups defined by age, sex, race, ethnicity, and geographic region (Table 2).
Table 3 compares the sensitivity of COVIDVax-based with age-based or CDC-ACIP–based allocation strategies at different levels of vaccination. These analyses demonstrate that the model is more effective at identifying the small proportion of VA enrollees with the highest risk scores, among whom a disproportionately large number of SARS-CoV-2–related deaths occurred. For example, when 5% (approximately 382 000) or 10% (approximately 763 000) of VA enrollees are vaccinated based on the highest model scores, then those who received vaccines would include 38.2% and 54.0% of the subsequent SARS-CoV-2–related deaths, respectively. In contrast, levels of vaccination of 5% or 10% based on age-only allocation (ie, oldest first) would include 16.0% and 29.4% of SARS-CoV-2–related deaths. We estimated that by the time 50% of the population is vaccinated, prioritizing vaccinees based on the COVIDVax model would result in 22.4% fewer SARS-CoV-2–related deaths (63.5% vs 41.1%) than an approach based on CDC-ACIP allocation phases and 17.9% fewer deaths (63.5% vs 45.6%) than an approach based on age alone (Table 3). eTable 3 in the Supplement shows that our model sensitivity was similar for identifying deaths that occurred during the entire 165-day follow-up period vs using only the first 55 or 110 days.
The Figure, C and Table 3 show how SARS-CoV-2–related mortality (deaths per day) would decline when different levels of vaccination are reached, calculated using the sensitivity values reported in Table 3. The area under each curve is the proportion of deaths that would occur, and the area above each curve is the proportion of deaths that would be prevented compared with no vaccination, in which case deaths per day are assumed to remain at baseline. The Figure, C shows that SARS-CoV-2–related mortality would decline much faster with the COVIDVax model’s prioritization, resulting in a much greater proportion of deaths prevented (Table 3).
Model calibration measures in the testing data were excellent and similar between the model with CAN, the model without CAN, and the age-based allocation strategies with ratios of expected to observed events of 93.0%, 93.7%, and 95.5%; CITL values of 0.072, 0.065, and 0.046; and calibration slopes of 0.928, 0.932, and 1.061, respectively.
The data platform was used to ingest all necessary VA data streams, execute the model, and generate a dashboard (eFigure in the Supplement) of individuals stratified by risk of SARS-CoV-2–related death for vaccination outreach. We also developed a web-based calculator that executes the COVIDVax model20 and provided all coefficients for others to execute it (eTable 1 in the Supplement).
The findings of this study suggest that a vaccination strategy that prioritizes individuals most likely to die would minimize SARS-CoV-2–related mortality during vaccine rollout. We developed a model that uses 10 baseline characteristics to estimate the risk of SARS-CoV-2–related death among more than 7.6 million VA enrollees. The model had excellent performance characteristics (AUROC, 85.3%) in a recent (May to November 2020) population-based cohort. We estimated that by the time 50% of the population is vaccinated, prioritizing vaccinees based on the model would result in 22.4% fewer SARS-CoV-2–related deaths than an approach based on CDC-ACIP allocation phases and 17.9% fewer deaths than an approach based on age alone. Our findings suggest that health care systems, such as the VA, that have the capability to do so should consider implementing our model, and the CDC-ACIP should consider modification or substratification of its proposed allocation phases to better capture risk of SARS-CoV-2–related mortality.
The model demonstrated high sensitivity for SARS-CoV-2–related death even at low levels of vaccination, which would result in substantial reductions in SARS-CoV-2–related mortality even after small proportions of the population have been vaccinated (Figure, C). Table 3 and the Figure, C illustrate that we can estimate the proportion of deaths prevented only as a function of the proportion of persons vaccinated. The actual numbers of deaths prevented would be greater the longer it takes to vaccinate and the higher the absolute death rate without vaccination. For example, if it takes 150 days to vaccinate 50% of VA enrollees, in whom approximately 20 deaths per day were occurring at baseline, then 3000 (150 × 20) deaths would occur without vaccination, 1905 deaths (63.5%) would be prevented by COVIDVax-based vaccination, but only 1233 (41.1%) would be prevented by CDC-ACIP phased allocation. Alternatively, if it takes 200 days to vaccinate 50% of VA enrollees and the baseline mortality is higher at approximately 30 deaths per day (given the surge in SARS-CoV-2–related mortality since the study’s observation period), then 6000 (200 × 30) deaths would occur without vaccination, 3810 (63.5%) would be prevented by model-based vaccination and 2460 (41.1%) by CDC-ACIP phased allocation. Given that more than 2500 SARS-CoV-2–related deaths per day have been reported in the United States since December 15, 2020,21 and that vaccination rollout has been slower than expected,22 a very large number of deaths can be prevented by strategies that directly model and prioritize high-risk persons.
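The worked examples above follow a two-step calculation: deaths without vaccination equal days × deaths per day, and deaths prevented equal that total × the strategy's prevented fraction (the 63.5% and 41.1% figures already incorporate the assumed 90% vaccine effectiveness). A quick check of the arithmetic:

```python
def deaths_prevented(days, deaths_per_day, prevented_fraction):
    """Total deaths without vaccination, and deaths averted by a strategy
    whose effectiveness-weighted sensitivity is `prevented_fraction`."""
    total = days * deaths_per_day
    return total, round(total * prevented_fraction)

print(deaths_prevented(150, 20, 0.635))   # COVIDVax, 150-day scenario
print(deaths_prevented(150, 20, 0.411))   # CDC-ACIP, 150-day scenario
print(deaths_prevented(200, 30, 0.635))   # COVIDVax, 200-day scenario
```

The difference between strategies in a scenario (eg, 1905 − 1233 = 672 deaths in the first) is the absolute benefit of model-based over phased allocation.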
Many models perform well in silico but fail to be implemented because the predictor variables are not readily available or the modeling algorithms are too complicated (eg, neural network models) for real-world execution. We demonstrated that the data integration platform that the VA is currently using can ingest all necessary data streams and execute the model in real time for all current VA enrollees. We envision that this platform may be used to identify and continually update high-risk persons for vaccination outreach, to track vaccinated persons and those remaining unvaccinated, to match supply and demand for the vaccine across VA networks and facilities, and to track real-world vaccine effectiveness on a single platform. We developed a model that did not include the CAN score for use in settings other than the VA and a web-based calculator20 to estimate risk in individuals for vaccine prioritization.
Prioritization based on our model would adhere to the 4 ethical principles outlined by ACIP. It maximizes benefits (by targeting those at highest risk for vaccination), promotes justice (by identifying older adults or those with a high comorbidity burden who will require focused outreach for vaccination), mitigates health inequities (by assigning higher priority to racial and ethnic minorities directly reflecting their higher risk of mortality), and promotes transparency (by using an evidence-based model with explicit parameters). CDC-ACIP phases also include prioritization of frontline workers (phase 1b) and essential workers (phase 1c), which is unrelated to risk of SARS-CoV-2–related death but justified because of the societal impacts of these groups. These groups can still be prioritized in parallel with a model-based, risk-based prioritization strategy.
This study has limitations. Our calculations underestimate the overall vaccination benefit because they do not account for benefits to unvaccinated persons through reduced transmission. It is hard to measure and model an individual’s risk of transmitting SARS-CoV-2, but we assume that the impacts of the different risk-based allocation strategies on transmission are broadly similar given that none are aimed specifically at reducing transmission. Our model is population based because it was derived from all 7.6 million persons enrolled in VA care. However, VA enrollees are older and more likely to be male, to have comorbid conditions, and to have adverse social determinants of health. On the other hand, they have access to comprehensive, high-quality health care. To determine the extent to which our model results are generalizable to other populations, the model will need to be externally validated.
In this study, we developed and internally validated a model predicting SARS-CoV-2–related mortality among all VA enrollees that can be used to prioritize persons for vaccination. The model would potentially result in substantial reductions in mortality compared with the allocation strategies currently proposed by the CDC-ACIP.
Accepted for Publication: February 11, 2021.
Published: April 6, 2021. doi:10.1001/jamanetworkopen.2021.4347
Open Access: This is an open access article distributed under the terms of the CC-BY License. © 2021 Ioannou GN et al. JAMA Network Open.
Corresponding Author: George N. Ioannou, BMBCh, MS, Research and Development, Veterans Affairs Puget Sound Healthcare System, 1660 S Columbian Way, Seattle, WA 98108 (firstname.lastname@example.org).
Author Contributions: Dr Ioannou had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.
Concept and design: G. Ioannou, Green, Fan, Dominitz, Backus, Locke, Eastment, Osborne, N. Ioannou, Berry.
Acquisition, analysis, or interpretation of data: G. Ioannou, Green, Fan, Dominitz, O’Hare, Backus, Osborne, N. Ioannou, Berry.
Drafting of the manuscript: G. Ioannou.
Critical revision of the manuscript for important intellectual content: All authors.
Statistical analysis: G. Ioannou, Green, N. Ioannou, Berry.
Obtained funding: G. Ioannou.
Administrative, technical, or material support: Locke.
Supervision: G. Ioannou.
Conflict of Interest Disclosures: Dr O’Hare reported receiving grants from the National Institutes of Health, the US Centers for Disease Control and Prevention, and Veterans Affairs Health Services Research and Development; receiving travel and honoraria from Chugai Pharmaceuticals, the Japanese Society of Dialysis Therapy, the American Society of Nephrology, the Devenir Foundation, Hammersmith Hospital, NYU Lagone, and Kaiser Permanente Southern California; receiving honorarium from UpToDate; and receiving travel reimbursement from the New York Society of Nephrology, Columbia University, Albert Einstein College of Medicine, and Health and Aging Policy Fellows Program outside the submitted work. No other disclosures were reported.
Funding/Support: The study was supported using data from the Veterans Affairs COVID-19 Shared Data Resource provided by the VA Informatics and Computing Infrastructure. The study was supported in part by grant COVID19-8900-11 from the Department of Veterans Affairs, Clinical Sciences Research and Development to Dr Ioannou.
Role of the Funder/Sponsor: The funders had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.
Disclaimer: The contents do not represent the views of the US Department of Veterans Affairs or the US government.