Most U.S. employers plan to require their workers to get COVID-19 vaccines by the end of the year, according to a new survey.