Sounds about right. I really don't understand why so many employers require college degrees for jobs that don't need the skills those degrees teach. A college degree today basically means the person has been taught to be entitled, to think they're too good for the job, and to think they're a victim. I don't need any of those things for my positions.