Union jobs are essential to building a strong country. The higher wages and job security that unions provide allow workers to pursue the American dream. Unions let people work with dignity and respect and be treated fairly in the workplace. Union wages make it possible to buy a home, raise a family, and send your kids to college.