Driven by evidence-based practice and patient-centered care, nurses have earned the faith of the American public. For two decades, nurses have been ranked the most trusted and ethical profession in the United States. Trust alone, however, is not enough: the profession also needs sustained federal investment in research, education, and workforce development to secure its long-term stability.