DEI — “diversity, equity and inclusion.” University administrators, corporate human-resources facilitators and politicians of a liberal stripe all assure us that America is now, suddenly, for the first time in its history, a nation of diversity, equity and inclusion. We are no longer, in this view, a white-bread nation where just about everyone…