“Existential risk” refers to the risk that the human race as a whole might be annihilated. In other words: human extinction risk, or species-level genocide. This is an important concept because, as terrible as it would be if 90% of the human race were annihilated, wiping out 100% is a whole different matter.
Existential risk is not a fully well-defined notion, because as transhumanist technologies advance, the border between human and nonhuman becomes increasingly difficult to draw. If humans somehow voluntarily “transcend” their humanity and become superhuman, this seems a different sort of scenario than everyone being nuked to death. However, philosophical concerns aside, there are sufficiently many clear potential avenues to human extinction to make the “existential risk” concept valuable -- including nanotech arms races, risks associated with unethical superhuman AIs, and more mundane risks involving biological or nuclear warfare. While one doesn’t wish to approach the future with an attitude of fearfulness, it’s also important to keep our eyes open to the very real dangers that loom.