I don't understand why the VaR of the equity tranche decreases when PD increases, holding correlation constant.
VaR = mean - z*sigma. If PD increases, the mean decreases, so VaR should decrease.
The instructor suggests that an increase in PD causes a decrease in the standard deviation. I think that only happens when the expected loss nearly covers the entire equity tranche.
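To check this intuition, here is a minimal Monte Carlo sketch (my own setup, not from the course): a homogeneous portfolio under a one-factor Gaussian copula, with the equity tranche absorbing the first losses up to an assumed detachment point. The parameter values (rho, detachment, LGD = 100%) are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def equity_tranche_stats(pd_, rho=0.3, n_loans=100, detach=0.10, n_sims=20000):
    """Mean and std of the equity-tranche loss under a one-factor Gaussian copula."""
    thresh = norm.ppf(pd_)                           # default threshold for given PD
    m = rng.standard_normal((n_sims, 1))             # common (systematic) factor
    eps = rng.standard_normal((n_sims, n_loans))     # idiosyncratic factors
    z = np.sqrt(rho) * m + np.sqrt(1 - rho) * eps
    port_loss = (z < thresh).mean(axis=1)            # portfolio loss fraction, LGD = 100%
    tranche_loss = np.minimum(port_loss, detach)     # equity tranche takes first losses, capped
    return tranche_loss.mean(), tranche_loss.std()

for pd_ in [0.01, 0.05, 0.10, 0.20, 0.40]:
    mu, sd = equity_tranche_stats(pd_)
    print(f"PD={pd_:.2f}  EL={mu:.4f}  sigma={sd:.4f}")
```

In this setup, sigma of the tranche loss first rises with PD and then falls once the expected loss approaches the detachment point and the tranche is wiped out in most scenarios, which is consistent with the "only when EL nearly covers the tranche" caveat above.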