The Sharpe ratio is a risk-adjusted performance measure used to evaluate the return of an investment portfolio or an individual security relative to its level of risk. It was developed by Nobel laureate William F. Sharpe and is widely used by investors to compare the risk-adjusted returns of different investment opportunities.
The Sharpe ratio is calculated as the excess return of the investment over the risk-free rate (such as the return on government bonds), divided by the standard deviation of the investment's returns. In other words, it measures the excess return earned for each unit of risk taken on by an investor. A higher Sharpe ratio indicates better risk-adjusted returns.
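In practice, the mean and standard deviation are estimated from a series of periodic returns and then annualized. A minimal sketch of that calculation is below; the function name, the sample data, and the square-root-of-time annualization (which assumes independent, identically distributed periodic returns, a common simplification) are illustrative choices, not part of the original definition.

```python
import numpy as np

def sharpe_ratio(returns, risk_free_rate=0.0, periods_per_year=252):
    """Annualized Sharpe ratio from a series of periodic returns.

    risk_free_rate is annual; it is spread evenly across periods.
    Annualization by sqrt(periods_per_year) assumes i.i.d. returns.
    """
    excess = np.asarray(returns) - risk_free_rate / periods_per_year
    # Sample standard deviation (ddof=1) of the excess returns
    return np.sqrt(periods_per_year) * excess.mean() / excess.std(ddof=1)

# Example with made-up daily returns (for illustration only)
daily_returns = [0.010, -0.005, 0.020, 0.000, 0.015]
print(sharpe_ratio(daily_returns, risk_free_rate=0.02))
```

With `periods_per_year=252` (the usual count of trading days), this converts a daily estimate into an annual figure comparable across assets.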
Imagine there are two investment opportunities: Stock A and Stock B. Stock A has an annual return of 10% with a standard deviation of 15%, and Stock B has an annual return of 8% with a standard deviation of 10%. The risk-free rate is 2%.
To calculate the Sharpe ratio for each stock, we subtract the risk-free rate from the stock's annual return and then divide by the stock's standard deviation. For Stock A, the Sharpe ratio would be (10% - 2%) / 15% = 0.53. For Stock B, the Sharpe ratio would be (8% - 2%) / 10% = 0.60.
In this example, Stock B has the higher Sharpe ratio, indicating that it provides better risk-adjusted returns than Stock A. While Stock A has a higher return, it also carries more risk, as indicated by its higher standard deviation, and therefore delivers less excess return per unit of risk.
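The worked example above can be reproduced in a few lines. This sketch uses the annualized figures directly; the function and variable names are illustrative.

```python
def sharpe_ratio(annual_return, risk_free_rate, std_dev):
    """Excess return over the risk-free rate per unit of volatility."""
    return (annual_return - risk_free_rate) / std_dev

risk_free = 0.02  # 2% risk-free rate, e.g. a government bond yield

# Stock A: 10% return, 15% standard deviation
sharpe_a = sharpe_ratio(0.10, risk_free, 0.15)  # ≈ 0.53

# Stock B: 8% return, 10% standard deviation
sharpe_b = sharpe_ratio(0.08, risk_free, 0.10)  # = 0.60

print(f"Stock A: {sharpe_a:.2f}, Stock B: {sharpe_b:.2f}")
```

As in the text, Stock B's ratio (0.60) exceeds Stock A's (0.53), so B offers the better risk-adjusted return despite its lower raw return.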