Log-normal linear models are widely applied, and in many situations one is interested in predicting the response variable on the original scale for given covariate values. The back-transform (BT) prediction interval is widely used in practice. This study constructs a prediction interval for the response variable based on the highest density (HD) region of the log-normal distribution. Simulation results show that the HD prediction intervals attain reasonable coverage rates and are shorter than the BT prediction intervals, particularly for small sample sizes. An example illustrates the implementation of the HD prediction interval.
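To make the contrast concrete, the following is a minimal sketch of the two interval types for a log-normal distribution with known parameters. It is a simplified plug-in illustration, not the paper's procedure (which must account for parameter estimation from the fitted linear model); the function names `bt_interval` and `hd_interval` are hypothetical. The BT interval exponentiates the equal-tailed normal-scale interval, while the HD interval is the shortest interval with the nominal coverage, which coincides with the highest-density interval for a unimodal density.

```python
import numpy as np
from scipy import stats, optimize

def bt_interval(mu, sigma, level=0.95):
    # Back-transform: exponentiate the equal-tailed interval on the log scale.
    z = stats.norm.ppf(0.5 + level / 2)
    return np.exp(mu - z * sigma), np.exp(mu + z * sigma)

def hd_interval(mu, sigma, level=0.95):
    # Shortest interval with `level` probability under lognormal(mu, sigma);
    # for this unimodal density it equals the highest-density interval.
    dist = stats.lognorm(s=sigma, scale=np.exp(mu))
    def length(a):
        # Interval from the a-quantile to the (a + level)-quantile.
        return dist.ppf(a + level) - dist.ppf(a)
    res = optimize.minimize_scalar(
        length, bounds=(1e-10, 1 - level - 1e-10), method="bounded")
    a = res.x
    return dist.ppf(a), dist.ppf(a + level)

mu, sigma = 1.0, 0.5  # illustrative log-scale mean and standard deviation
bt = bt_interval(mu, sigma)
hd = hd_interval(mu, sigma)
```

Because the log-normal density is right-skewed, the HD interval shifts toward the mode and is shorter than the BT interval at the same nominal coverage.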