The dispersal of organisms plays an important role in determining the dynamics of ecological models. Ecologically, it is of interest to understand how dispersal strategies influence the distribution of populations. An ideal free distribution (IFD) has been used to predict the distribution of organisms among patches, a key assumption being that species can move freely between patches without paying any cost. If instead there are losses when a species moves from one patch to another, then an ideal free distribution may not arise. In this note, we examine a two-patch resident-mutant model with travel loss and predict the optimal dispersal strategy for the resident and the mutant. Moreover, such a strategy, which produces a non-IFD, is evolutionarily stable. Some shared and distinct features of patch models with travel loss are also discussed.
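To fix ideas, a schematic two-patch model with travel loss can be written as follows; this is only an illustrative sketch, and all symbols here ($u$, $v$, $f_i$, $d$, $s$) are notational assumptions rather than the notation of this note:

\begin{equation*}
\frac{du}{dt} = u\,f_1(u) + d\,(s\,v - u), \qquad
\frac{dv}{dt} = v\,f_2(v) + d\,(s\,u - v),
\end{equation*}

where $u$ and $v$ are the population densities in the two patches, $f_i$ is the per capita growth rate in patch $i$, $d \ge 0$ is the dispersal rate, and $s \in (0,1]$ is the fraction of dispersers that survive travel. When $s = 1$ (cost-free movement), equal fitness across occupied patches, $f_1 = f_2$, characterizes an ideal free distribution; when $s < 1$, the travel loss perturbs the equilibrium away from this equal-fitness balance, which is why an IFD may fail to appear.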