For robot navigation, it is important to have a local map that describes the free space around the robot, and various sensing and fusion methods have been developed to detect this free space. In this paper we propose a new fusion approach in which an ultrasonic sensor, aided by an omni-directional vision sensor, produces a grid-based map of the free space around the robot. With the ultrasonic sensor, the robot obtains conservative range information through our nearby-range filtering method; this filtering yields more reliable results given the sensor's susceptibility to specular reflection. With the special omni-directional vision sensor we developed, color and edge information can be obtained in a single image and mapped onto the ground plane by the inverse perspective transformation. The range, color, and edge information can thus all be expressed in a metric, grid-based representation that forms the basis of our fusion processing. From the filtered range information in this representation, we compute an initial safety index for each grid cell around the robot. From the color and edge information, uniform-color regions and connectivity information are extracted and used to revise the safety index. The grid cells whose final safety index exceeds a given threshold then constitute the free space around the robot. Results in a cluttered indoor environment demonstrate the usefulness of the proposed sensor fusion approach.
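The final fusion step described above — revising a range-based safety index with visual evidence and thresholding it to obtain free cells — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the grids, the additive revision rule, and the `weight` and `threshold` values are all assumptions made for the example.

```python
import numpy as np

def free_space_mask(range_safety, visual_support, weight=0.5, threshold=0.7):
    """Revise a range-based safety index grid with visual evidence and
    return a boolean grid marking the free cells.

    range_safety   -- initial safety index per grid cell, in [0, 1],
                      assumed to come from filtered ultrasonic ranges
    visual_support -- per-cell support from uniform-color regions and
                      connectivity, in [0, 1] (hypothetical encoding)
    """
    # Additive revision rule (an assumption), clamped back to [0, 1].
    safety = np.clip(range_safety + weight * visual_support, 0.0, 1.0)
    # Cells whose final safety index exceeds the threshold are free.
    return safety >= threshold

# Toy 3x3 grid of cells around the robot.
range_safety = np.array([[0.9, 0.6, 0.2],
                         [0.8, 0.5, 0.1],
                         [0.9, 0.7, 0.3]])
visual_support = np.array([[0.1, 0.3, 0.0],
                           [0.2, 0.5, 0.0],
                           [0.0, 0.1, 0.1]])
mask = free_space_mask(range_safety, visual_support)
```

In this toy grid, the left two columns end up above the threshold and are marked free, while the rightmost column (low range safety and little visual support) stays occupied.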