Combining audiovisual signals across modalities can facilitate human perception when people communicate with each other in noisy environments, but robots cannot yet achieve such communication with humans in complex environments. Recent studies have shown that audiovisual integration effects occur for low-intensity visual inputs, but whether visual spatial frequency influences audiovisual integration remains unclear. To investigate this, we designed an audiovisual integration experiment using high-frequency auditory stimuli and low-intensity visual stimuli of varying spatial frequencies. Participants were instructed to maintain central fixation and to make a speeded button response with their right index finger whenever a stimulus in either sensory modality (unimodal or bimodal) was detected. The results showed that low-intensity visual stimuli facilitated audiovisual integration, but no differences were found among spatial frequencies. These findings provide unique insight into how the brain processes auditory stimuli and visual signals of different spatial frequencies, and offer basic data for communication between humans and robots.