There are about 253 million people with visual impairment worldwide. Many of them use a white cane and/or a guide dog as their mobility tool for daily travel. Despite decades of effort, an electronic navigation aid that can replace the white cane remains a work in progress. In this paper, we propose an RGB-D camera based visual positioning system (VPS) for real-time localization of a robotic navigation aid (RNA) in an architectural floor plan for assistive navigation. The core of the system is the combination of a new 6-DOF depth-enhanced visual-inertial odometry (DVIO) method and a particle filter localization (PFL) method. DVIO estimates the RNA's pose by using the data from an RGB-D camera and an inertial measurement unit (IMU). It extracts the floor plane from the camera's depth data and tightly couples the floor plane, the visual features (with and without depth data), and the IMU's inertial data in a graph optimization framework to estimate the device's 6-DOF pose. Owing to the use of the floor plane and the depth data from the RGB-D camera, DVIO attains better pose estimation accuracy than the conventional VIO method. To reduce the accumulated pose error of DVIO for navigation in a large indoor space, we developed the PFL method to locate the RNA in the floor plan. PFL leverages the geometric information of the architectural CAD drawing of an indoor space to further reduce the error of the DVIO-estimated pose. Based on the VPS, an assistive navigation system is developed for the RNA prototype to assist a visually impaired person in navigating a large indoor space. Experimental results demonstrate that: 1) the DVIO method achieves better pose estimation accuracy than the state-of-the-art VIO method and performs real-time pose estimation (18 Hz pose update rate) on a UP Board computer; 2) PFL reduces the DVIO-accrued pose error by 82.5% on average and allows for accurate wayfinding (endpoint position error ≤ 45 cm) in large indoor spaces.
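The PFL stage described above follows the standard particle-filter cycle: propagate each particle by the odometry increment (here, the DVIO-estimated motion), weight each particle by its consistency with the floor-plan geometry, and resample. The sketch below is a minimal, hypothetical illustration of that cycle, not the paper's implementation; the motion-noise level and the form of the weight function are assumptions made purely for demonstration.

```python
import math
import random

def particle_filter_step(particles, odom_delta, weight_fn, motion_noise=0.05):
    """One predict-weight-resample cycle of 2D particle-filter localization.

    particles:  list of (x, y, theta) hypotheses of the device pose
    odom_delta: (dx, dy, dtheta) motion increment from odometry (e.g., DVIO)
    weight_fn:  maps a pose to a likelihood given the floor-plan geometry
    """
    # Predict: apply the odometry increment with additive Gaussian noise.
    predicted = []
    for (x, y, th) in particles:
        dx, dy, dth = odom_delta
        predicted.append((x + dx + random.gauss(0, motion_noise),
                          y + dy + random.gauss(0, motion_noise),
                          th + dth + random.gauss(0, motion_noise)))

    # Weight: score each particle against the floor plan, then normalize.
    weights = [weight_fn(p) for p in predicted]
    total = sum(weights) or 1e-12
    weights = [w / total for w in weights]

    # Resample: low-variance resampling keeps high-weight particles.
    n = len(predicted)
    step = 1.0 / n
    r = random.uniform(0.0, step)
    c, i, resampled = weights[0], 0, []
    for m in range(n):
        u = r + m * step
        while u > c:
            i += 1
            c += weights[i]
        resampled.append(predicted[i])
    return resampled
```

As a toy usage, the weight function could favor poses inside a corridor whose walls are known from a CAD drawing, so that particles drifting through a wall receive near-zero weight and are eliminated at resampling; repeating the cycle then pulls the pose estimate back toward the floor-plan-consistent trajectory.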