A surface-based vacant space detection for an intelligent parking lot

Ching-Chun Huang*, Yu Shu Dai, Sheng-Jyh Wang

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

15 Scopus citations

Abstract

We propose a surface-based vacant parking space detection system. Unlike many car-oriented or space-oriented methods, the proposed system is parking-lot-oriented: the whole parking lot is treated as a structure composed of numerous surfaces. A surface-based hierarchical framework is then proposed to integrate 3-D scene information with patch-based image observations for the inference of vacant spaces. For robustness, the feature vector of each image patch is extracted with the Histogram of Oriented Gradients (HOG) approach. By incorporating these texture features into the proposed probabilistic models, the system can systematically infer the optimal hypothesis of parking statuses while handling occlusion, shadows, perspective distortion, and fluctuating lighting conditions in both daytime and nighttime.
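The paper itself provides no code, so the following is only a rough sketch of the patch-level observation step described in the abstract: a HOG descriptor is computed for a rectified surface patch and a vacant/occupied decision is made with a simple Gaussian likelihood under a prior. The function names, the diagonal-Gaussian model, and the patch size are illustrative assumptions, not the authors' implementation, and the sketch ignores the hierarchical, lot-wide inference that the paper performs.

```python
import numpy as np
from skimage.feature import hog
from skimage.transform import resize

# Illustrative sketch (assumed parameters): per-patch HOG features plus a
# naive Gaussian likelihood for the "vacant" vs. "occupied" hypotheses.
PATCH_SIZE = (64, 64)  # assumed normalized patch resolution

def patch_hog(patch_gray):
    """HOG descriptor of a grayscale patch rectified from one 3-D surface."""
    patch = resize(patch_gray, PATCH_SIZE, anti_aliasing=True)
    return hog(patch, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), block_norm="L2-Hys")

class GaussianLikelihood:
    """Diagonal-covariance Gaussian fitted to HOG features of one class."""
    def __init__(self, features):
        feats = np.asarray(features)
        self.mean = feats.mean(axis=0)
        self.var = feats.var(axis=0) + 1e-6  # avoid division by zero

    def log_likelihood(self, f):
        return -0.5 * np.sum((f - self.mean) ** 2 / self.var
                             + np.log(2 * np.pi * self.var))

def infer_status(patch_gray, vacant_model, occupied_model, prior_vacant=0.5):
    """MAP decision for one patch, ignoring inter-surface dependencies."""
    f = patch_hog(patch_gray)
    log_post_vacant = vacant_model.log_likelihood(f) + np.log(prior_vacant)
    log_post_occupied = occupied_model.log_likelihood(f) + np.log(1 - prior_vacant)
    return "vacant" if log_post_vacant > log_post_occupied else "occupied"
```

In the paper, by contrast, the statuses of all surfaces are inferred jointly through the proposed probabilistic framework, which is what allows occlusions between neighboring vehicles to be handled rather than deciding each patch in isolation as above.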

Original language: English
Title of host publication: 2012 12th International Conference on ITS Telecommunications, ITST 2012
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 284-288
Number of pages: 5
ISBN (Print): 9781467330701
DOIs
State: Published - 2012
Event: 2012 12th International Conference on ITS Telecommunications, ITST 2012 - Taipei, Taiwan
Duration: 5 Nov 2012 - 8 Nov 2012

Publication series

Name: 2012 12th International Conference on ITS Telecommunications, ITST 2012

Conference

Conference: 2012 12th International Conference on ITS Telecommunications, ITST 2012
Country/Territory: Taiwan
City: Taipei
Period: 5/11/12 - 8/11/12

Keywords

  • Bayesian inference
  • Histogram of Oriented Gradients
  • Parking space detection
  • Surface-based detection
