Date of Award: 2012
Master of Science (MS)
Electrical and Computer Engineering
This thesis explores pattern recognition in the dynamic setting of public transportation, such as a bus, as people enter and later exit through a doorway. Matching each individual's exit to his or her entrance yields accurate information about individual riders, such as how long a person is on the bus and which stops the person uses. At a higher level, matching exits to entries reveals the distribution of traffic flow across the whole transportation system. A texel camera, which produces co-registered depth and color data, is used to take multiple measurements of each person. A large number of candidate features is generated, and the sequential floating forward selection (SFFS) algorithm is used to select an optimized feature subset. Criterion functions based on marginal accuracy and on maximization of the minimum normalized Mahalanobis distance are designed and compared. Because the bus environment poses a sequential estimation problem, a trellis optimization algorithm is designed that operates on the sequence of texel camera measurements. Since the number of states in the trellis grows exponentially with the number of people currently on the bus, a beam search pruning technique is employed to manage the computational and memory load. Experiments with real texel camera measurements show accurate matching for 68 people exiting an initially full bus in randomized order. In a bus route simulation, where a true traffic flow distribution is used to randomly draw entry and exit events for simulated riders, the proposed sequential estimation algorithm produces an estimated traffic flow distribution that closely matches the true distribution.
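The SFFS algorithm mentioned in the abstract alternates a forward step (add the feature that most improves a criterion) with a floating backward step (drop a feature whenever doing so strictly improves the criterion). The following is a minimal sketch of that idea only; the criterion function `J` and the stopping guard are illustrative stand-ins, not the thesis's marginal-accuracy or Mahalanobis-distance criteria:

```python
def sffs(features, J, k):
    """Sketch of sequential floating forward selection: grow a feature
    subset up to size k, maximizing a caller-supplied criterion J."""
    selected = []
    while len(selected) < k:
        remaining = [f for f in features if f not in selected]
        if not remaining:
            break
        # Forward step: add the single best remaining feature.
        best = max(remaining, key=lambda f: J(selected + [f]))
        if selected and J(selected + [best]) <= J(selected):
            break  # no remaining feature improves the criterion
        selected.append(best)
        # Floating backward step: drop a feature while that strictly improves J.
        improved = True
        while improved and len(selected) > 2:
            improved = False
            for f in list(selected):
                trial = [g for g in selected if g != f]
                if J(trial) > J(selected):
                    selected = trial
                    improved = True
                    break
    return selected
```

Because both steps strictly increase `J`, the loop terminates; the floating backward step is what lets SFFS escape the nesting problem of plain forward selection.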
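The beam-search pruning mentioned in the abstract can be illustrated in isolation: each trellis state is a partial assignment of observed exits to the people currently on board, and only the best-scoring hypotheses survive each step. This is a hypothetical sketch with a toy scalar `similarity` score standing in for the thesis's feature-based match criterion:

```python
import heapq

def beam_match(entries, exits, similarity, beam_width=3):
    """Match each exit observation to one remaining entry, keeping only
    the beam_width highest-scoring partial hypotheses at each step."""
    beam = [((), 0.0)]  # (tuple of assigned entry indices, total score)
    for ex in exits:
        candidates = []
        for assigned, score in beam:
            for i, en in enumerate(entries):
                if i in assigned:
                    continue  # this entry is already matched to an earlier exit
                candidates.append((assigned + (i,), score + similarity(en, ex)))
        # Prune: retain only the beam_width best hypotheses.
        beam = heapq.nlargest(beam_width, candidates, key=lambda c: c[1])
    best_assignment, _ = max(beam, key=lambda c: c[1])
    return list(best_assignment)
```

Without pruning, the number of hypotheses grows factorially with the number of riders on board; the fixed beam width caps both computation and memory at the cost of possibly discarding the globally optimal assignment.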
Wang, Ziang, "People Matching for Transportation Planning Using Optimized Features and Texel Camera Data for Sequential Estimation" (2012). All Graduate Theses and Dissertations. 1298.
Copyright for this work is retained by the student. If you have any questions regarding the inclusion of this work in the Digital Commons, please contact us by email.