Date of Award:

5-2013

Document Type:

Thesis

Degree Name:

Master of Science (MS)

Department:

Computer Science

Committee Chair(s):

Vladimir A. Kulyukin

Committee:

Vladimir A. Kulyukin

Curtis Dyreson

Nick Flann

Abstract

The Computer Science Assistive Technology Laboratory (CSATL) at Utah State University has a long history of research in assistive grocery shopping technology for shoppers with visual impairments. CSATL's ShopMobile II introduced nutrition facts table (NFT) analysis, but only for input images that were perfectly aligned and square.

A new method that detects and localizes NFTs more quickly, and from rotated or non-square images, has been released and is slated for integration with ShopMobile II to improve this feature substantially. This is welcome news for the estimated 3.6 million adults in the United States with visual impairment or blindness, and it also opens the door to other applications where analyzing NFTs can significantly aid users.

By combining image analysis methods in creative ways, the new method avoids detecting NFTs where there are none and correctly locates them in about 42% of images. This is remarkable considering that images are processed as quickly as possible on the device itself, a standard Android smartphone. The CSATL is not stopping here: the new method exposes several possibilities for further improvement.
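As a rough illustration of the kind of image-analysis combination the abstract alludes to, the sketch below uses OpenCV in Python to find the long separator lines characteristic of an NFT, estimate skew from their angles, deskew the image, and crop the candidate region. The function name, thresholds, and overall strategy are assumptions made for illustration only; the thesis's actual method is described in the full text.

# Hypothetical sketch: locate an NFT candidate in a possibly rotated photo.
# Not the thesis's implementation; thresholds and strategy are assumptions.
import cv2
import numpy as np

def locate_nft_candidate(image_path):
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if img is None:
        raise FileNotFoundError(image_path)
    edges = cv2.Canny(img, 50, 150)
    # The probabilistic Hough transform picks up the long ruled lines
    # that separate rows in a nutrition facts table.
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                            minLineLength=img.shape[1] // 3, maxLineGap=10)
    if lines is None:
        return None  # no table-like structure: avoid a false positive
    # Estimate skew from the median line angle, normalized to (-90, 90].
    angles = []
    for x1, y1, x2, y2 in lines[:, 0]:
        a = np.degrees(np.arctan2(y2 - y1, x2 - x1))
        angles.append((a + 90) % 180 - 90)
    skew = float(np.median(angles))
    # Rotate so the detected lines become horizontal.
    h, w = img.shape
    M = cv2.getRotationMatrix2D((w / 2, h / 2), skew, 1.0)
    deskewed = cv2.warpAffine(img, M, (w, h))
    # A bounding box over the transformed line endpoints approximates
    # the NFT region in the deskewed image.
    pts = lines[:, 0].reshape(-1, 2).astype(np.float32)
    ones = np.ones((pts.shape[0], 1), dtype=np.float32)
    moved = (M @ np.hstack([pts, ones]).T).T
    x, y, bw, bh = cv2.boundingRect(moved.astype(np.int32))
    return deskewed[y:y + bh, x:x + bw]

A pipeline of this shape only reports a region when enough table-like lines are found, which is one plausible way to "avoid detecting NFTs where there are none" as the abstract describes.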

Checksum

48fc863e655242708a4596940d4633c6
