Thesis Details
Crowdsourcing construction activity analysis from jobsite video streams
Liu, Kaijian ; Golparvar-Fard, Mani
Keywords: Construction Productivity; Activity Analysis; Workface Assessment; Video-based Monitoring; Crowdsourcing
Others: https://www.ideals.illinois.edu/bitstream/handle/2142/50402/Kaijian_Liu.pdf?sequence=1&isAllowed=y
United States | English
Source: The Illinois Digital Environment for Access to Learning and Scholarship
PDF
【 Abstract 】

The advent of affordable cameras is reshaping the way owners and contractors document ongoing construction operations by providing large amounts of jobsite video streams for assessing on-site activities. To facilitate assessment of these large video collections, recent studies have focused on leveraging computer vision algorithms for construction activity analysis and automatic workface assessment. Despite promising results, the ability to understand human activities from video is still rather limited. The main gaps in knowledge are: 1) the lack of comprehensive datasets with ground truth covering all construction activities, and 2) methods that can deal with the high degree of intra-class variability among activities and the visual similarity among non-direct work categories.

To address the need for reliable workface assessment, and to facilitate the development of computer vision algorithms for automatic activity analysis, this thesis proposes conducting video-based workface assessment as a crowdsourcing task on a massive marketplace such as Amazon Mechanical Turk (AMT). The presented method can attract hundreds of human annotators from around the world within seconds, who use the compositional worker-activity-posture-tool structure to analyze videos. Today, the human ability to interpret video content outperforms current vision-based algorithms. Thus, it is hypothesized that with the assistance of crowd intelligence (non-experts) and automatic detection and tracking algorithms, reliable workface assessment results can be quickly collected from jobsite video streams.

To validate this hypothesis, a web-based workface assessment tool is developed that supports 1) crowdsourcing of video-based activity analysis tasks on AMT by calling on human annotators' intelligence to interpret a sparse set of keyframes; 2) detection and tracking algorithms that automatically generate workface assessment results for the remaining non-key frames based on the sparse set of user-annotated keyframes; and 3) intuitive interfaces for 2D localization of construction resources, presentation of the compositional taxonomy of construction worker activities, visualization of activity analysis results, and quality control strategies. Six exhaustive experiments are conducted to examine different annotation methods and frequencies, different video lengths used to construct HITs (Human Intelligence Tasks), the difference between expert and non-expert annotators, the difference between linear and detection-based extrapolation methods, and the optimal cross-validation strategy for improving workface assessment accuracy. The experimental results are presented and discussed in terms of annotation time and accuracy at each level of the compositional structure. Our experiments, with an overall accuracy of 85% for non-expert annotators, testify that the quality of work by non-expert annotators on AMT is as reliable as that of experts in providing accurate and complete workface assessment. The introduced method has the potential to minimize the time needed for workface assessment and allows professionals to focus their time on the more important tasks of root-cause analysis and investigating alternatives for performance improvement.
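The propagation step described above, in which a sparse set of crowd-annotated keyframes is used to label the remaining non-key frames, can be illustrated with a minimal sketch. The Python snippet below shows only a simple linear variant (interpolating bounding-box corners between consecutive annotated keyframes), which is close in spirit to the linear extrapolation method compared in the experiments; the names KeyframeBox and interpolate_boxes and the box format are hypothetical and are not taken from the thesis implementation.

# Minimal sketch: linearly interpolating 2D bounding-box annotations between
# crowd-annotated keyframes to label the non-key frames in between.
# All names and the annotation format here are illustrative assumptions.

from dataclasses import dataclass
from typing import Dict, List, Tuple


@dataclass
class KeyframeBox:
    frame: int                               # index of the annotated keyframe
    box: Tuple[float, float, float, float]   # (x_min, y_min, x_max, y_max)


def interpolate_boxes(keyframes: List[KeyframeBox]) -> Dict[int, Tuple[float, ...]]:
    """Assign a box to every frame between consecutive keyframes by
    linearly interpolating the box corners."""
    keyframes = sorted(keyframes, key=lambda k: k.frame)
    result: Dict[int, Tuple[float, ...]] = {}
    for a, b in zip(keyframes, keyframes[1:]):
        span = b.frame - a.frame
        for f in range(a.frame, b.frame + 1):
            t = (f - a.frame) / span
            result[f] = tuple((1 - t) * ca + t * cb for ca, cb in zip(a.box, b.box))
    return result


if __name__ == "__main__":
    # Two keyframes 30 frames apart; the worker's box drifts to the right.
    annotations = [
        KeyframeBox(frame=0,  box=(100.0, 50.0, 180.0, 220.0)),
        KeyframeBox(frame=30, box=(130.0, 50.0, 210.0, 220.0)),
    ]
    dense = interpolate_boxes(annotations)
    print(dense[15])   # midway box: (115.0, 50.0, 195.0, 220.0)

A detection-based variant would replace the straight-line assumption with boxes produced by a tracker or detector on the intermediate frames, which is what the thesis compares against this simpler baseline.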

【 Preview 】
Attachment List
File: Crowdsourcing construction activity analysis from jobsite video streams
Size: 14841 KB
Format: PDF
View: download
Document Metrics
Downloads: 2    Views: 16