

    Please use this permanent URL to cite or link to this item: http://ir.lib.ncut.edu.tw/handle/987654321/3499


    Title: Overlay Text Detection in Complex Video Background
    Authors: Chun-Cheng Lin
    Bo-Min Yen
    Contributors: Department of Electrical Engineering, National Chin-Yi University of Technology
    Keywords: Transition map
    Overlay text detection
    Video subtitles
    Local binary pattern
    Texture analysis
    Date: 2010-06
    Upload time: 2010-09-15 09:46:56 (UTC+8)
    Abstract: Video subtitles are very useful for understanding a video's content. Extracting the text information they contain would be very helpful for building a database of video content with annotations and indexes. Most text detection and extraction methods use text color, background contrast, and texture information to extract text from videos. However, existing methods are limited by the varied contrasts and complex backgrounds encountered in text processing. The purpose of this study is to develop a method for detecting overlay text against complex video backgrounds. The method exploits the transient color changes that occur between text and its adjacent background to generate transition maps, and then applies connected-component analysis to smooth the connected rectangular regions. The results demonstrate that the transition-map-based text detection method can effectively detect text regions under different text contrasts, fonts, and background complexities.
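The transition-map idea described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; it simply assumes a grayscale frame, marks pixels with a sharp horizontal intensity change (a crude transition map), and groups the marked pixels into bounding boxes with a basic 4-connected component search. The function names `transition_map` and `connected_boxes` and the threshold of 40 are illustrative choices, not from the paper.

```python
import numpy as np

def transition_map(gray, thresh=40):
    """Mark pixels whose intensity differs sharply from the pixel to their left.

    This stands in for the paper's transition map: overlay text typically
    produces abrupt color changes against its adjacent background.
    """
    diff = np.abs(np.diff(gray.astype(int), axis=1))
    tmap = np.zeros(gray.shape, dtype=bool)
    tmap[:, 1:] = diff >= thresh
    return tmap

def connected_boxes(mask):
    """Group marked pixels into 4-connected components; return bounding boxes.

    Each box is (x_min, y_min, x_max, y_max). A real system would then
    merge and smooth these rectangles into candidate text regions.
    """
    h, w = mask.shape
    visited = np.zeros_like(mask, dtype=bool)
    boxes = []
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not visited[y, x]:
                stack = [(y, x)]
                visited[y, x] = True
                ys, xs = [y], [x]
                while stack:
                    cy, cx = stack.pop()
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny, nx] and not visited[ny, nx]:
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                            ys.append(ny)
                            xs.append(nx)
                boxes.append((min(xs), min(ys), max(xs), max(ys)))
    return boxes
```

On a synthetic frame with a bright rectangle on a flat background, the transition map fires along the rectangle's left and right edges, and each edge strip becomes one component; real frames would of course need the paper's additional smoothing and filtering steps.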
    Relation: Proceedings of the Fifth Conference on Intelligent Living Technology (Part 1)
    Appears in Collections: [Department of Computer Science and Information Engineering] The Fifth Conference on Intelligent Living Technology

    Files in This Item:

    File: Overlay Text Detection in Complex Video Background.pdf
    Size: 145Kb
    Format: Adobe PDF
    Views: 3381 (View/Open)


    All items in NCUTIR are protected by original copyright.

