Chia Nan University of Pharmacy & Science Institutional Repository:Item 310902800/30868
    Please use this identifier to cite or link to this item: https://ir.cnu.edu.tw/handle/310902800/30868


    Title: A Study on the Relations of Hazard Symbols and Images for Hazard Cognition (危害標示圖像對危害認知之探討)
    Authors: 李淑芬
    Contributors: Department of Occupational Safety and Health (職業安全衛生系)
    鄭世岳
    Keywords: Hazard symbol
    Cognition
    Identification rate
    Date: 2017
    Issue Date: 2018-01-11 11:44:49 (UTC+8)
    Abstract: Both images and words are forms of visual language. How images cross the barriers between written languages to convey warnings of potential hazards and their consequences to viewers, so that viewers correctly identify the hazard, understand it, and make safe decisions, is an important issue in hazard prevention. This study examines the relation between images and the meanings of words, respondents' identification rates for images, and differences in respondents' cognition of hazard symbols. A questionnaire survey was conducted using symbols from the American National Standards Institute (ANSI) Z535 standard. The first-stage questionnaire was measured on a Likert scale; the second-stage questionnaire used a confusion-matching method to assess respondents' identification rates. The data were analyzed with descriptive statistics, t-tests, one-way analysis of variance, and related statistical methods.
The results show that respondents with higher education, male respondents, supervisors, and administrative personnel all showed higher identification and cognition of hazard symbols. Although the ANSI Z535 symbols were designed in the United States, they remained usable by people from different regions; identification rates were high and were not limited by differences in culture, language, or custom. Symbols with high identification rates featured solid, simple figures carrying a clear, single message; depicted the hazardous consequence; and exhibited completeness, unity, closure, simplicity, strong association, ease of learning, conceptual compatibility, recognizable components, and key details or qualifying elements. When selecting or designing symbols, however, the identification and cognitive abilities of the intended user group should be assessed in advance, and criterion testing should be used to screen for images with consistent associations. These findings serve as a reference for the design and application of hazard symbols.
    Relation: Electronic full-text release date: 2019-06-30; academic year: 105; 98 pages
    Appears in Collections:[Dept. of Occupational Safety] Dissertations and Theses

    Files in This Item:

    File: index.html (0Kb, HTML)


    All items in CNU IR are protected by copyright, with all rights reserved.

