    Please use this identifier to cite or link to this item: https://tkuir.lib.tku.edu.tw/dspace/handle/987654321/51792


    Title: 科學邏輯與最大熵原理
    Other Titles: Logic of science and maximum entropy principle
    Authors: 吳俊安;Wu, Chun-an
    Contributors: 淡江大學物理學系碩士班
    曾文哲;Tzeng, Wen-jer
    Keywords: 貝氏定理;最大熵原理;Bayes theorem;Maximum entropy principle
    Date: 2010
    Issue Date: 2010-09-23 16:07:41 (UTC+8)
    Abstract: When we analyze a problem, different analysis methods yield different results. Which method of data analysis, then, is consistent with scientific logic? Bayes' theorem offers a satisfying answer to this question: facing a problem, we build a hypothesis (H) from the available prior information (X), and then test whether the hypothesis holds against data (D) obtained by experiment.
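The update step described above can be sketched in a few lines. This is a minimal illustration of Bayes' theorem, not code from the thesis; the probabilities are invented for the example.

```python
# Bayes' theorem: P(H|D) = P(D|H) P(H) / P(D),
# where P(D) = P(D|H) P(H) + P(D|~H) P(~H).
# All numbers below are illustrative assumptions.

def bayes_update(prior_h, likelihood_d_given_h, likelihood_d_given_not_h):
    """Return the posterior P(H|D) for a binary hypothesis."""
    evidence = (likelihood_d_given_h * prior_h
                + likelihood_d_given_not_h * (1.0 - prior_h))
    return likelihood_d_given_h * prior_h / evidence

# Prior P(H|X) = 0.5; the observed data D is twice as likely if H is true.
posterior = bayes_update(0.5, 0.8, 0.4)
print(posterior)  # 2/3 ≈ 0.667
```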
    However, building a reasonable hypothesis from the known information requires the help of information entropy and the maximum entropy principle. Claude Elwood Shannon (April 30, 1916 – February 24, 2001) introduced the concept of information entropy in 1948 to describe the uncertainty inherent in an event. Using this entropy to determine the most probable behaviour of an event under fixed constraints is the maximum entropy principle, which states explicitly that, when facing a problem, the probability distribution we assign to the outcomes should be the one whose entropy is maximal.
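Shannon's entropy and the fact that the uniform distribution maximizes it (when no constraints beyond normalization are imposed) can be checked directly. A small sketch, with illustrative distributions that are not taken from the thesis:

```python
import math

def shannon_entropy(p):
    """H(p) = -sum_i p_i ln p_i (natural log); zero-probability terms contribute 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

uniform = [0.25] * 4          # maximum uncertainty over 4 outcomes
skewed  = [0.7, 0.1, 0.1, 0.1]

print(shannon_entropy(uniform))  # ln(4) ≈ 1.386, the maximum for 4 outcomes
print(shannon_entropy(skewed))   # strictly smaller
```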
    E. T. Jaynes (July 5, 1922 – April 30, 1998) applied Shannon's information entropy to physical systems. In line with our understanding of the maximum entropy principle, Jaynes argued that, among all distributions satisfying the constraints, the one with the largest information entropy is the distribution we actually observe. This closely resembles the second law of thermodynamics, so he identified the information entropy of information theory with the thermodynamic entropy of physical systems, which lets us discuss problems in statistical mechanics through the changes of a system's entropy.
    The maximum entropy of a statistical-mechanical system can be found with the method of Lagrange multipliers, and the solution of the resulting variational equations can be expressed through a partition function. Treating statistical mechanics in this way is clearer and simpler, and may also offer a reasonable account of non-equilibrium systems.
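Maximizing entropy subject to normalization and a fixed mean energy yields, via a Lagrange multiplier β, the Boltzmann form p_i = exp(-βE_i)/Z with partition function Z. The sketch below (illustrative energy levels, not from the thesis) solves for β numerically by bisection:

```python
import math

def boltzmann(energies, beta):
    """Maximum-entropy distribution at fixed mean energy: p_i = exp(-beta*E_i)/Z."""
    weights = [math.exp(-beta * e) for e in energies]
    Z = sum(weights)                      # partition function
    return [w / Z for w in weights]

def mean_energy(energies, beta):
    p = boltzmann(energies, beta)
    return sum(pi * e for pi, e in zip(p, energies))

energies = [0.0, 1.0, 2.0]   # illustrative energy levels
target = 0.5                 # illustrative mean-energy constraint

# Mean energy is monotonically decreasing in beta, so bisection works.
lo, hi = -50.0, 50.0
for _ in range(100):
    mid = (lo + hi) / 2.0
    if mean_energy(energies, mid) > target:
        lo = mid    # beta too small: raise it to lower the mean energy
    else:
        hi = mid
beta = (lo + hi) / 2.0
print(beta, boltzmann(energies, beta))
```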
    The main purpose of this work is to introduce the logic of science and the maximum entropy principle. Different people may form different opinions about the same event, because everyone reasons subjectively; this must be avoided when we discuss science.
    We discuss Bayes' theorem and logic in the first chapter, showing from the viewpoint of classical probability why Bayes' theorem can help us reason scientifically. In the second and third chapters we introduce the measure of uncertainty, information entropy, and the maximum entropy principle. Because the prior probability distribution in Bayes' theorem is subjective, we need the maximum entropy principle to obtain an objective prior.
    The last chapter discusses another way to obtain an objective prior, called the principle of transformation groups. When we face a problem, the principle of transformation groups tells us to identify the problem's symmetries and to require that the problem be invariant under the corresponding transformations. When a problem is highly symmetric, this is a very easy way to find an objective prior.
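As one standard illustration of the transformation-group idea (my example, not necessarily the one used in the thesis): for a scale parameter, demanding invariance of the prior under rescaling x → a·x singles out the Jeffreys prior f(x) ∝ 1/x. The invariance condition a·f(a·x) = f(x) can be verified directly:

```python
# Scale-invariance check for the Jeffreys prior f(x) = 1/x.
# Invariance under x -> a*x means f(x) dx = f(a*x) d(a*x), i.e. a*f(a*x) == f(x).

def jeffreys(x):
    return 1.0 / x

for a in (0.5, 2.0, 10.0):
    for x in (0.1, 1.0, 7.0):
        assert abs(a * jeffreys(a * x) - jeffreys(x)) < 1e-12

print("scale invariance holds for f(x) = 1/x")
```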
    Discussing logic helps us understand science more clearly. We hope that the maximum entropy principle will achieve great success in statistical mechanics in the future.
    Appears in Collections:[物理學系暨研究所] 學位論文

    Files in This Item:

    File        Size  Format
    index.html  0Kb   HTML

    All items in 機構典藏 are protected by copyright, with all rights reserved.

