Boltzmann entropy for spatial information of images
Hong Kong Polytechnic University Dissertations, Department of Land Surveying and Geo-Informatics

Information theory originated in communications in the 1940s and has found applications in a broad range of disciplines such as biology, chemistry, ecology, neuroscience, and geoscience, leading to a series of new interdisciplinary fields including bio-informatics, chem-informatics, eco-informatics, neuro-informatics, and geo-informatics. In information theory, one of the most fundamental issues is the measurement of information content. So far, the most popular and widely accepted measure is the entropy developed by Claude Shannon in 1948, usually referred to as Shannon entropy. Because this entropy is computed from a probability distribution of the components of a dataset, it is a measure of statistical information and depends only on the composition of the dataset. The applicability of Shannon entropy to a spatial dataset (e.g., an image) is therefore severely limited, because such a dataset contains not only compositional but also configurational information. Both types of information are useful and should be characterized in many applications.

To solve this problem, calls have recently been made for revisiting Boltzmann entropy, which was proposed by Ludwig Boltzmann in 1872 but still remains largely at a conceptual level. Another reason behind these calls is the questioning of whether Shannon entropy is thermodynamically relevant. This project aims to respond to these calls by tackling the computation and thermodynamic consistency issues of Boltzmann entropy with spatial datasets. Special attention has been paid to numerical raster data in general and to images in particular, because such data represent the field-based data model that is common in many disciplines.

In the development of information theory for various applications, a number of efforts have been made to improve Shannon entropy for characterizing spatial information, resulting in a variety of improved Shannon entropies and variants of Shannon entropy. Therefore, this project started with a systematic experimental evaluation of the performance of these entropies and variants as measures of the spatial information of an image. A set of five criteria was proposed, and corresponding testing images were generated. The evaluation results revealed that none of these entropies and variants could satisfy all the criteria and thus serve as a measure of the spatial information of an image. This finding further necessitates tackling the computational issues of Boltzmann entropy with images. Boltzmann entropy has been defined as a function of the number of possible microscopic states (i.e., microstates) for a given macroscopic state (i.e., macrostate) in a thermodynamic system.
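As a minimal illustration (not taken from the dissertation) of why Shannon entropy captures only compositional information, the following Python sketch computes Shannon entropy from an image's pixel-value histogram; the array sizes and the shannon_entropy helper are hypothetical. Because the entropy depends only on the histogram, spatially rearranging the pixels leaves the value unchanged, which is exactly the configurational information the abstract says is lost.

```python
import numpy as np

def shannon_entropy(image: np.ndarray) -> float:
    """Shannon entropy H = -sum(p_i * log2 p_i) over the pixel-value histogram."""
    _, counts = np.unique(image, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(64, 64))                       # hypothetical 8-bit image
shuffled = rng.permutation(image.ravel()).reshape(image.shape)    # same composition, different configuration

print(shannon_entropy(image))     # close to 8 bits for a near-uniform histogram
print(shannon_entropy(shuffled))  # identical value: spatial configuration is ignored
```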