Title
Information entropy and temperature of the binary Markov chains
Authors
Abstract
We propose two different approaches for introducing the information temperature of binary N-th order Markov chains. The first approach is based on comparing the Markov sequences with equilibrium Ising chains at given temperatures. The second approach uses the occurrence probabilities of finite-length subsequences of symbols, which determine their entropies. The derivative of the entropy with respect to the energy gives the information temperature measured on the scale of the introduced energy. For the case of nearest-neighbor spin/symbol interaction, both approaches yield similar results. However, the method based on the correspondence between N-step Markov chains and Ising chains becomes very cumbersome for N > 3. We also introduce the information temperature for weakly correlated one-parameter Markov chains and present results for the step-wise and power-law memory functions. As an application, the developed method is used to obtain the information temperature of some literary texts.
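The second approach can be illustrated for the simplest case, a first-order binary chain with nearest-neighbor symbol interaction. The sketch below is illustrative only and is not the paper's implementation: the function names, the single parameter p (the probability that the next symbol flips), and the energy convention E = -J⟨s_i s_{i+1}⟩ are assumptions. With these choices the entropy per symbol is the binary entropy of p, and the numerical temperature T = dE/dS along this one-parameter family reproduces the familiar 1D Ising relation T = 2J / ln((1-p)/p).

```python
import numpy as np

def markov_chain(p_flip, n, seed=0):
    """Generate a binary +/-1 chain where each new symbol flips the
    previous one with probability p_flip (first-order Markov chain)."""
    rng = np.random.default_rng(seed)
    do_flip = rng.random(n) < p_flip
    s = np.empty(n, dtype=int)
    s[0] = 1
    for i in range(1, n):
        s[i] = -s[i - 1] if do_flip[i] else s[i - 1]
    return s

def entropy_rate(p_flip):
    """Entropy per symbol of the first-order chain: the binary
    entropy of the flip probability (in nats)."""
    p = np.clip(p_flip, 1e-12, 1 - 1e-12)
    return -(p * np.log(p) + (1 - p) * np.log(1 - p))

def mean_energy(p_flip, J=1.0):
    """Assumed nearest-neighbor 'Ising' energy per bond:
    E = -J * <s_i s_{i+1}> = -J * (1 - 2 * p_flip)."""
    return -J * (1.0 - 2.0 * p_flip)

def info_temperature(p_flip, dp=1e-5):
    """Information temperature T = dE/dS, estimated by central
    differences along the one-parameter family of chains."""
    dS = entropy_rate(p_flip + dp) - entropy_rate(p_flip - dp)
    dE = mean_energy(p_flip + dp) - mean_energy(p_flip - dp)
    return dE / dS

# Example: p = 1/4 gives T = 2 / ln(3) ~ 1.82 in units of J.
print(info_temperature(0.25))
```

The derivative is taken along the one-parameter family rather than symbolically, which is also how one would proceed for an empirical sequence whose entropy is estimated from subsequence statistics.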