Markov Processes (a request for advice) | March 24, 2013
Forum guests: 萍踪浪迹, gauge, 季候风
一剑断浪 · Posts: 116 · Energy: 153/153 · Contribution: 502 · Popularity: 284
Markov Processes (a request for advice) [Post type: Original]
I know there are many experts here, so I would like to ask a few questions!
As the title says, they are about Markov processes:
1. In plain terms, what do Markov processes study?
2. What are the applications of Markov processes in finance?
3. What background knowledge is needed to learn about Markov processes?
4. What are the future prospects of the field?
5. What computer skills (i.e., which mathematical/software tools) does one need to learn?
6. What career directions follow from studying Markov processes?
These are the questions I have come up with so far. I only just heard of this specialty while applying for graduate school, so there is a lot I don't know; please help! Since my level is limited, plain-language explanations of the above would be best. Thank you, everyone!
Love despairs in loneliness, and grows strong in despair!
Tom · Posts: 55 · Energy: 218/218 · Contribution: 81 · Popularity: 42
Re: Markov Processes (a request for advice) [Post type: Original]
On question 5: you will probably need to learn MATLAB.
Tom · Posts: 55 · Energy: 218/218 · Contribution: 81 · Popularity: 42
Re: Markov Processes (a request for advice) [Post type: Original]
Work at a bank, become an economist, write stock-trading software, and so on.
kanex · Posts: 447 · Energy: 254/254 · Contribution: 2295 · Popularity: 516
Re: Markov Processes (a request for advice) [Post type: Original]
Stochastic processes. If you fully master XXX's book, Wall Street awaits you.
So it has a good future.
Récoltes et semailles
Omni · Posts: 280 · Energy: 263/263 · Contribution: 4868 · Popularity: 688
Re: Markov Processes [Post type: Original]
1. In plain terms, what do Markov processes study?
A Markov process is a special type of stochastic process. The study of stochastic processes is usually part of university courses on probability theory. For example, I know Fudan has edited a 3-volume textbook "Probability Theory" (高等教育出版社): Volume 1: Fundamentals of Probability Theory; Volume 2: Mathematical Statistics; Volume 3: Stochastic Processes. But after reading many excellent textbooks in English, I threw all 3 volumes away years ago. If you want to learn the basics of stochastic processes, I recommend two English textbooks (official photocopies should be available in Chinese bookstores):

- Sheldon M. Ross, "Introduction to Probability Models" (6th edition, 1997)
- Edward Kao, "An Introduction to Stochastic Processes" (1997)

Briefly, there are two types of Markov processes:
(1) Discrete-time Markov processes, more commonly known as "Markov chains". This type is the more useful of the two.
(2) Continuous-time Markov processes.

2. What are the applications of Markov processes in finance?

Markov chains are highly useful in econometrics and finance because there is a lot of time-series data to model. In fact, the 2003 Nobel prize in economics was awarded for research in the field known as "time series econometrics":
http://nobelprize.org/nobel_prizes/economics/laureates/2003/index.html

3. What background knowledge is needed?

You have to know some fundamentals of probability theory. Markov matrices are also a very interesting topic in introductory linear algebra.

4. What are the future prospects of the field?

The field of Markov processes by itself is already very mature; not many research questions are left from a pure mathematical point of view. The future development of this field should focus on its application to many other scientific disciplines.
So Markov processes, as a part of applied mathematics, have a bright future. Wikipedia summarizes the applications very well: http://en.wikipedia.org/wiki/Markov_chain

Scientific applications

Markovian systems appear extensively in physics, particularly statistical mechanics, whenever probabilities are used to represent unknown or unmodelled details of the system, if it can be assumed that the dynamics are time-invariant and that no relevant history need be considered which is not already included in the state description. Markov chains can also be used to model various processes in queueing theory and statistics.

Claude Shannon's famous 1948 paper "A Mathematical Theory of Communication", which at a single step created the field of information theory, opens by introducing the concept of entropy through Markov modeling of the English language. Such idealised models can capture many of the statistical regularities of systems. Even without describing the full structure of the system perfectly, such signal models can make possible very effective data compression through entropy coding techniques such as arithmetic coding. They also allow effective state estimation and pattern recognition. The world's mobile telephone systems depend on the Viterbi algorithm for error correction, while hidden Markov models (where the Markov transition probabilities are initially unknown and must also be estimated from the data) are extensively used in speech recognition and also in bioinformatics, for instance for coding-region/gene prediction.

The PageRank of a webpage as used by Google is defined by a Markov chain. It is the probability of being at page i in the stationary distribution of the following Markov chain on all (known) webpages. If N is the number of known webpages, and a page i has k_i links, then it has transition probability (1-q)/k_i + q/N to all pages that are linked to, and q/N to all pages that are not linked to. The parameter q is taken to be about 0.15.
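The PageRank rule quoted above translates almost directly into code. Below is a minimal sketch in Python; the four-page link structure is invented for illustration, and `numpy` is assumed to be available:

```python
import numpy as np

# A made-up four-page "web": links[i] = set of pages that page i links to.
links = {0: {1, 2}, 1: {2}, 2: {0}, 3: {0, 1, 2}}
q, N = 0.15, 4

# Transition probabilities as quoted above: (1 - q)/k_i + q/N to pages
# that page i links to, and q/N to all other pages.
P = np.full((N, N), q / N)
for i, outgoing in links.items():
    for j in outgoing:
        P[i, j] += (1 - q) / len(outgoing)

# The PageRank vector is the stationary distribution of this Markov chain,
# found here by simple power iteration.
rank = np.full(N, 1.0 / N)
for _ in range(100):
    rank = rank @ P

print(rank)
```

Because of the q/N "teleportation" term, every row of P is strictly positive, so the chain has a unique stationary distribution and the power iteration converges quickly.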
Markov models have also been used to analyze the web navigation behavior of users. A user's web-link transitions on a particular website can be modeled using first- or second-order Markov models and used to make predictions about future navigation and to personalize the web page for an individual user.

Markov chain methods have also become very important for generating sequences of random numbers that accurately reflect very complicated desired probability distributions, via a process called Markov chain Monte Carlo, or MCMC for short. In recent years this has revolutionised the practicability of Bayesian inference methods.

Markov chains also have many applications in biological modelling, particularly population processes, which are useful in modelling processes that are (at least) analogous to biological populations. The Leslie matrix is one such example, though some of its entries are not probabilities (they may be greater than 1). A recent application of Markov chains is in geostatistics: Markov chains are used in two- or three-dimensional stochastic simulations of discrete variables conditional on observed data. Such an application is called "Markov chain geostatistics", similar to kriging geostatistics. The Markov chain geostatistics method is still in development.

Markov chains can be used to model many games of chance. The children's games Chutes and Ladders and Candy Land, for example, are represented exactly by Markov chains. At each turn, the player starts in a given state (on a given square) and from there has fixed odds of moving to certain other states (squares).

5. What computer skills (i.e., which software tools) are needed?

Edward Kao's textbook requires the use of MATLAB, which is a very powerful programming tool for stochastic-process research.

6. What career directions follow from studying Markov processes?

The answer should be clear from my reply to your question #4. Of course, many people may view the application of Markov chains in finance as the most lucrative career direction.
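The Markov chain Monte Carlo (MCMC) idea mentioned above can be sketched in a few lines: a Metropolis random walk whose stationary distribution is proportional to a target density. Here the target is a standard normal, and all tuning choices (proposal width, chain length) are illustrative:

```python
import math
import random

random.seed(0)

def target(x):
    # Unnormalized standard normal density; the Metropolis rule only ever
    # uses ratios, so the normalizing constant can be dropped.
    return math.exp(-0.5 * x * x)

# Metropolis algorithm: propose a local move, accept with probability
# min(1, target(new) / target(old)).  The resulting Markov chain has the
# target as its stationary distribution.
x = 0.0
samples = []
for _ in range(50_000):
    proposal = x + random.uniform(-1.0, 1.0)
    if random.random() < target(proposal) / target(x):
        x = proposal
    samples.append(x)

mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(mean, var)  # roughly 0 and 1 for a standard normal target
```

Note that the samples are correlated (each depends on the previous state), which is exactly what makes this a Markov chain rather than plain independent sampling.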
duality · Posts: 26 · Energy: 87/87 · Contribution: 55 · Popularity: 29
Re: Markov Processes (a request for advice) [Post type: Original]
>> If you fully master XXX's book, Wall Street awaits you.
Which book do you mean?
学佛学科学 (this author has since left the inn)
Re: Markov Processes (a request for advice) [Post type: Original]
For a while I took a liking to investment theory and even worked through a textbook on my own (over a thousand pages, though I have forgotten most of it by now). I also read, quite carefully, the biographies (and investment philosophies) of at least five top investors, though by now I can only find their material in my notes (heh, forgotten too). One "pattern" I noticed is that the truly first-rate investors rarely use such complicated mathematical tools; for some of them, frankly, it comes down to intuition.
Look at how Buffett, Soros, and Rogers invest: they would never use "Markov processes"; they can't even be bothered with calculus, heh. These past few years Soros seems to have been investing in real estate; I wonder whether he is making money on it?
littlebird · Posts: 863 · Energy: 310/310 · Contribution: 3683 · Popularity: 420
Re: Markov Processes (a request for advice) [Post type: Original]
Everyone has answered well, but it all feels rather abstract. Since I am on the applied side and know a bit about Markov processes, let me offer my own understanding:
A Markov process rests on a very simple idea: the state in "this period" depends only (probabilistically) on "the previous period", and has nothing to do with earlier states; there are no lingering ties to the more distant past. This assumption really is a reasonable abstraction of certain processes in nature.

For example, take brother Omni's daily "mood state": write "1" when his mood is good and "0" when it is not. Day after day this builds up a random sequence of 0s and 1s. To say that this sequence satisfies the Markov property is in effect to say that today's mood depends only on yesterday's.

Note that the link between tomorrow and today is probabilistic, not deterministic. Even if you are happy today, tomorrow's mood may go either way; but the probabilities of "high" and "low" are fixed (these are called transition probabilities). To predict tomorrow's state from today's, a moment's thought shows that four transition probabilities are needed: mood high to low; high to high; low to high; low to low. These four transition probabilities make up the Markov chain's "transition probability matrix".

Sister 一剑断浪 will surely say: enough talk, how do we obtain these four transition probabilities? Since what we usually study are "stationary" Markov chains (look in any textbook: the definition of "stationarity" is a real headache), let us set aside the head-spinning explanations and simply take the four transition probabilities to be constant, i.e., the frequencies (one interpretation of probability) are stable. Then we only need to count, in past data (a historical sample), the relative frequencies with which the four kinds of transitions occurred, and the job is done.

With these transition probabilities we can do some prediction work that brings a small sense of achievement. If today is a happy day, what about tomorrow? Only God knows the actual state, but a human can make a probabilistic forecast: what is the probability of a good mood tomorrow, and of a bad one? Just look it up in the transition probability table (formally, the "transition probability matrix")! To predict the mood the day after tomorrow, "square" the transition matrix; that yields the two-step transition probabilities linking the day after tomorrow to today.

The same method can be used for probabilistic weather forecasting. First assume that the historical sequence of rainy and rain-free days forms a Markov chain (to be academically rigorous, one should also run a test of the Markov property), then estimate from the record the relative frequencies of the transitions (dry day to rainy day, rainy day to rainy day), and use today's state to estimate tomorrow's probability of rain. That is the first-order Markov method of weather forecasting. In practice, however, a daily rainfall sequence may show higher-order dependence, i.e., today's state is related to the past several days (a higher-order Markov chain, a generalized variant), so one should also test the correlation structure, determine the order, and then compute the transition (rainfall) probabilities with a weighting scheme; that is more reasonable. The method needs relatively little historical data and is simple to apply, but its accuracy cannot really be guaranteed!
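The mood example above fits in a few lines of code: estimate the four transition probabilities by counting, then square the matrix for a two-day-ahead forecast. The 0/1 mood history below is invented for illustration, and `numpy` is assumed:

```python
import numpy as np

# Invented 0/1 mood history (1 = good mood, 0 = bad mood).
history = [1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 1, 1, 0, 0, 0, 1, 1, 0, 1, 1]

# Count the four kinds of transitions (high->high, high->low, ...).
counts = np.zeros((2, 2))
for today, tomorrow in zip(history, history[1:]):
    counts[today, tomorrow] += 1

# Normalize each row: P[i, j] = estimated P(tomorrow = j | today = i).
P = counts / counts.sum(axis=1, keepdims=True)

# Tomorrow's forecast given today's state s is simply the row P[s];
# the day after tomorrow is given by the squared matrix.
P2 = P @ P

print("one-step transition matrix:\n", P)
print("two-step transition matrix:\n", P2)
```

Each row of both matrices sums to 1, as a transition probability matrix must.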
Beneath the pines I asked the boy; he said his master had gone gathering herbs.
He is somewhere on this mountain, but the clouds are too deep to tell where. I am ashamed to say: I am that person asking the way.
littlebird · Posts: 863 · Energy: 310/310 · Contribution: 3683 · Popularity: 420
Re: Markov Processes (a request for advice) [Post type: Original]
Markov processes are the Monte Carlo method from physics (don't be scared by the name, it's actually very simple). It has many variants: "semi-Markov processes", "conditional Markov processes", "hidden Markov processes". The one I am interested in now is the "hidden Markov process".
The last sentence in my previous post, "accuracy cannot really be guaranteed", means that first-order Markov chains have the worst accuracy, while higher-order Markov chains are passable!
sage · Posts: 359 · Energy: 334/334 · Contribution: 5130 · Popularity: 237
Re: Markov Processes (a request for advice) [Post type: Original]
>> Markov processes are the Monte Carlo method from physics
No, they are quite different. A Markov chain, as its name says, is a chain: there is correlation between this step and the next. Monte Carlo, in general, refers to random sampling (there are schemes fancier than plain random sampling, but the basic principle is the same).
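The contrast sage draws can be shown side by side: independent Monte Carlo draws versus a chain whose next state depends on the current one. A minimal Python sketch (the sample sizes are arbitrary):

```python
import random

random.seed(0)
n = 100_000

# Plain Monte Carlo: i.i.d. samples.  Here: estimate pi by throwing
# random points at the unit square and counting hits in the quarter disk.
hits = sum(random.random() ** 2 + random.random() ** 2 <= 1.0 for _ in range(n))
pi_estimate = 4.0 * hits / n

# A Markov chain: a simple random walk.  Each state is the previous
# state plus or minus 1, so successive states are strongly correlated,
# unlike the independent draws above.
walk = [0]
for _ in range(n):
    walk.append(walk[-1] + random.choice([-1, 1]))

print(pi_estimate)
```

The dart throws carry no memory at all; the walk is nothing but memory of the previous step. That is precisely the difference between "Monte Carlo" and "Markov".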
Omni · Posts: 280 · Energy: 263/263 · Contribution: 4868 · Popularity: 688
Re: Markov Processes [Post type: Original]
>> Markov processes are the Monte Carlo method from physics (don't be scared by the name, it's actually very simple)
This is a major misunderstanding. The Monte Carlo method is used in numerical simulation to compute highly complex mathematical quantities, e.g., multi-dimensional integrals of probability distributions. There is one special type of Monte Carlo method that uses Markov chains, the so-called MCMC: http://en.wikipedia.org/wiki/Markov_chain_Monte_Carlo

So you cannot equate Markov chains with Monte Carlo methods. The name "Monte Carlo" simply indicates that a random-number generator is involved in some kind of simulation-based sampling. The name "Markov", on the other hand, refers to any probability model exhibiting the Markovian property you described in your earlier reply.

>> It has many variants: "semi-Markov processes", "conditional Markov processes", "hidden Markov processes". The one I am interested in now is the "hidden Markov process".

Hidden Markov models (HMMs) are highly useful in speech recognition and, more recently, in computational biology. The "semi-HMM" is a generalization of the HMM. An HMM can be considered the simplest dynamic Bayesian network.

>> The last sentence in my previous post, "accuracy cannot really be guaranteed", means that first-order Markov chains have the worst accuracy, while higher-order Markov chains are passable!

Ideally it is always desirable to go to higher order, but the problem is that you need far more data to train a high-order Markov chain than a first-order one. For HMMs it is almost impossible to gather enough data to train second- or higher-order models in real applications. Amazingly, first-order HMMs work very well in speech recognition and bioinformatics. There is no such thing as "accuracy" (准确度) in statistical modeling; model performance can only be evaluated in real-life applications. Many scientists worried that the Markov dependence was too much of an oversimplification for biological sequence analysis, but real practice has demonstrated the robustness of HMMs. This reminds me of the story of linear programming: Dantzig himself was completely surprised that his simplex algorithm worked so well in practice, and he admitted that his geometric intuition turned out to be meaningless in high-dimensional spaces.
I would even argue that a first-order HMM may be better than a fifth-order Markov chain. There appear to be many advantages in treating the transition probabilities as unknown and estimating these parameters from the data. So the "H" in HMM turns out to be important for its biological applications.
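To make the HMM discussion concrete, here is a tiny two-state model with invented parameters, scored with the standard forward algorithm (the observation alphabet is {0, 1}; `numpy` is assumed):

```python
import numpy as np

# Invented two-state HMM: hidden states {0, 1}, observations {0, 1}.
pi0 = np.array([0.6, 0.4])            # initial state distribution
A = np.array([[0.7, 0.3],             # hidden-state transition matrix
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],             # B[state, symbol]: emission probs
              [0.2, 0.8]])

def forward_likelihood(obs):
    """P(obs | model), computed with the forward algorithm."""
    alpha = pi0 * B[:, obs[0]]
    for symbol in obs[1:]:
        alpha = (alpha @ A) * B[:, symbol]
    return float(alpha.sum())

print(forward_likelihood([0, 1, 1, 0]))
```

A sanity check on the recursion: summing `forward_likelihood` over all possible observation sequences of a fixed length gives exactly 1, as a probability model must.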
gauge · Posts: 596 · Energy: 375/375 · Contribution: 8310 · Popularity: 1396
Re: Markov Processes (a request for advice) [Post type: Original]
I am not very familiar with probability, so this is just a personal view. First, Monte Carlo is usually just a static mathematical simulation of the real world. Markov processes are different: they study dynamic, evolving processes. Some people compare the place of Markov processes within probability theory to the place of number theory within mathematics, which is very high standing. The level of Markov-process research in China is quite high, and Beijing Normal University is a major stronghold: both of its mathematics academicians, Wang Zikun and Chen Mufa, are probabilists.
Monte Carlo methods and Markov processes are also related. See an article by Chen Mufa; here is a link for anyone interested: http://math.bnu.edu.cn/~chenmf/ . It is part two, "概率与随机算法和算法复杂性" (Probability, Randomized Algorithms, and Algorithmic Complexity), of his popular-science piece 《随机系统的数学问题》 (Mathematical Problems of Random Systems).
一剑断浪 · Posts: 116 · Energy: 153/153 · Contribution: 502 · Popularity: 284
Re: Markov Processes (a request for advice) [Post type: Original]
Thank you all for your help!
Brother Omni, about the book you mentioned ("Fudan has edited 3 volumes of the textbook 'Probability Theory' (高等教育出版社)"): I couldn't find it! To be precise, I found a "Probability Theory" published by Fudan University, but there are so many probability books now that I can't tell whether it is the one you meant. I looked through it, and it doesn't really suit me. I am currently studying basic probability theory and mathematical statistics, and just want to pick up some extra knowledge alongside my courses. I did find Wang Zikun's 《概率论基础及其应用》 (Foundations of Probability Theory and Its Applications).
- Sheldon M. Ross, "Introduction to Probability Models" (6th edition, 1997)
- Edward Kao, "An Introduction to Stochastic Processes" (1997)
I don't have these two books either, but I will look for them when I get the chance; I just hope they aren't too hard! By the way, do you have electronic copies? Today in the library I saw a book called 《测度与概率》 (Measure and Probability). When I was looking up graduate-entrance-exam information, I saw that Beijing Normal University has requirements on measure theory, and we never had that course. Fortunately that book can be studied without first taking real analysis, which is why I was so happy today!
一剑断浪 · Posts: 116 · Energy: 153/153 · Contribution: 502 · Popularity: 284
Re: Markov Processes (a request for advice) [Post type: Original]
>> Both of BNU's mathematics academicians are probabilists: Wang Zikun and Chen Mufa.
Right! They have both made great contributions to research on Markov processes! So I am now really challenging myself (by aiming for graduate study in that direction). And like most people, what worries me most is English!
Omni · Posts: 280 · Energy: 263/263 · Contribution: 4868 · Popularity: 688
Re: Markov Processes [Post type: Original]
>> Monte Carlo methods and Markov processes are also related. See an article by Chen Mufa (http://math.bnu.edu.cn/~chenmf/): part two, "概率与随机算法和算法复杂性", of his popular-science piece 《随机系统的数学问题》.
Prof. Chen's article is well written overall; it should give 一剑断浪 a nice overview. But you need to be cautious when reading this kind of popular-science article: it is always a good idea to keep some reasonable doubt and to confirm things by doing your homework with Google. Not to take anything away from Prof. Chen (his accomplishments are quite respectable, although one of his mentors, Prof. Hou Zhen-Ting, is more famous in the field of Markov processes), but unfortunately I spotted a major error on a very quick skim of the article. If I have time to read it carefully this weekend, I may find more errors of this type.

On page 3, Prof. Chen equates simulated annealing with MCMC, a misunderstanding similar to littlebird's confusion of the two different "MC" acronyms. Although many MCMC methods often go together with a simulated annealing procedure, it is important not to confuse the former with the latter. Simulated annealing is a nonlinear optimization algorithm (as famous as its close competitor, the genetic algorithm) invented by Kirkpatrick et al. in 1983. The method was developed to overcome the convergence problems encountered in many MCMC algorithms (e.g., the Metropolis-Hastings algorithm). To quote from Radford Neal's review article:

"Markov chain sampling methods of all types may fail to converge to their equilibrium distribution within a reasonable time if movement through state space is inhibited by regions of high energy (low probability). The simulation may instead remain in a region of relatively high energy and/or small volume, because movement to a region of lower energy and/or larger volume can take place only via passage through states of very high energy, an event that is unlikely to occur in a simulation run of limited length. This problem can sometimes be overcome using simulated annealing."
Note the word "sometimes" in that paragraph: there is no convergence guarantee for any kind of nonlinear optimization problem (the most famous being the Traveling Salesman Problem). So simulated annealing is merely one algorithm developed to help MCMC methods achieve convergence in some cases. I am very surprised to see this confusion from Prof. Chen, an expert in the field of stochastic processes.
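For readers who have not seen it, simulated annealing itself is easy to sketch: a Metropolis-style chain whose temperature is gradually lowered so that it settles into a low-energy state. The bumpy test function and all tuning constants below are invented for illustration:

```python
import math
import random

random.seed(1)

def energy(x):
    # A bumpy 1-D test function with several local minima.
    return x * x + 10.0 * math.sin(3.0 * x)

x = 5.0                      # deliberately poor starting point
best_x, best_e = x, energy(x)
T = 10.0                     # initial temperature
for _ in range(20_000):
    x_new = x + random.gauss(0.0, 0.5)
    dE = energy(x_new) - energy(x)
    # Metropolis rule: always accept downhill moves; accept uphill moves
    # with probability exp(-dE / T), so barriers can be crossed while T
    # is still high.
    if dE < 0 or random.random() < math.exp(-dE / T):
        x = x_new
        if energy(x) < best_e:
            best_x, best_e = x, energy(x)
    T *= 0.9995              # geometric cooling schedule

print(best_x, best_e)
```

The uphill-acceptance step is exactly what Neal's quoted passage is about: at high temperature the chain can climb out of a poor region, and as T shrinks the walk freezes into a deep well, though with no guarantee it is the deepest one.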
Omni · Posts: 280 · Energy: 263/263 · Contribution: 4868 · Popularity: 688
Re: Markov Processes (a request for advice) [Post type: Original]
>> I couldn't find this book! ... I am currently studying basic probability theory and mathematical statistics, and just want to pick up some extra knowledge alongside my courses. I did find Wang Zikun's 《概率论基础及其应用》.
A Google search for "复旦+概率论+高等教育出版社" turns up the following: http://pgs.ruc.edu.cn/pages/zscksm/2006/2006sm_ndss/160.htm 《概率论》, 复旦大学数学系编, 高等教育出版社, 1991. But as I said, I didn't like this 3-volume set much and threw my copies away. I have never read Prof. Wang Zikun's book, but he is a world-famous mathematician in this area, so I would guess his book is much more readable than Fudan's. Just forget about the Fudan books if you can't find them.
>> By the way, do you have electronic copies?
I only have hard copies of both books. I bought Kao's book in a Shanghai bookstore in 2004, and my copy of the Ross book is a photocopy of a friend's xerox version. You could search for both books on eMule.
littlebird · Posts: 863 · Energy: 310/310 · Contribution: 3683 · Popularity: 420
Re: Markov Processes (a request for advice) [Post type: Original]
After making the two posts above, I noticed the error myself and wanted to add a few words; a difference of a few characters can mean an error of a thousand miles, heh :)
But I had no energy points left (a regrettable rule of this forum), so I had to write the correction in a separate post (placed in the "file depository"): "Markov processes are the Monte Carlo method from physics" should read "Markov processes are the foundation of the Monte Carlo method in physics". I have actually done Monte Carlo work myself; it is very commonly used :) Many thanks to the warm and sincere brother Omni for his exposition of hidden Markov processes :), and thanks also to brothers sage and gauge for their replies! Perhaps later I will open a dedicated thread to discuss this with everyone (including some deeper problems in stochastic processes)! Also, sister 一剑断浪, I suggest that while reading up on stochastic-process theory you first build a more solid foundation in probability. I once read a book called 《统计思想》 (Statistical Thinking), very well written, by two foreign authors; there is a Chinese translation (bookstores here also carry an English book of the same title, but it is not this one). Chen Xiru's books in this area are also quite distinctive; brother gauge has recommended them before, and I mention them again :)
fineall · Posts: 36 · Energy: 108/108 · Contribution: 134 · Popularity: 60
Re: Markov Processes (a request for advice) [Post type: Original]
littlebird, the word "perhaps" is too probabilistic :) I really hope you will write, with certainty, a post like your earlier analysis of linear algebra, introducing your understanding of probabilistic methods. I once heard someone say that the whole secret of mathematics lies in the crack between probability and something-or-other.
I would very much like to see you, or one of the friends above, write a supplementary article grounded in undergraduate probability (for example, explaining in your own way the geometric intuition behind the many abrupt concepts in the Tongji probability textbook). Thanks! Looking forward to a good article.
>> Perhaps later I will open a dedicated thread to discuss this with everyone (including some deeper problems in stochastic processes)!
littlebird · Posts: 863 · Energy: 310/310 · Contribution: 3683 · Popularity: 420
Re: Markov Processes (a request for advice) [Post type: Original]
I have to wait for my energy points to accumulate before I can post here. Sorry for the late reply :)
First, let me recommend to the friend above two "masters" of probability and statistics: brothers Omni and gauge. An article on probability and statistics by them would surely be better than anything I could write; if they write, I will too, heh :). Since brother gauge's interests have shifted, the task falls to brother Omni, and he writes quickly. You surely have not noticed that I only look in on the forum after midnight. Things are hectic (my wife is due next month :( ), so anything I wrote could probably only be delivered after Spring Festival (with so many experts here, a careless write-up would be looked down on, heh), and distant water won't put out a nearby fire. In any case, 《统计思想》 (Statistical Thinking) really is a good book, though not easy to find; if you track it down and read it a few times, your understanding will advance greatly :) Probability, statistics, and stochastic processes are not as eye-catching as topology and differential geometry, but the breadth of their applications is truly astonishing! That's all :)
gauge · Posts: 596 · Energy: 375/375 · Contribution: 8310 · Popularity: 1396
Re: Markov Processes (a request for advice) [Post type: Original]
>> First, let me recommend to the friend above two "masters" of probability and statistics: brothers Omni and gauge. An article on probability and statistics by them would surely be better than anything I could write; if they write, I will too, heh :). Since brother gauge's interests have shifted, the task falls to brother Omni, and he writes quickly.
It seems I must make a declaration: I do not know stochastic processes, so there is no question of my interests having "shifted". But I am sure brother Omni knows this material well.
fineall · Posts: 36 · Energy: 108/108 · Contribution: 134 · Popularity: 60
Re: Markov Processes (a request for advice) [Post type: Original]
The two friends above are too modest. I have not been on this forum long, so I ask everyone here for guidance, in the hope of finding more joy in learning. Brother Omni, if I may be so bold: if you are willing to write up some insights, I would be very grateful. :)
Even better would be for someone to offer their insights and for everyone to supplement and discuss them together. Perhaps that is asking too much, but I genuinely hope to gain a more geometric intuition for probability; otherwise the many concepts that appear out of nowhere are hard to accept, and ordinary textbooks never explain where a method comes from. Thanks. Respectfully, fineall.
一剑断浪 · Posts: 116 · Energy: 153/153 · Contribution: 502 · Popularity: 284
Re: Markov Processes (a request for advice) [Post type: Original]
Thank you all very much!
I have been reading the thread these past two days, just without replying!
>> Also, sister 一剑断浪, I suggest that while reading up on stochastic-process theory you first build a more solid foundation in probability.
We are taking the foundational probability course right now! I will study hard!
一剑断浪 · Posts: 116 · Energy: 153/153 · Contribution: 502 · Popularity: 284
Re: Markov Processes (a request for advice) [Post type: Original]
Really, thank you all!
I especially want to thank brother Omni! You have helped me a great deal, before as well as now, heh! I have printed out all of your replies! Since my English is limited, I have to go and translate them carefully, which is why I reply slowly. But I find this gives me real interest in learning technical English! Thanks again!
Best wishes to all the friends above!