
Expression talk:Markov chain

The definition is clearly inadequate. The Wikipedia article says "A Markov chain describes at successive times the states of a system", but this is not a definition. A better one, from the same source, would be: "discrete-time stochastic process with the Markov property". This is an informal definition that depends on the definitions of other expressions, but at least it is correct. It seems that in mathematics the term is used for different concepts with different degrees of generality, so probably at least two more defined meanings are needed. Andres 15:54, 14 November 2006 (CET)
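For reference, the Markov property invoked by that definition can be written out (a sketch in standard notation; the index set and state labels are illustrative):

    P(X_{n+1} = j | X_n = i_n, X_{n-1} = i_{n-1}, ..., X_0 = i_0) = P(X_{n+1} = j | X_n = i_n)

That is, conditionally on the present state, the next state is independent of the earlier history.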

The term is ambiguous (maybe not in Estonian, but in several languages) because 1) sometimes continuous-time stochastic processes are included; 2) sometimes processes with Markov properties of higher orders are included.

If I understand correctly, the set of possible states of the system is required to be countable in all cases. Andres 16:21, 14 November 2006 (CET)
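To illustrate the countability requirement: a finite set {1, ..., N} and the set of natural numbers are countable, while an interval of real numbers is not; on that reading, a process taking its values in the whole real line would be a Markov process but not a Markov chain.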

When it is clearly wrong, I do not need to read any further, right? GerardM 16:24, 14 November 2006 (CET)
OK, I'm sorry. I'm hesitating about how to put it. And there is another problem of ambiguity (which doesn't prevent us from deciding on the definition of this meaning). Probably at least three DMs should be created at once in order to make it clear that the expression has several different meanings.
I would be glad to have second opinions. These notes are not meant personally for you but for anyone who is interested in this entry.
And for you I would like to say that it is not safe to take the titles of Wikipedia articles, because sometimes they are in the plural and sometimes they express related but not identical concepts. Andres 16:42, 14 November 2006 (CET)
It is indeed not always safe. Typically it goes well. I also do it for things that I expect are quite specific. The good news is that people CAN and DO change what is wrong. GerardM 16:47, 14 November 2006 (CET)
When you want to change the definition and THEN come up with ambiguity, I would not consider it an improvement. You have to appreciate that to understand fully what a Markov chain is, a definition is probably always insufficient. It is for the Wikipedia article to provide the appropriate wordy information. GerardM 16:50, 14 November 2006 (CET)
I strongly disagree; see below. First, encyclopedia entries contain not just definitions but also substantial information beyond definitions. Second, in encyclopedia articles, several concepts may be covered in one article. A dictionary (remember that you promised a terminological dictionary) must include unambiguous definitions and distinguish between the different concepts expressed by one and the same expression. If an expression is used for several concepts, then we should have different DMs; otherwise the dictionary would be unreliable. In an encyclopedia it is possible to cover different but related concepts expressed by one expression in a single article. It seems to me that the idea of this project does not allow us to do that.
You are right that fully understanding what a Markov chain is requires more than just a definition. But it is absolutely necessary to distinguish the concept(s) of Markov chain from other related concepts, and this is what the definitions are supposed to do. Otherwise the dictionary is useless.
It takes a considerable amount of research to get all the definitions and translations right. I cannot do it now.

What do you think, should we write extended definitions here? And should we address the definition to the mathematician or to the tyro? Or should we write several definitions (I think this is best)? Links to Wikipedia articles are no solution because there will be no one-to-one correspondence between Wikipedia and WiktionaryZ entries, Wikipedia articles are not always available, and their content is not under our control. I think we should be self-sufficient as far as definitions go.
I think there should be a way of linking from definitions to DMs. Often the expressions used in definitions are ambiguous, and besides, lesser-known expressions need explanation. Andres 17:13, 14 November 2006 (CET)


I recommend this glossary. Andres 18:08, 14 November 2006 (CET)

Though it's not fully reliable either. Andres 18:10, 14 November 2006 (CET)

I've written a better definition in French. (I am a mathematician.) Kipcool 18:09, 14 November 2006 (CET)
Yes, it's better now. I translated it into English; check whether it's correct. But shouldn't it be specified that the set of possible values of the X_i has to be countable?
From Wikipedia it seems that sometimes, and at least in some languages, "Markov chain" can be meant more generally, including sequences satisfying only higher-order Markov properties, or including continuous-time Markov processes. Andres 18:23, 14 November 2006 (CET)
English looks correct to me, thanks.
From what I understand, the original Markov chains (the ones Markov used) had a finite state space, but then there was an extension to an infinite number of states, and I think "Markov chain" now refers to both. (But I'm not an expert in Markov chains...)
I wouldn't put things like continuous-time Markov processes as an additional DM for Markov chains. Kipcool 21:15, 14 November 2006 (CET)
I mean not a necessarily finite but a possibly infinite yet countable number of states, that is, not as many states as there are real numbers. As far as I could gather from the presentations in the Wikipedias, this seems to be a substantial part of the definition.
As to the other DMs, the title should be "continuous-time Markov process". At least in some languages (German), the translation of "Markov chain" can also mean "continuous-time Markov process". And besides, at least in some languages, the translation of "Markov chain" can also mean "Markov chain of any order" ("by default" it means "first-order Markov chain"; for higher orders, "the future" can depend on "the past" in some limited way, as sketched below). Even so, "Markov chain" in English must have the meaning "Markov chain of any order", because otherwise the expression "second-order Markov chain" would be self-contradictory. I think this DM should have the same title, "Markov chain". Andres 21:28, 14 November 2006 (CET)
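For concreteness, the "limited way" in which "the future" may depend on "the past" can be sketched as follows (standard notation; not a quotation from any of the sources discussed here): a process has the Markov property of order k when

    P(X_{n+1} = j | X_n, X_{n-1}, ..., X_0) = P(X_{n+1} = j | X_n, ..., X_{n-k+1}),

so the next state may depend on the last k states but on nothing earlier; k = 1 gives the ordinary first-order Markov property.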
You're right, it's always a countable number of states. You can specify that in the definition if you like, but I'm not sure it's necessary.
The German term for "continuous-time Markov process" seems to be "zeitkontinuierliche Markov-Ketten". I think that maybe people omit the "zeitkontinuierliche" part when they are lazy, but they are not accurate when doing so, and that shouldn't become a definition in WZ. Kipcool 23:35, 14 November 2006 (CET)
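As a concrete illustration of the discrete-time, countable-state reading discussed above, here is a minimal simulation sketch in Python; the two weather states and the transition probabilities are invented for the example:

    import random

    # A discrete-time Markov chain on a finite (hence countable) state space.
    # The states and probabilities below are purely illustrative.
    transitions = {
        "sunny": {"sunny": 0.8, "rainy": 0.2},
        "rainy": {"sunny": 0.4, "rainy": 0.6},
    }

    def step(state):
        # The next state depends only on the current state:
        # this is exactly the (first-order) Markov property.
        nexts, probs = zip(*transitions[state].items())
        return random.choices(nexts, weights=probs)[0]

    state = "sunny"
    path = [state]
    for _ in range(10):
        state = step(state)
        path.append(state)
    print(path)

A continuous-time Markov process would additionally need random holding times in each state, which is one way to see why the two concepts deserve separate definitions.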
I think that it's not laziness. As with many other mathematical terms, the same term is used for related concepts with different degrees of generality, depending on the topic of the publication. That is why mathematical publications define almost all of their terms, to avoid ambiguity. For instance, "ring" in algebra can mean "ring without any requirement of associativity or commutativity", "associative ring", "associative and commutative ring", or "associative and commutative ring with a unit element". The first definition is used in universal algebra; the last is usually used in commutative algebra. You cannot say that one of them is correct and the others are used out of laziness. I think we need different DMs for them (though they should probably have different titles).
In the five-volume Russian mathematical encyclopedia (the biggest one there is, as far as I know), Markov chains are defined in two different ways. The article "Markov chain" begins: "Markov process with a finite or countable set of states." Under this definition Markov chains can be continuous in time! The article "Markov process" is written by another author, who writes: "A Markov process for which T [the set of time moments] is contained in the set of natural numbers is called a Markov chain (by the way, the last term is mostly associated with the case of an E [the state space] not greater than countable). If T is an interval in the set of real numbers and E is not greater than countable, then the Markov process is called a continuous-time Markov chain."
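In symbols, the second author's distinction amounts to this (a condensed restatement of the quoted wording):

    T contained in {1, 2, 3, ...} and E at most countable: Markov chain (discrete time)
    T an interval of real numbers and E at most countable: continuous-time Markov chain

whereas under the first article's definition the condition on T is dropped altogether.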
It seems to me that we should collect all the concepts associated with one term and have a separate DM for each concept, with all the expressions that can express that concept. Of course, the title should be chosen to be as characteristic and unambiguous as possible. I think this approach would make our dictionary really useful, more useful than any other dictionary.
People who don't know mathematics think that in mathematics all terms have fixed meanings. But in reality, meanings in mathematics are less fixed than elsewhere, because mathematicians define their terms anew in each publication. Andres 07:41, 15 November 2006 (CET)
OK, I'm lost. Do what you think is right, so that I can get a better idea of what you mean (it's easier to think with examples). Thanks. Kipcool 12:31, 15 November 2006 (CET)
OK but later. Andres 12:47, 15 November 2006 (CET)