meaning of Markoff chain
1. a Markov process for which the parameter takes discrete time values
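
For illustration, in one standard notation (the symbols X_n, i, and j are not part of the entry itself; X_n denotes the state at discrete step n = 0, 1, 2, ...), the defining Markov property can be written as

P(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \ldots, X_0 = i_0) = P(X_{n+1} = j \mid X_n = i)

that is, the next state depends only on the present state, not on the earlier history.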