Rescom TP instructions
Revision of 27 June 2019 at 21:27
Instructions for the Lab on Markov chain modeling and MDP analysis, at the RESCOM2019 summer school
Objective
The goals of the Lab session are to:
- program the model of a discrete-time queue with impatience, batch arrivals and finite capacity, using the marmoteCore library;
- program the same model with admission control in service, using the marmoteMDP library;
- compute the optimal service policy for this queue.
Steps
Preparation
The first step is to have the library installed on your computer. Two possibilities:
- using a virtual machine with VirtualBox
- download the VM and the instructions from this page
- copy it from a USB drive, available at the conference
- using the compiled library (Linux only)
- tarball and instructions from marmoteCore's site
The instructions with VirtualBox are then:
- install the VirtualBox software from its web site
- launch VirtualBox
- click on "Machine > Add"
- enter the location of the virtual machine that has been downloaded
- select the VM (Rescom2019_TP) in the right-hand pane and click on "Start"
- log in with username/password pierre/Rescom2019*
- you should see a desktop with two folders: TP_Marmote and TP_MDP
Instructions for building Markov Chains
Testing the example provided
- click on the TP_Marmote folder
- click on the file "example1.cpp" (or right-click then select "geany")
- a command-line terminal should appear at the bottom. Type
./example1
Example 1: construction of a discrete-time Markov chain on a 3-state space.
- the program takes as arguments:
- n, the number of steps
- p1 p2 p3, three probabilities summing to 1, representing the initial distribution
- it outputs
- the probability transition matrix
- a trajectory x[0], x[1], ... x[n]
- run the example with values, e.g.
./example1 4 0.2 0.3 0.5
- use the editor to modify the code in example1.cpp, so as to make state 2 absorbing
- compile by clicking "Construire > Make"
- execute again
- modify the code further to make it compute the distribution after n steps:
Distribution* trDis = c1->TransientDistributionDT( 0, n );
trDis->Write( stdout, STANDARD_PRINT_MODE );
- compile and execute