Rescom TP instructions
Instructions for the lab on Markov chain modeling and MDP analysis, at the RESCOM2019 summer school.
Objective
The goals of the lab session are to:
- program a model of a discrete-time queue with impatience, batch arrivals and finite capacity, using the marmoteCore library (a plain-C++ sketch of such a transition matrix follows this list);
- program the same model with a control of admission to service, using the marmoteMDP library;
- compute the optimal service policy in this queue.
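
To fix ideas, here is a minimal plain-C++ sketch of the kind of transition matrix the first goal calls for. It does not use marmoteCore (the lab builds the chain with the library's own classes), and the capacity K, batch size B, and the arrival, service and impatience probabilities are illustrative assumptions, not values from the lab statement.

 #include <algorithm>
 #include <cstdio>
 #include <vector>
 
 int main() {
   // Illustrative parameters -- NOT taken from the lab statement.
   const int K = 10;          // finite capacity of the queue
   const int B = 3;           // size of an arriving batch
   const double p = 0.3;      // P(a batch arrives during a slot)
   const double q = 0.5;      // P(the customer in service completes)
   const double alpha = 0.1;  // P(one waiting customer abandons: impatience)
 
   // P[n][m] = probability of going from n to m customers in one time slot.
   std::vector<std::vector<double>> P(K + 1, std::vector<double>(K + 1, 0.0));
 
   for (int n = 0; n <= K; ++n) {
     // Enumerate the 8 joint events: batch arrival yes/no,
     // service completion yes/no, impatient departure yes/no.
     for (int a = 0; a <= 1; ++a)
       for (int s = 0; s <= 1; ++s)
         for (int d = 0; d <= 1; ++d) {
           double prob = (a ? p : 1 - p) * (s ? q : 1 - q)
                         * (d ? alpha : 1 - alpha);
           int m = std::min(n + (a ? B : 0), K);  // arrivals beyond K are lost
           if (s && m > 0) --m;                   // one service completion
           if (d && m > 0) --m;                   // one abandonment
           P[n][m] += prob;
         }
   }
 
   // Each row of P sums to 1 by construction.
   for (int n = 0; n <= K; ++n) {
     for (int m = 0; m <= K; ++m) std::printf("%6.4f ", P[n][m]);
     std::printf("\n");
   }
   return 0;
 }

Resolving arrivals before departures within a slot is just one possible convention; the lab's model may order these events differently.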
Steps
1. Preparation
The first step is to have the library installed on your computer. There are two possibilities:
- using a virtual machine with VirtualBox:
  - download the VM and instructions from this page;
  - or copy it from a USB drive, available at the conference.
- using the compiled library (Linux only):
  - download the tarball and instructions from marmoteCore's site.
The instructions for VirtualBox are then:
- install the VirtualBox software from its web site;
- launch VirtualBox;
- click on "Machine > Add";
- enter the location of the virtual machine that has been downloaded;
- select the VM (Rescom2019_TP) in the right-hand pane and click on "Start";
- log in with the username/password pierre/Rescom2019*;
- you should see a desktop with two folders: TP_Marmote and TP_MDP.
2. Instructions for building Markov Chains
- click on the TP_Marmote folder;
- click on the file "example1.cpp" (or right-click on it, then select "Geany");
- a command-line terminal should appear at the bottom.
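As a rough idea of what a first Markov chain example typically computes, here is a self-contained plain-C++ sketch that approximates the stationary distribution of a small discrete-time chain by power iteration. The 3-state transition matrix is made up for illustration, and the hand-rolled iteration stands in for what marmoteCore provides through its own classes; nothing below reproduces the actual contents of example1.cpp.

 #include <cstdio>
 #include <vector>
 
 int main() {
   // A made-up 3-state discrete-time Markov chain.
   const int N = 3;
   const double P[3][3] = {
       {0.2, 0.5, 0.3},
       {0.4, 0.4, 0.2},
       {0.1, 0.3, 0.6},
   };
 
   // Power iteration: repeatedly apply pi <- pi * P, starting from
   // the uniform distribution, until the vector stabilizes.
   std::vector<double> pi(N, 1.0 / N), next(N);
   for (int it = 0; it < 1000; ++it) {
     for (int j = 0; j < N; ++j) {
       next[j] = 0.0;
       for (int i = 0; i < N; ++i) next[j] += pi[i] * P[i][j];
     }
     pi.swap(next);
   }
 
   for (int j = 0; j < N; ++j) std::printf("pi[%d] = %.6f\n", j, pi[j]);
   return 0;
 }

A standalone sketch like this compiles with a plain g++; the lab's examples, by contrast, are compiled and linked against the marmoteCore library, presumably through build commands already configured in the VM.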