Government policy decisions commit considerable amounts of public money. Governments therefore need to secure the intended results for the lowest public outlay while satisfying the agricultural constituency. Policy analysis procedures therefore need to be practical and effective, providing evidence-based justifications for policy decisions, including risk analysis. Under such circumstances the best approach to training is learning by doing, which embeds analytical procedures clearly in the minds of trainees so that they become part of the individual’s capabilities in the form of “know how and why” as opposed to a more theoretical “awareness” of the procedure.
4P is an ideal instrument for training because it can be used to take attendees through the policy procedure design process using DRMs consisting of:
4P can be run:
By taking attendees through the “model building” process they gain a full understanding of what the procedure can accomplish, its limitations, and when such a procedure should be applied in the policy management cycle. The model runs in a design studio and can be posted online with security access control. The models can be accessed at any time by any authorised person using a browser or thin client on a mobile, tablet or laptop.
Trainees can “play with the model” and carry out calculations or simulations. Because the system remains online, training is not limited to training sessions. Trainees can reflect on the session content and, when a need for clarification arises, access the model to test specific circumstances at any time convenient to them. In this way they can evaluate the “robustness” of the procedure from perspectives they feel are relevant. This learning by doing fixes the process clearly in the mind of trainees, and the procedural element becomes part of the individual’s “know how and why” as opposed to a more theoretical “awareness” of the procedure.
Procedures are normally evaluated from three standpoints:
The model normally runs as a simulation using existing data and assumed probabilities of events such as bad or good weather. The output of a simulation is assessed in terms of the likelihood of it being a good representation of expected outcomes; alternatively, historic data can be used to test the model. Data used can be saved for later use. Outputs include graphic, tabular and report formats which can be printed for review.
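The kind of simulation described above can be sketched as a simple Monte Carlo run: assumed probabilities are applied to event outcomes, and the averaged result is compared against expectations or historic data. The probabilities, yield figures and function names below are illustrative assumptions for the sketch, not part of the 4P tool itself.

```python
import random

def simulate_yields(p_good_weather, yield_good, yield_bad, n_runs=10_000, seed=42):
    """Monte Carlo sketch: draw weather outcomes and average the resulting yields.

    All parameters are hypothetical illustrations, not 4P inputs.
    """
    rng = random.Random(seed)  # fixed seed so a training run is repeatable
    total = 0.0
    for _ in range(n_runs):
        # Each run draws one weather outcome from the assumed probability.
        weather_is_good = rng.random() < p_good_weather
        total += yield_good if weather_is_good else yield_bad
    return total / n_runs  # expected yield under the assumed probabilities

# Illustrative scenario: 70% chance of good weather, yields of 3.0 vs 1.5 t/ha.
# The simulated mean should sit near 0.7 * 3.0 + 0.3 * 1.5 = 2.55 t/ha.
expected_yield = simulate_yields(0.7, 3.0, 1.5)
```

A trainee could rerun this with different assumed probabilities to see how sensitive the expected output is to the weather assumption, which is exactly the "robustness" testing the training aims to encourage.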