Book 3. Operational Risk
FRM Part 2
OR 15. Supervisory Guidance on Model Risk Management

Presented by: Sudhanshu
Module 1. Model Risk Management
Module 2. Model Validation Process
Module 1. Model Risk Management
Topic 1. Model Risk – Definition and Sources
Topic 2. Effective Model Risk Management Elements
Topic 3. Best Practices for Model Development and Implementation
Topic 1. Model Risk – Definition and Sources
Model Definition & Components
- Models combine statistics, economics, finance, and mathematics with assumptions to transform input data into quantitative outputs
- Three distinct components: (1) an information input component that handles data and assumptions, (2) a processing component that converts inputs into estimates, and (3) a reporting component that translates estimates into useful, applied information
- Models attempt to mimic real life but use simplifying assumptions due to cost-benefit constraints
- Models will never be completely accurate and model risk will always be present
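As a purely illustrative sketch, the three components can be mapped onto a toy expected-loss calculation (all names, figures, and the single-LGD assumption below are hypothetical):

```python
# Toy illustration of the three model components: inputs, processing, reporting.
# The expected-loss model, names, and figures are all hypothetical.

def information_inputs(raw):
    """Input component: validate data and apply assumptions."""
    assert all(0.0 <= pd <= 1.0 for pd in raw["pd"]), "PDs must be probabilities"
    # Simplifying assumption: one loss-given-default rate for every exposure.
    return {"pd": raw["pd"], "exposure": raw["exposure"], "lgd": 0.45}

def processing(inputs):
    """Processing component: convert inputs into an estimate."""
    return sum(p * e * inputs["lgd"]
               for p, e in zip(inputs["pd"], inputs["exposure"]))

def reporting(estimate, limit):
    """Reporting component: turn the estimate into applied information."""
    return {"expected_loss": round(estimate, 2),
            "within_limit": estimate <= limit}

raw = {"pd": [0.02, 0.05], "exposure": [1_000_000, 500_000]}
report = reporting(processing(information_inputs(raw)), limit=50_000)
print(report)  # expected loss of 20,250 is within the 50,000 limit
```

The simplifying single-LGD assumption is exactly the kind of cost-benefit shortcut the text describes: it keeps the model tractable but guarantees some inaccuracy, so model risk is always present.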
Model Risk Definition
- Model risk creates possibility of negative outcomes from poor decisions based on inaccurate model outputs
- Outcomes include financial and reputational losses
Two Sources of Model Risk
- Significant model errors producing faulty outputs: Errors can occur from initial design to final implementation; simplifying modifications may sacrifice accuracy; "garbage in, garbage out" principle applies to input and assumption quality
- Improper model usage: Models used incorrectly, not for intended purpose, or out of context; previously appropriate models may become inappropriate for new products or changed market environments; crucial for users to understand original model limitations and purposes
Practice Questions: Q1
Q1. Which component of a model deals with converting estimates into useful or applied information?
A. Information inputs.
B. Processing.
C. Reporting.
D. Transformational.
Practice Questions: Q1 Answer
Explanation: C is correct.
The reporting component essentially transforms estimates into useful business information. In contrast, the processing component transforms inputs into estimates.
Topic 2. Effective Model Risk Management Elements
Risk Identification & Quantification
- Model risk depends on model complexity, input/assumption uncertainty, and potential impact on users
- Risk analysis must cover both stand-alone models and the aggregate exposure from interconnected models

Effective Challenge Framework
- Requires in-depth, third-party evaluation by technically skilled individuals who are independent from model development
- Challengers must identify model weaknesses and assumptions while proposing effective solutions
- A robust follow-through process is needed, with upper management support and authority

Additional Risk Management Methods
- Set restrictions on model usage and analyze ongoing model performance
- Continuously calibrate and improve models based on performance data
- Place model outputs in the context of other relevant information as a reasonableness check

Risk-Based Approach
- The level of risk management should align with the model's potential impact (materiality)
- Models with a significant impact on the firm's profits or financial position require more detailed and comprehensive risk management processes
Practice Questions: Q2
Q2. Which of the following statements regarding model risk is correct?
A. Shortcuts and simplifications will increase model risk.
B. Managing model risk requires proper segregation of duties.
C. With the appropriate procedures and tools, model risk can be eliminated.
D. Like many other risks, model risk has both an upside and a downside component.
Practice Questions: Q2 Answer
Explanation: B is correct.
A proper “effective challenge” of models is a key part of managing model risk and would require proper segregation of duties. In that regard, the model development process and the critical review of the model must be done by different parties in order to maintain proper independence and objectivity.
Topic 3. Best Practices for Model Development and Implementation
- Clear Objective & Foundation: Model development must begin with a clearly stated objective aligned with the intended use, supported by thorough documentation of background information, strengths, weaknesses, and technically correct mathematical theory
- Data Quality & Documentation: Ensure robust and relevant data and assumptions, with clear documentation of any data proxies, non-representative data, or adjustments from external sources so that users can assess them
- Comprehensive Testing: Conduct rigorous testing to assess model precision, strength, and consistency across reasonable input ranges, identifying potential weaknesses and the conditions under which the model fails
- Stress Testing Requirements: Test models under both normal and extreme (stressed) market conditions, including unusual but plausible scenarios, to validate performance under a wide variety of circumstances
- Model Interconnection Analysis: When model outputs serve as inputs to other models, analyze those downstream models and document the interconnected relationships and dependencies
- Multiple Testing Methods: Use multiple tests rather than relying on a single test to avoid Type I and Type II errors, which is particularly important in quantitative contexts involving sampling
- Qualitative Integration & Controls: Balance quantitative outputs with subjective, non-quantitative elements where relevant, ensuring a logical basis and clear documentation, supported by robust internal control systems
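The normal-versus-stressed testing practice can be sketched as follows, using a hypothetical zero-coupon bond pricer as the model under test (the function, rates, and reasonableness bounds are all illustrative):

```python
# Sketch of testing a model under normal and stressed input scenarios.
# The valuation function and scenario values are hypothetical.

def bond_price(face, rate, years):
    """Toy zero-coupon bond pricer: the 'model' under test."""
    return face / (1 + rate) ** years

scenarios = {
    "normal":   {"rate": 0.03},
    "stressed": {"rate": 0.15},   # unusual but plausible extreme
}

results = {}
for name, s in scenarios.items():
    price = bond_price(1000, s["rate"], years=5)
    # A basic reasonableness check: price stays within economic bounds.
    results[name] = {"price": round(price, 2),
                     "sane": 0 < price <= 1000}

print(results)
```

Even this toy harness captures the key point: the same reasonableness check is applied under normal and extreme conditions, so a model that only behaves sensibly in calm markets is flagged.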
Practice Questions: Q3
Q3. Which of the following items is least likely to be a key consideration when testing models?
A. Testing for potential weaknesses.
B. Testing with extreme values as inputs.
C. Testing under normal market conditions.
D. Testing other models that rely on the subject model.
Practice Questions: Q3 Answer
Explanation: C is correct.
Although testing is done under normal market conditions, a more accurate statement would be that testing is done under a wide variety of market conditions, including those that are unusual or extreme. Testing for potential weaknesses, using extreme values as inputs, and testing other models that use the outputs of the subject model as inputs are all important considerations when testing models.
Module 2. Model Validation Process
Topic 1. Model Validation Basics
Topic 2. Evaluation of Conceptual Soundness
Topic 3. Ongoing Monitoring
Topic 4. Outcomes Analysis
Topic 5. Vendor and Third-Party Model Risk
Topic 1. Model Validation Basics
- Model validation involves a series of steps to ensure that models are achieving their intended purpose.
- Segregation of duties is required: the individuals who develop a model should generally not be the same ones who validate it.
- Some exceptions may apply in highly technical or specialized areas, but in those instances there must be a rigorous and objective review of such validation.
- Three elements of a strong validation process are:
  - Evaluation of conceptual soundness,
  - Ongoing monitoring, and
  - Outcomes analysis.
Topic 2. Evaluation of Conceptual Soundness
- Definition: An overall quality check on the model's construction and design
- Validation steps:
  - Examine model documentation and live test results
  - Review assumptions, computation methods, and inputs
  - Evaluate data representativeness
- Sensitivity analysis:
  - Assess output changes due to input variations
  - Test single and multiple input changes
  - Conduct stress testing under extreme scenarios
- Validate the non-quantitative aspects of model development
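A minimal sketch of the sensitivity analysis described above, assuming a hypothetical loan-loss model (the model, input names, and 10% bump sizes are all illustrative):

```python
# Sketch of single- and multi-input sensitivity analysis on a model output.
# The model (a simple loan-loss estimate) and bump sizes are hypothetical.

def model(pd, lgd, exposure):
    return pd * lgd * exposure

base = {"pd": 0.04, "lgd": 0.40, "exposure": 1_000_000}
base_out = model(**base)

# Single-input sensitivity: bump one input at a time by +10%.
single = {}
for key in ("pd", "lgd"):
    bumped = dict(base, **{key: base[key] * 1.10})
    single[key] = model(**bumped) - base_out

# Multiple-input sensitivity: bump pd and lgd together by +10%.
joint_inputs = dict(base, pd=base["pd"] * 1.10, lgd=base["lgd"] * 1.10)
joint = model(**joint_inputs) - base_out

print(base_out, single, joint)
```

Note that the joint bump moves the output by more than the sum of the two single-input bumps, which is why testing multiple input changes together can reveal interaction effects that single-input tests miss.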
Topic 3. Ongoing Monitoring
- Continuous Monitoring Framework: Implement frequent ongoing monitoring to determine if models need changes, developments, or replacement based on model purpose and risk exposures.
- Process Verification Controls: Ensure data inputs remain error-free and complete while maintaining internal controls for authorized code changes and audit documentation.
- System Integration Management: Monitor model processing across data sources and address model risk from user-developed spreadsheets, updating systems as data changes.
- Testing and Sensitivity Analysis: Apply the same development testing methods and sensitivity analysis techniques to ongoing monitoring for continued performance validation.
- Override Analysis: Review all model overrides with documentation and investigate frequent overrides as indicators requiring model amendments or overhauls.
- Benchmarking and Variance Assessment: Compare model inputs/outputs with benchmarks, evaluating whether variances fall within acceptable ranges while recognizing methodology differences.
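Benchmarking with a variance tolerance can be sketched as follows (the output names, values, and 10% tolerance are hypothetical):

```python
# Sketch of benchmarking: compare model outputs to a benchmark model and
# flag variances outside an acceptable tolerance. All values are made up.

model_outputs     = {"VaR": 1.20, "expected_loss": 0.85}
benchmark_outputs = {"VaR": 1.15, "expected_loss": 1.10}
tolerance = 0.10  # acceptable relative variance (10%)

variances = {
    k: abs(model_outputs[k] - benchmark_outputs[k]) / benchmark_outputs[k]
    for k in model_outputs
}
flagged = sorted(k for k, v in variances.items() if v > tolerance)
print(variances, flagged)
```

A flagged variance is not automatically an error: as the bullet notes, the benchmark may simply use a different methodology, so flagged items are investigated rather than treated as failures.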
Practice Questions: Q4
Q4. Comparing a model’s inputs and outputs to estimates from other data or models is best described as:
A. benchmarking.
B. ongoing monitoring.
C. process verification.
D. sensitivity analysis.
Practice Questions: Q4 Answer
Explanation: A is correct.
Ongoing monitoring is the general term used to describe various activities that constitute the second key element of the validation process. Those activities include benchmarking, process verification, and sensitivity analysis. However, benchmarking is the specific term for comparing a model's inputs and outputs to relevant estimates from other data or models.
Topic 4. Outcomes Analysis
- Performance Assessment: The validation process must examine model outputs against actual outcomes to determine precision levels and assess model performance, requiring model revision when outcomes analysis reveals weaknesses.
- Testing Methodology Selection: Testing methods (both quantitative and qualitative, including expert judgment) should be based on model nature, complexity, data availability, and firm-level model risk, with multiple tests required due to single-test limitations.
- Parallel Outcomes Analysis: Side-by-side examination of original and amended model estimates against actual outcomes determines if amendments provide meaningful improvement before full model replacement.
- Backtesting Requirements: Analysis of variances between actual results and model estimates using different time periods than model development, with confidence intervals established to identify and monitor significant outliers for model specification errors.
- Complex Implementation Challenges: Understanding backtesting results requires reviewing multiple forecasts across various economic environments and time periods, with difficulty in selecting appropriate statistical tests and interpreting results.
- Long-Term Forecasting Solutions: Long forecasting periods require additional short-period testing and "early warning" signals for timely performance measurement, with major problems necessitating amendments, reconstruction, and re-validation before implementation.
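The backtesting step can be sketched as follows: compare model estimates with actual outcomes on a period separate from model development, and flag variances outside a confidence band (all data are fabricated for illustration, and the normal-approximation band is an assumption):

```python
# Sketch of backtesting: compare model estimates with actual outcomes on a
# holdout period and flag outliers outside a confidence band. Data are made up.
import statistics

estimates = [102, 98, 105, 110, 97, 101, 99, 104]
actuals   = [100, 99, 103, 140, 96, 102, 98, 105]   # one large miss

errors = [a - e for a, e in zip(actuals, estimates)]
mu = statistics.mean(errors)
sd = statistics.stdev(errors)

# Rough ~95% band under a normality assumption: mean +/- 1.96 * stdev.
lower, upper = mu - 1.96 * sd, mu + 1.96 * sd
outliers = [i for i, err in enumerate(errors) if not lower <= err <= upper]

print(f"mean error {mu:.2f}, band ({lower:.2f}, {upper:.2f}), outliers {outliers}")
```

Significant outliers like the one flagged here are the trigger the text describes: they are investigated as possible model specification errors, and persistent problems lead to amendment, reconstruction, and re-validation.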
Practice Questions: Q5
Q5. Backtesting is most appropriately classified in which element of the validation process?
A. Evaluation of conceptual soundness.
B. Ongoing monitoring.
C. Outcomes analysis.
D. Postvalidation review.
Practice Questions: Q5 Answer
Explanation: C is correct.
Backtesting is a specific type of outcomes analysis.
Topic 5. Vendor and Third-Party Model Risk
- Vendor Model Complexity: Vendor products ranging from data to full models make validation more challenging due to external modeling activities and potential confidential components requiring modified validation approaches.
- Vendor Documentation Requirements: Vendors must provide detailed model construction information and testing results to demonstrate model appropriateness and effectiveness for the user's specific needs.
- Limited Access Validation Focus: When external models restrict access to coding and implementation details, validation processes must emphasize sensitivity analysis and benchmarking approaches.
- Modification Documentation: Any vendor model modifications to suit institutional needs must be clearly noted and explained during the validation process.
- Internal Control Knowledge: Institutions require in-depth knowledge of vendor products from an internal controls perspective to ensure proper oversight and risk management.
By Prateek Yadav