# ISRA Questionnaire

Enter the **Information Security Risk Assessment (ISRA) Questionnaire**: a structured tool designed to evaluate risks, verify controls, and recommend mitigations for IT projects.

This enhanced guide expands on the original ISRA Questionnaire by:

* **Streamlining the structure**: Organized into core domains (e.g., Confidentiality, Integrity) with clear question sets.
* **Adding practical value**: Includes scoring guidance, risk heat maps, and real-world examples tied to common project changes (see [ISRA Project Change Scope Definition](https://calvin-lai.gitbook.io/calvin-lai-security/risk-management/it-risk-and-control-library-policy-and-procedure/itra-project-change-scope-definition)).
* **Integration tips**: How to link responses to controls from frameworks like NIST, ISO 27001, or your organization's IT Risk and Control Library.
* **Templates and tools**: Downloadable checklist and Excel scoring sheet (hypothetical link: [ISRA Toolkit](https://calvin-lai.gitbook.io/calvin-lai-security/resources/itra-toolkit)).

**When to Use**: Trigger the ISRA for any project matching the [17 change scopes](https://calvin-lai.gitbook.io/calvin-lai-security/risk-management/it-risk-and-control-library-policy-and-procedure/itra-project-change-scope-definition), such as new cloud adoptions or code updates. Aim to complete it during project planning, with input from IT, security, and business stakeholders.

**Expected Outcomes**:

* **Risk Score**: Overall rating (Low/Medium/High) derived from quantitative domain scores, used to prioritize projects.
* **Control Gaps**: Actionable recommendations.
* **Approval Gate**: Sign-off before go-live.

***

### Questionnaire Structure

The ISRA is divided into **5 key domains**, each with 5-7 targeted questions. For each question:

* **Rate the Control**: Use a 1-5 scale (1 = No Control, 5 = Fully Effective).
* **Evidence Required**: List docs/tests (e.g., logs, policies).
* **Risk Impact**: Assess both inherent risk (before controls) and residual risk (after controls); a tracking sketch follows this list.
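
If you capture responses in a script rather than a spreadsheet, a minimal sketch of one response record in Python might look like this (field names are illustrative assumptions, not prescribed by the questionnaire):

```python
from dataclasses import dataclass, field

@dataclass
class QuestionResponse:
    """One answered ISRA question."""
    question_id: str          # e.g. "1.2"
    rating: int               # 1 = No Control ... 5 = Fully Effective
    evidence: list[str] = field(default_factory=list)  # e.g. ["TLS cert scan"]
    inherent_risk: str = "High"   # risk before controls
    residual_risk: str = "Low"    # risk after controls

    def __post_init__(self) -> None:
        if not 1 <= self.rating <= 5:
            raise ValueError("rating must be on the 1-5 scale")
```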

#### Domain 1: Confidentiality (Protecting Data from Unauthorized Disclosure)

Focus: Encryption, access restrictions. High relevance for changes like new databases (#8) or data flows (#16).

| Question                                                                              | Guidance                                                                  | Scoring Notes                                  |
| ------------------------------------------------------------------------------------- | ------------------------------------------------------------------------- | ---------------------------------------------- |
| 1.1 Is sensitive data classified (e.g., PII, financials) and handled per policy?      | Map to your data classification scheme.                                   | Low score if unclassified—triggers DLP review. |
| 1.2 Are encryption controls (at rest/in transit) implemented for new/changed data paths? | Check TLS 1.3+ in transit (spot check sketched below this table); AES-256 at rest. Example: API changes (#10). | Evidence: Cert scans or config audits. |
| 1.3 Does the change introduce third-party data sharing? If yes, is a DPA in place?    | Review vendor contracts for GDPR/CCPA compliance.                         | High risk if no—recommend legal review.        |
| 1.4 Are logging and monitoring in place to detect unauthorized access? | Integrate with SIEM; retain logs 90+ days. | Test with simulated breach. |
| 1.5 For remote access (#17), is data masked or tokenized?                             | Use zero-trust tools like ZTNA.                                           | Residual risk low only with MFA + encryption.  |
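
To gather evidence for question 1.2, the in-transit half is easy to spot-check; here is a minimal sketch using Python's standard library (the host name is a placeholder):

```python
import socket
import ssl

def negotiated_tls_version(host: str, port: int = 443) -> str:
    """Connect to host:port and report the TLS version actually negotiated."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.version()  # e.g. "TLSv1.3"

print(negotiated_tls_version("example.com"))  # flag anything below TLSv1.3
```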

**Domain Score Calculation**: Average ratings. Threshold: <3 = Medium Risk → Escalate.
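
As a worked example of that threshold, a simple average over hypothetical Domain 1 ratings (the same helper covers Domains 3-5, which also use plain averages):

```python
def domain_score(ratings: list[int]) -> float:
    """Simple average of the 1-5 control ratings for one domain."""
    return sum(ratings) / len(ratings)

confidentiality = domain_score([3, 2, 4, 2, 3])  # hypothetical ratings for 1.1-1.5
if confidentiality < 3:
    print(f"Score {confidentiality:.1f}: Medium Risk -> escalate")  # 2.8
```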

**Example**: For a new cloud resource (#12), weak encryption scores a 2, flagging AWS S3 bucket misconfigs.

***

#### Domain 2: Integrity (Ensuring Data Accuracy and Reliability)

Focus: Validation, backups. Critical for code changes (#3) or migrations (#8).

| Question                                                                     | Guidance                                                              | Scoring Notes                              |
| ---------------------------------------------------------------------------- | --------------------------------------------------------------------- | ------------------------------------------ |
| 2.1 Are input validations (e.g., sanitization) applied to prevent tampering? | Follow the OWASP Input Validation Cheat Sheet for web/apps. Example: Internet-facing updates (#4). | Low if no—vulnerable to injection attacks. |
| 2.2 Is data integrity verified during migrations or transfers? | Use checksums (prefer SHA-256; MD5 detects corruption but not deliberate tampering). Sketch below this table. | Evidence: Pre/post-migration reports. |
| 2.3 Are backups tested and retained per RPO/RTO?                             | Automate with immutable storage.                                      | High impact for critical systems (#17).    |
| 2.4 Does the change affect audit trails (e.g., non-repudiation)?             | Enable immutable logs.                                                | Tie to change management policy.           |
| 2.5 For hardware changes (#7), is firmware integrity checked?                | Signatures via TPM.                                                   | Residual risk: Supply chain attacks.       |
| 2.6 Are change controls (e.g., CI/CD pipelines) versioned?                   | Git for code (#3); IaC for infra.                                     | Score 5 only with peer reviews.            |
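
For question 2.2, pre/post-migration checksums are straightforward to capture; here is a minimal sketch using Python's standard library (file paths are placeholders):

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream the file so large migration artifacts don't exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare source and target after the transfer; keep both values as evidence.
assert sha256_of(Path("source.db")) == sha256_of(Path("migrated.db"))
```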

**Domain Score Calculation**: Weighted average (e.g., backups = 2x weight). >4 = Low Risk.
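
A weighted variant might look like this (the 2x backup weight comes from the example above; all other weights default to 1 as an assumption):

```python
def weighted_domain_score(ratings: dict[str, int], weights: dict[str, float]) -> float:
    """Weighted average of 1-5 ratings; unlisted questions get weight 1."""
    total = sum(r * weights.get(q, 1.0) for q, r in ratings.items())
    return total / sum(weights.get(q, 1.0) for q in ratings)

integrity = weighted_domain_score(
    {"2.1": 4, "2.2": 4, "2.3": 5, "2.4": 4, "2.5": 3, "2.6": 4},
    {"2.3": 2.0},  # backups count double
)
print(f"Integrity: {integrity:.1f}")  # 4.1 -> Low Risk
```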

**Example**: Database migration (#8) without checksums? Score 1—recommend rollback plans.

***

#### Domain 3: Availability (Maintaining System Uptime)

Focus: Resilience, redundancy. Key for architecture shifts (#2) or network changes (#9).

| Question                                                            | Guidance                                       | Scoring Notes                          |
| ------------------------------------------------------------------- | ---------------------------------------------- | -------------------------------------- |
| 3.1 Does the change include failover/redundancy (e.g., multi-AZ)? | Design for a 99.9% SLA (downtime math sketched below this table). Example: Dual-site (#2). | Evidence: DR test results. |
| 3.2 Are DDoS protections scaled for internet-facing systems (#4)?   | WAF + CDN.                                     | High risk in public exposure.          |
| 3.3 Is capacity planning done for scaling (e.g., auto-scaling #12)? | Monitor with Prometheus.                       | Low score = Performance bottlenecks.   |
| 3.4 For endpoint devices (#13), is offline resilience enabled?      | Caching/sync mechanisms.                       | Test with network outage sim.          |
| 3.5 Are incident response plans updated for the change? | Run a tabletop exercise. | Mandatory for business-critical systems (#17). |
| 3.6 Does remote access (#17) include session timeouts? | 15-min idle + geo-fencing (timeout check sketched below this table). | Residual: Low if monitored. |
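
Two of these controls reduce to simple arithmetic; here is a sketch of the 99.9% downtime budget (question 3.1) and the idle-timeout check (question 3.6), with the 15-minute limit taken from the guidance above:

```python
from datetime import datetime, timedelta, timezone

# Question 3.1: a 99.9% SLA leaves roughly 8.76 hours of downtime per year.
downtime_budget = timedelta(hours=24 * 365) * (1 - 0.999)
print(downtime_budget)  # 8:45:36

# Question 3.6: flag remote sessions idle past the 15-minute limit.
IDLE_LIMIT = timedelta(minutes=15)

def session_expired(last_activity: datetime) -> bool:
    return datetime.now(timezone.utc) - last_activity > IDLE_LIMIT
```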

**Domain Score Calculation**: Average. <2 = High Risk → Halt project.

**Example**: New VPN (#9) without redundancy? Potential single point of failure—add load balancers.

***

#### Domain 4: Authentication & Authorization (Controlling Access)

Focus: IAM, least privilege. Ties to auth changes (#15) or keys (#14).

| Question                                                         | Guidance                                                     | Scoring Notes                        |
| ---------------------------------------------------------------- | ------------------------------------------------------------ | ------------------------------------ |
| 4.1 Is MFA enforced for all elevated access? | Phishing-resistant methods (e.g., FIDO2). Example: RBAC redesign (#15). | No exceptions without justification. |
| 4.2 Are keys/certificates rotated and managed (e.g., via Vault)? | Automate 90-day cycles (#14).                                | Evidence: Rotation logs.             |
| 4.3 Does the change segregate duties (SoD)?                      | Review RBAC matrices.                                        | High risk in finance/critical apps.  |
| 4.4 For APIs (#10), is auth robust (OAuth/JWT)? | Rate limiting + scopes (validation sketched below this table). | Test with Burp Suite. |
| 4.5 Are privileged accounts monitored (e.g., just-in-time)?      | PAM tools like CyberArk.                                     | Low score = Insider threat exposure. |
| 4.6 For cloud adoption (#11), is federated identity used?        | SAML/Okta integration.                                       | Align with shared responsibility.    |
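
For question 4.4, token validation is worth automating in tests; here is a minimal sketch using the third-party PyJWT library (the signing key, HS256 algorithm, and scope name are assumptions for illustration):

```python
import jwt  # PyJWT: pip install pyjwt

REQUIRED_SCOPE = "orders:read"  # hypothetical scope

def authorize(token: str, signing_key: str) -> bool:
    """Reject unsigned/expired tokens, then enforce the required scope claim."""
    try:
        claims = jwt.decode(token, signing_key, algorithms=["HS256"])
    except jwt.InvalidTokenError:
        return False
    # OAuth convention: "scope" is a space-delimited string of granted scopes.
    return REQUIRED_SCOPE in claims.get("scope", "").split()
```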

**Domain Score Calculation**: Strict average. 3+ = Proceed with audit.

**Example**: Key rotation (#14) skipped? Score 1—immediate remediation.

***

#### Domain 5: Compliance & Governance (Regulatory Alignment)

Focus: Auditing, reporting. Overarching for all scopes, especially new systems (#1).

| Question                                                              | Guidance                                   | Scoring Notes                           |
| --------------------------------------------------------------------- | ------------------------------------------ | --------------------------------------- |
| 5.1 Does the change comply with relevant regs (e.g., PCI-DSS, HIPAA)? | Gap analysis tool.                         | Evidence: Certs or attestations.        |
| 5.2 Is the project documented in the change register?                 | Link to ISMS (#1).                         | Automated ticketing (e.g., ServiceNow). |
| 5.3 Have third-party risks been assessed (e.g., SOC2)?                | Vendor questionnaire. Example: SaaS (#11). | High if unvetted.                       |
| 5.4 Is training provided for new processes/tools?                     | Role-based modules.                        | For user-facing changes (#13).          |
| 5.5 Are metrics tracked post-implementation (e.g., risk KPIs)?        | Dashboard in PowerBI.                      | Continuous monitoring loop.             |
| 5.6 For architecture changes (#2), is a security design review done?  | SDLC gate.                                 | Residual: Low with sign-off.            |
| 5.7 Is decommissioning planned for legacy components?                 | Secure wipe (NIST 800-88).                 | Avoid zombie systems.                   |

**Domain Score Calculation**: Average across all seven questions; feeds the overall project rating.

**Example**: New system (#1) without compliance map? Flag for legal.

***

### Risk Scoring and Heat Map

Aggregate domain scores into an **Overall Risk Level**:

* **Low (Avg >4)**: Green – Proceed.
* **Medium (Avg 2.5-4)**: Yellow – Mitigate + Review.
* **High (Avg <2.5)**: Red – Pause + Escalate.

Visualize with a simple heat map (adapt to your reporting format):

| Domain          | Score    | Risk Level | Actions              |
| --------------- | -------- | ---------- | -------------------- |
| Confidentiality | 3.2      | Medium     | Encrypt data flows.  |
| Integrity       | 4.1      | Low        | N/A.                 |
| Availability    | 2.8      | Medium     | Add redundancy.      |
| Auth/Authz      | 3.8      | Low        | N/A.                 |
| Compliance      | 2.9      | Medium     | Conduct audit.       |
| **Overall**     | **3.36** | **Medium** | Quarterly follow-up. |

**Pro Tip**: Use Excel formulas for auto-scoring: `=AVERAGE(B2:B6)` with conditional formatting.
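
If you would rather script the rollup than maintain spreadsheet formulas, here is a minimal Python equivalent (domain scores taken from the heat map above):

```python
def overall_risk(domain_scores: list[float]) -> str:
    """Map the average domain score to the Low/Medium/High bands above."""
    avg = sum(domain_scores) / len(domain_scores)
    if avg > 4:
        return f"{avg:.2f}: Low - proceed"
    if avg >= 2.5:
        return f"{avg:.2f}: Medium - mitigate + review"
    return f"{avg:.2f}: High - pause + escalate"

print(overall_risk([3.2, 4.1, 2.8, 3.8, 2.9]))  # 3.36: Medium - mitigate + review
```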

***

### How to Implement the ISRA Questionnaire

1. **Kickoff**: Assign a facilitator (e.g., IT Risk Owner). Distribute the questionnaire one week before the workshop.
2. **Workshop**: 1-2 hour session with stakeholders. Use the [Project Change Scope](https://calvin-lai.gitbook.io/calvin-lai-security/risk-management/it-risk-and-control-library-policy-and-procedure/itra-project-change-scope-definition) to confirm applicability.
3. **Analysis**: Calculate scores; map gaps to controls from your [IT Risk and Control Library](https://calvin-lai.gitbook.io/calvin-lai-security/risk-management/it-risk-and-control-library-policy-and-procedure).
4. **Remediation**: Prioritize fixes (e.g., Quick Wins vs. Long-Term). Re-assess post-mitigation.
5. **Reporting**: Template email: "ISRA Complete – Medium Risk. Key Actions: \[List]."

**Common Pitfalls & Fixes**:

* **Over-Scoping**: Stick to triggered changes—use the scope doc as a filter.
* **Bias in Scoring**: Anonymize responses or use peer calibration.
* **Stale Data**: Review annually or post-incident.

***

### Resources and Next Steps

* **Toolkit**: [Download ISRA Excel Template](https://calvin-lai.gitbook.io/calvin-lai-security/resources/itra-questionnaire-template.xlsx) – Pre-populated with formulas.
* **Related Reads**: [ISRA Project Change Scope](https://calvin-lai.gitbook.io/calvin-lai-security/risk-management/it-risk-and-control-library-policy-and-procedure/itra-project-change-scope-definition) | [Risk Scoring Matrix](https://calvin-lai.gitbook.io/calvin-lai-security/risk-management/it-risk-and-control-library-policy-and-procedure/risk-scoring).
* **Customize It**: Tailor questions to your industry (e.g., add FINRA for finance).

This questionnaire isn't just a form—it's your shield against IT surprises. Questions or tweaks? Drop a comment or reach out. Stay secure! 🔒
