A comprehensive guide for conducting Digital Sovereignty and Security maturity assessments
Version 1.1 - 7th March 2026

This is the 201 - Domain Overview & Assessment guide.
This Level 201 Enablement Guide provides comprehensive instructions for conducting Full Maturity Assessments with customers and partners. It is designed for technical managers, solution architects, and workshop facilitators who need to deliver consistent, high-quality assessments that provide valuable insights into an organization's Digital Sovereignty and Security maturity.
This guide assumes familiarity with basic Digital Sovereignty concepts. If you or your audience are new to Digital Sovereignty, consider starting with the 101 - Introduction guide.
The Full Maturity Assessment is a structured evaluation tool that measures an organization's capabilities across multiple domains using a proven 5-level maturity model based on the CMMI (Capability Maturity Model Integration) framework:
| Level | Name | Range | Description |
|---|---|---|---|
| Level 1 | Initial | 0-20% | Unpredictable, reactive processes; ad-hoc approach |
| Level 2 | Managed | 21-40% | Planned and executed processes; basic controls in place |
| Level 3 | Defined | 41-60% | Standardized and documented processes across organization |
| Level 4 | Quantitatively Managed | 61-80% | Measured and controlled processes with metrics |
| Level 5 | Optimizing | 81-100% | Continuous improvement and innovation |
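The percentage bands translate mechanically into levels. For teams scripting their own tooling around exported results, here is a minimal sketch of that mapping (the band boundaries come from the table above; the function name and structure are illustrative):

```python
def maturity_level(score_pct: float) -> tuple[int, str]:
    """Map a normalized score (0-100%) to a CMMI-style maturity level.

    Band boundaries follow the table above (upper bound inclusive).
    """
    bands = [
        (20, 1, "Initial"),
        (40, 2, "Managed"),
        (60, 3, "Defined"),
        (80, 4, "Quantitatively Managed"),
        (100, 5, "Optimizing"),
    ]
    for upper, level, name in bands:
        if score_pct <= upper:
            return level, name
    raise ValueError("score_pct must be between 0 and 100")

print(maturity_level(55))  # -> (3, 'Defined')
```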
We offer two primary assessment profiles, each focused on different organizational priorities:
Digital Sovereignty Assessment
7 Domains: Data Sovereignty, Technical Sovereignty, Operational Sovereignty, Assurance Sovereignty, Open Source, Executive Oversight, Managed Services
Focus: Organizational control and independence from external dependencies, particularly important for government, healthcare, finance, and organizations with strict data residency requirements.
Security Assessment
7 Domains: Secure Infrastructure, Secure Data, Secure Identity, Secure Application, Secure Network, Secure Recovery, Secure Operations
Focus: Comprehensive security posture across all layers of the technology stack, ideal for compliance-driven organizations and those with high security requirements.
Most organizations benefit from starting with Digital Sovereignty as it addresses strategic independence concerns. Security assessments can follow to provide deeper technical security insights.
Proper preparation is critical to a successful assessment. Consider the following when scheduling:
The assessment requires input from multiple stakeholders to ensure accurate ratings. Recommended participants:
| Role | Why They're Needed | Essential? |
|---|---|---|
| CIO / CTO | Strategic oversight, budget authority, executive-level questions | Yes |
| CISO / Security Lead | Security controls, risk management, compliance frameworks | Yes |
| Cloud/Infrastructure Lead | Technical sovereignty, infrastructure control, vendor relationships | Yes |
| Compliance/Legal Officer | Data residency, jurisdictional control, regulatory requirements | Recommended |
| Operations Manager | Operational processes, disaster recovery, managed services | Recommended |
| Procurement Lead | Vendor management, supply chain, contract terms | Optional |
Selecting the appropriate Line of Business (LOB) is crucial as it applies industry-specific weightings to domains. Guide your customer through this decision:
Financial Services
Best for: Banks, insurance companies, financial services, payment processors
Emphasized domains: Data Sovereignty (2.0×), Assurance Sovereignty (2.0×), Operational Sovereignty (1.5×)
Rationale: Financial institutions face stringent regulatory requirements (PCI DSS, SOX, DORA) demanding strong data protection, audit controls, and business continuity.
Healthcare
Best for: Hospitals, health systems, medical research, healthcare technology
Emphasized domains: Data Sovereignty (2.0×), Operational Sovereignty (2.0×)
Rationale: Healthcare organizations must protect sensitive patient data (HIPAA, GDPR) while maintaining 24/7 operational resilience for patient safety.
Government & Public Sector
Best for: Federal/state/local government, public sector, defense contractors
Emphasized domains: Data Sovereignty (2.0×), Assurance Sovereignty (2.0×), Executive Oversight (2.0×)
Rationale: Government entities handle sensitive citizen data and critical infrastructure with strict sovereignty requirements, transparency needs, and national security concerns.
Manufacturing
Best for: Industrial manufacturing, automotive, aerospace, discrete manufacturing
Emphasized domains: Operational Sovereignty (2.0×), Managed Services (2.0×)
Rationale: Manufacturers prioritize production uptime, OT/IT integration, and IP protection for proprietary designs and processes.
Telecommunications
Best for: Telecom providers, ISPs, mobile carriers, network infrastructure
Emphasized domains: Data Sovereignty (2.0×), Operational Sovereignty (2.0×), Assurance Sovereignty (2.0×)
Rationale: Telecom operators manage critical communications infrastructure with subscriber data protection requirements and strict regulatory compliance (NIS2).
General (Cross-Industry)
Best for: Organizations without specific industry focus or those spanning multiple sectors
Emphasized domains: All domains equally weighted (1.0×)
Rationale: Provides an unbiased assessment across all domains without industry-specific emphasis.
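The guide doesn't spell out how the weightings enter the final score. A plausible sketch, assuming a weighting-adjusted average of normalized domain scores, is shown below; the domain scores here are invented for illustration, and the actual tool's aggregation may differ:

```python
# Hypothetical roll-up of industry-weighted domain scores (Healthcare
# profile: Data Sovereignty and Operational Sovereignty at 2.0x).
healthcare_weights = {
    "Data Sovereignty": 2.0,
    "Technical Sovereignty": 1.0,
    "Operational Sovereignty": 2.0,
    "Assurance Sovereignty": 1.0,
    "Open Source": 1.0,
    "Executive Oversight": 1.0,
    "Managed Services": 1.0,
}

domain_scores = {  # illustrative normalized 0-100% scores
    "Data Sovereignty": 45,
    "Technical Sovereignty": 60,
    "Operational Sovereignty": 30,
    "Assurance Sovereignty": 55,
    "Open Source": 40,
    "Executive Oversight": 25,
    "Managed Services": 50,
}

weighted = sum(domain_scores[d] * w for d, w in healthcare_weights.items())
overall = weighted / sum(healthcare_weights.values())
print(f"Weighted overall score: {overall:.1f}%")  # -> 42.2%
```

Note how the 2.0x weightings pull the overall score toward the weakest emphasized domains: the same raw scores under equal weighting would average 43.6%, so weighting makes sovereignty gaps in the emphasized domains more visible.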
Send this checklist to participants at least 1 week before the assessment:
Before the session, ensure:
A well-structured session keeps participants engaged and ensures comprehensive coverage of all domains.
1. Opening: explain the assessment purpose and maturity model, review the agenda, and confirm participants and roles.
2. Profile selection: discuss and select the appropriate assessment profile and industry weighting.
3. Domain walkthrough: work through each domain systematically (~17 minutes per domain, roughly 2 hours for 7 domains).
4. Results review: review the spider chart, discuss scores, and identify obvious gaps.
5. Wrap-up: discuss next steps, schedule follow-up, and export results.
Sessions often run long as participants want to discuss their challenges. Build in buffer time or be prepared to schedule a continuation session. Consider breaking complex assessments into multiple shorter sessions.
Use this script to open your assessment session professionally:
"Thank you all for joining today's Full Maturity Assessment. Over the next 2-3 hours, we'll be evaluating your organization's capabilities across [Digital Sovereignty / Security] domains using a proven 5-level maturity framework."
"This assessment is designed to be honest and constructive—not punitive. Most organizations score between levels 2-3 initially, and that's perfectly normal. The goal is to establish a baseline and identify priority areas for improvement."
"I'll be asking questions about your current capabilities and asking for evidence of implementation. Please be candid—overestimating maturity only hurts your own planning. If you're unsure about an answer, we can flag it for follow-up."
"Let's start by selecting your industry profile, which will adjust the weighting of domains based on your sector's specific needs..."
Each question has multiple-choice answers corresponding to maturity levels. Guide participants through this process:
Always ask for evidence to support maturity claims. Here are examples of acceptable evidence:
| Level | Acceptable Evidence Examples |
|---|---|
| Level 1 | Verbal confirmation, acknowledgment of gaps, plans to implement |
| Level 2 | Draft policies, project plans, pilot implementations, partial rollouts |
| Level 3 | Approved policies, documented standards, widespread implementation, training records |
| Level 4 | Metrics dashboards, KPI reports, audit logs, automated compliance reporting |
| Level 5 | Continuous improvement programs, innovation initiatives, industry leadership, published case studies |
Scenario: Conflicting stakeholder opinions
Example: The CIO believes they have Level 4 disaster recovery, but the Operations Manager says they've never successfully tested it.
Response: "I'm hearing different perspectives here. Let's focus on what we can verify. [Operations Manager], can you describe your most recent DR test? [CIO], what metrics are you using to assess DR maturity? Based on industry best practices, regular testing is required for Level 4. Without test evidence, we should consider Level 2 or 3."
Approach: Stay neutral, ask for evidence, refer to maturity definitions, help them reach consensus based on facts.
Scenario: Defensive reactions to low scores
Example: After several Level 1-2 scores, the CISO becomes defensive: "We have excellent security! This assessment is unfair!"
Response: "I appreciate your commitment to security. These scores reflect maturity along a journey—they're not a judgment of your team's effort or capability. Many excellent organizations score at Level 2-3 initially. The assessment helps us identify where focused investment will have the most impact. Would it help to review the scoring criteria together?"
Approach: Validate their feelings, emphasize growth mindset, reframe scores as opportunities, avoid blame.
Scenario: Nobody knows the answer
Example: Multiple participants don't know the answer to questions about vendor contracts or key management.
Response: "That's valuable information in itself—if key stakeholders don't know, that typically indicates Level 1 or 2 maturity. Let's mark this for follow-up investigation and make a provisional rating of Level 1. You can update it later once you've verified."
Approach: Frame "don't know" as data, assign conservative rating, offer to revisit, ensure follow-up action item is captured.
Keep the assessment moving while ensuring thoroughness:
This section provides detailed guidance for each Digital Sovereignty domain. Each domain contains 8 questions organized into three tiers: foundation, strategic, and advanced.
Each question is assigned points (1-8) reflecting its importance within the domain. Higher point values indicate more critical capabilities for achieving sovereignty. The assessment automatically calculates domain scores based on selected maturity levels and point values.
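The exact scoring formula isn't reproduced in this guide. A minimal sketch, assuming each answer contributes its question's points scaled by the achieved level over the maximum level, would look like this (the point values and ratings below are illustrative):

```python
# Hypothetical domain score calculation. The guide specifies per-question
# points (1-8) and five maturity levels but not the exact formula; one
# plausible normalization is shown here.
MAX_LEVEL = 5

def domain_score(answers: list[tuple[int, int]]) -> float:
    """answers: (points, selected_level) pairs for one domain's questions.

    Returns a 0-100% score: points-weighted average of level / MAX_LEVEL.
    """
    earned = sum(points * level for points, level in answers)
    possible = sum(points * MAX_LEVEL for points, _ in answers)
    return 100 * earned / possible

# Example: an 8-question domain with varying point values and ratings.
answers = [(4, 2), (3, 3), (5, 2), (4, 1), (6, 2), (3, 1), (5, 3), (8, 1)]
print(f"{domain_score(answers):.1f}%")  # -> 36.3%, i.e. Level 2 'Managed'
```

Under this kind of scheme a Level 1 answer on an 8-point question drags the domain score far more than the same answer on a 1-point question, which is why the prioritization guidance later in this guide singles out "Level 1 on high-point questions" as critical.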
This domain assesses an organization's ultimate control over its data, independent of external jurisdictions or political influences. It goes beyond basic data residency by focusing on legal control, access, and encryption management. Maturity here confirms that data location is actively governed by the organization's legal and business requirements, rather than dictated solely by a cloud provider or foreign law.
What this measures: Whether the organization explicitly controls where data is stored based on legal requirements
Key questions to ask:
Evidence to request: Data residency policy document, cloud provider contracts specifying regions, configuration screenshots showing geo-restrictions
Red flags: "We think it's in [region]", "The cloud provider handles that", "We haven't checked recently"
What this measures: Compliance with data protection regulations and implementation of privacy controls
Key questions to ask:
Evidence to request: Privacy policies, consent management systems, GDPR compliance documentation, Privacy Impact Assessments
Red flags: Confusion about applicable regulations, no defined process for data subject requests, relying solely on vendor certifications
What this measures: Whether the organization knows what data it has, where it is, and how sensitive it is
Key questions to ask:
Evidence to request: Data inventory/catalog, classification framework document, data discovery tool demonstrations, data ownership registers
Red flags: "We're working on that", manual spreadsheet-based tracking, no data ownership assigned
What this measures: Ability to resist extra-territorial legal demands and maintain domestic legal control
Key questions to ask:
Evidence to request: Vendor contracts showing governing law clauses, legal risk register, documented escalation procedures
Red flags: Contracts governed by foreign law, no notification provisions, unaware of jurisdictional conflicts
What this measures: Whether the organization exclusively controls encryption keys, independent of cloud providers
Key questions to ask:
Evidence to request: Key management architecture diagrams, HSM procurement/contracts, key rotation policies, external key management (EKM) solution documentation
Red flags: Provider-managed keys, lack of HSMs, no key rotation schedule, unclear about who can access keys
Note: This is a 6-point question because key control is fundamental to data sovereignty. Organizations often struggle here.
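To make the principle concrete, here is a minimal envelope-encryption sketch using the open-source Python `cryptography` package: the key-encryption key is generated and held entirely on customer infrastructure, so the provider only ever stores ciphertext and a wrapped data key. This illustrates the control being assessed, not a production design; real deployments typically keep the KEK in an HSM or external key manager (EKM) rather than in process memory.

```python
# Illustrative sketch of customer-held key control via client-side
# envelope encryption. Not a production key-management design.
from cryptography.fernet import Fernet

# Key-encryption key (KEK): generated and stored on customer
# infrastructure (ideally an HSM). Never uploaded to the provider.
kek = Fernet(Fernet.generate_key())

# Per-object data-encryption key (DEK): encrypts the payload, then is
# itself wrapped with the KEK before being stored alongside the data.
dek_bytes = Fernet.generate_key()
dek = Fernet(dek_bytes)

ciphertext = dek.encrypt(b"patient record 12345")
wrapped_dek = kek.encrypt(dek_bytes)  # only the wrapped DEK leaves the boundary

# Decryption requires the customer-held KEK to unwrap the DEK first,
# so the provider alone can never read the data.
recovered_dek = Fernet(kek.decrypt(wrapped_dek))
assert recovered_dek.decrypt(ciphertext) == b"patient record 12345"
```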
What this measures: Protection of data during processing (data-in-use), not just storage and transit
Key questions to ask:
Evidence to request: Confidential computing implementations (Intel SGX, AMD SEV, AWS Nitro Enclaves), memory encryption configurations, log sanitization policies
Red flags: Unaware of data-in-use protection, relying only on at-rest and in-transit encryption, plaintext logging of sensitive data
Note: This is often Level 1-2 for most organizations; confidential computing is still emerging.
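One inexpensive first step toward data-in-use hygiene is sanitizing logs before they persist, which directly addresses the plaintext-logging red flag above. A minimal sketch using Python's standard `logging` module follows; the patterns are illustrative, not a vetted detection set:

```python
# Illustrative log-sanitization filter. Real deployments need a vetted
# pattern set tested against their own log formats.
import logging
import re

SENSITIVE = re.compile(
    r"\b\d{3}-\d{2}-\d{4}\b"        # US SSN-style identifiers
    r"|\b\d{16}\b"                   # bare 16-digit card numbers
    r"|[\w.+-]+@[\w-]+\.[\w.]+"      # email addresses
)

class RedactingFilter(logging.Filter):
    def filter(self, record: logging.LogRecord) -> bool:
        # Rewrite the message in place before any handler sees it.
        record.msg = SENSITIVE.sub("[REDACTED]", str(record.msg))
        return True

logger = logging.getLogger("app")
logging.basicConfig(level=logging.INFO)
logger.addFilter(RedactingFilter())
logger.info("payment from jane@example.com card 4111111111111111")
# -> INFO:app:payment from [REDACTED] card [REDACTED]
```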
What this measures: Real-time monitoring and immutable logging of all data movements
Key questions to ask:
Evidence to request: Data flow maps, DLP dashboards, audit log retention policies, SIEM integration, transfer blocking evidence
Red flags: No data flow visibility, reactive rather than preventive controls, logs stored with cloud provider
What this measures: Strict, audited, and revocable control over vendor and partner access to data
Key questions to ask:
Evidence to request: Third-party access policies, Privileged Access Management (PAM) systems, session recordings, vendor risk assessments
Red flags: Persistent vendor access, no session monitoring, vendors located in concerning jurisdictions, inability to quickly revoke access
Note: This is the highest point value question as third-party access is a primary sovereignty risk.
Technical Sovereignty evaluates the degree of control an organization maintains over the foundational components of its technology stack—from hardware and firmware to application source code and runtime environments. High maturity signifies deliberate reduction in reliance on proprietary interfaces and single-vendor ecosystems, ensuring the ability to rebuild or migrate critical functions if necessary.
Key Focus Areas: Technology stack ownership, vendor lock-in mitigation, standardized frameworks, interoperability, hardware provenance, self-hosted runtimes, IP control, future-proofing
Common Discussion Topics: Open source adoption, Kubernetes and containerization, multi-cloud strategies, escrow agreements, supply chain security
This domain examines the organization's autonomy and independence in executing critical business and IT operations. It ensures that essential functions can be performed without reliance on external human expertise or infrastructure outside the organization's direct control or trusted sovereign borders.
Key Focus Areas: Process documentation, managed service dependencies, IAM, internal skills, disaster recovery, supply chain vetting, incident response, operational autonomy
Common Discussion Topics: Break-glass procedures, in-house vs. outsourced operations, business continuity planning, geopolitical isolation scenarios
Assurance Sovereignty addresses the right, capability, and transparency required to verify the security and compliance claims of both internal systems and external vendors. It's the mechanism by which trust is verified, not assumed, through independent audits and continuous technical validation.
Key Focus Areas: Audit rights, sovereign SIEM, compliance verification, transparency requirements, sovereign certifications, continuous monitoring, security testing, vulnerability management
Common Discussion Topics: Right to audit clauses, SOC 2 Type II, penetration testing, third-party attestations, domestic vs. foreign auditors
This domain assesses the organization's strategic use of open-source software to reduce proprietary dependencies, increase transparency, and build internal capabilities. Mature organizations actively contribute to and influence open-source projects critical to their sovereignty goals.
Key Focus Areas: Open source strategy, community participation, license compliance, vulnerability management, sovereign distributions, contribution policies, internal expertise, project governance
Common Discussion Topics: Red Hat Enterprise Linux, Kubernetes, Apache projects, InnerSource, security scanning, open source vs. commercial support
Executive Oversight ensures that sovereignty concerns are understood, prioritized, and actively managed at the highest levels of the organization. This domain measures board and C-suite engagement, dedicated budgets, governance structures, and accountability for sovereignty outcomes.
Key Focus Areas: Board awareness, dedicated governance, budget allocation, sovereignty policies, risk management, accountability, strategic planning, regulatory engagement
Common Discussion Topics: Board reporting, sovereignty champions, dedicated budgets vs. embedded costs, KPIs and metrics, regulatory relationships
This domain evaluates how the organization manages relationships with external managed service providers while maintaining sovereignty. It addresses vendor selection criteria, contractual controls, geographic restrictions, transition planning, and the balance between operational efficiency and sovereign control.
Key Focus Areas: Vendor selection criteria, contractual controls, geographic restrictions, data access limitations, performance monitoring, transition planning, alternatives evaluation, insourcing capabilities
Common Discussion Topics: Domestic vs. foreign MSPs, data center locations, support personnel jurisdictions, exit strategies, dual-source strategies
After completing the assessment, guide the customer through understanding their results.
The spider/radar chart provides a visual representation of maturity across all domains:
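If you need to reproduce the chart outside the assessment tool (for slide decks or follow-up reports), a minimal matplotlib sketch is shown below, with illustrative scores:

```python
# Minimal radar-chart sketch; domain scores are illustrative.
import matplotlib.pyplot as plt
import numpy as np

domains = ["Data", "Technical", "Operational", "Assurance",
           "Open Source", "Exec Oversight", "Managed Svcs"]
scores = [45, 60, 30, 55, 40, 25, 50]  # normalized 0-100% per domain

# Compute one angle per domain, then close the polygon by repeating
# the first point.
angles = np.linspace(0, 2 * np.pi, len(domains), endpoint=False).tolist()
angles += angles[:1]
values = scores + scores[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.plot(angles, values, linewidth=2)
ax.fill(angles, values, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(domains)
ax.set_ylim(0, 100)
ax.set_title("Digital Sovereignty maturity by domain")
plt.show()
```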
Most organizations on their first assessment score:
Reassure customers that these results are normal starting points, not failures.
Work with the customer to translate scores into actionable priorities:
| Priority | Criteria | Example |
|---|---|---|
| Critical (0-3 months) | Regulatory requirement, high industry weighting, Level 1 on high-point questions | Implementing HSM-based key management for healthcare patient data |
| High (3-6 months) | Significant sovereignty risk, medium weighting, Level 2 on strategic questions | Establishing sovereign audit rights with cloud providers |
| Medium (6-12 months) | Important capability, standard weighting, Level 2-3 on foundation questions | Implementing data classification and discovery tools |
| Low (12+ months) | Optimization, already at Level 3+, advanced questions | Establishing open source contribution programs |
Phase 1: policy development, basic controls, compliance alignment, data inventory, vendor assessment
Phase 2: technical implementations, key management, vendor migrations, skills development, tooling deployment
Phase 3: continuous monitoring, optimization, innovation, industry leadership, operational autonomy
Help customers export and distribute results appropriately:
Schedule follow-up activities to maintain momentum:
The assessment often reveals opportunities for further engagement:
When conducting assessments remotely:
Persona: The Overconfident Executive
Behavior: Claims high maturity without evidence, dismisses concerns, believes "we have the best security"
Approach: Acknowledge their confidence, then request specific evidence. Use data and industry benchmarks. Ask their technical team to verify claims. Frame lower scores as "industry-standard journey" rather than failures.
Persona: The Distracted Executive
Behavior: Late to session, distracted, checking phone, wants to rush through
Approach: Respectfully emphasize the value of their time investment. Show early results to demonstrate value. Offer to reschedule if they can't focus. Break into shorter sessions if needed.
Persona: The Defensive Practitioner
Behavior: Takes low scores personally, explains why gaps aren't their fault, blames budget/management
Approach: Emphasize this is organizational assessment, not personal evaluation. Validate resource constraints. Position results as ammunition for budget requests. Frame gaps as opportunities to demonstrate need for investment.
Persona: The Perfectionist
Behavior: Debates every nuance, wants to discuss technical details extensively, struggles to choose between maturity levels
Approach: Appreciate their thoroughness. Set time limits for each question. Offer to deep-dive on specific topics afterward. Remind that perfect accuracy is less important than directional understanding. Use "parking lot" for detailed technical discussions.
Ready-to-use templates are available to support your assessment delivery:
Visit the Templates Library for:
| Term | Definition |
|---|---|
| BYOK | Bring Your Own Key - Customer-generated encryption keys imported to cloud provider |
| CLOUD Act | US law allowing government access to data held by US companies regardless of location |
| Confidential Computing | Protection of data during processing using hardware-based secure enclaves |
| Data Residency | Physical location where data is stored |
| Data Sovereignty | Legal and technical control over data, including ability to resist foreign access demands |
| DLP | Data Loss Prevention - Tools to monitor and prevent unauthorized data transfers |
| EKM | External Key Management - Encryption keys managed outside cloud provider infrastructure |
| HSM | Hardware Security Module - Dedicated cryptographic processor for key management |
| PAM | Privileged Access Management - System for controlling and monitoring administrative access |
| SCA | Software Composition Analysis - Scanning third-party code for vulnerabilities |
| TEE | Trusted Execution Environment - Secure area of processor for sensitive operations |
| Zero Trust | Security model assuming no implicit trust, requiring verification for all access |
Subject: Preparation for Digital Sovereignty Maturity Assessment - [Date]
Dear [Stakeholders],
Thank you for scheduling a Full Maturity Assessment. This session will evaluate your organization's Digital Sovereignty capabilities across 7 key domains using a proven 5-level maturity framework.
Session Details:
Date/Time: [Date/Time]
Location/Link: [Details]
Required Participants: CIO/CTO, CISO, Cloud/Infrastructure Lead, Compliance Officer
Please prepare:
Looking forward to our session.
Best regards,
[Your Name]
Subject: Digital Sovereignty Assessment Results and Next Steps
Dear [Stakeholders],
Thank you for participating in yesterday's maturity assessment. Your engagement and candor were greatly appreciated.
Key Findings:
Attached you'll find:
Recommended Next Steps:
I'll follow up next week to schedule our roadmap session.
Best regards,
[Your Name]
| Level | Key Indicators | Common Language |
|---|---|---|
| 1 | No policy, ad-hoc, reactive, "we're planning to" | "We know we need to do this" |
| 2 | Draft policies, pilots, project plans, some implementation | "We're working on it" |
| 3 | Approved policies, widespread deployment, documented standards | "We have this in place" |
| 4 | Metrics, dashboards, KPIs, regular reporting, measured outcomes | "We measure and optimize this" |
| 5 | Continuous improvement, innovation, industry leadership | "We're leading the industry" |