LEVEL 201 - DOMAIN OVERVIEW & ASSESSMENT

Full Maturity Assessment Enablement Guide

A comprehensive guide for conducting Digital Sovereignty and Security maturity assessments

Version 1.1 - 7th March 2026

Introduction

Enablement Guide Levels

This is the 201 - Domain Overview & Assessment guide. Other levels, such as the 101 - Introduction, are also available in this series.

Purpose of This Guide

This Level 201 Enablement Guide provides comprehensive instructions for conducting Full Maturity Assessments with customers and partners. It is designed for technical managers, solution architects, and workshop facilitators who need to deliver consistent, high-quality assessments that provide valuable insights into an organization's Digital Sovereignty and Security maturity.

This guide assumes familiarity with basic Digital Sovereignty concepts. If you or your audience are new to Digital Sovereignty, consider starting with the 101 - Introduction guide.

What is a Full Maturity Assessment?

The Full Maturity Assessment is a structured evaluation tool that measures an organization's capabilities across multiple domains using a proven 5-level maturity model based on the CMMI (Capability Maturity Model Integration) framework:

Level | Name | Range | Description
Level 1 | Initial | 0-20% | Unpredictable, reactive processes; ad-hoc approach
Level 2 | Managed | 21-40% | Planned and executed processes; basic controls in place
Level 3 | Defined | 41-60% | Standardized and documented processes across organization
Level 4 | Quantitatively Managed | 61-80% | Measured and controlled processes with metrics
Level 5 | Optimizing | 81-100% | Continuous improvement and innovation
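
If you need to compute the banding yourself (for example in a results spreadsheet), the mapping is simple. The following Python sketch is purely illustrative and mirrors the table above:

```python
# Illustrative only: maps an overall score percentage to the maturity
# levels defined in the table above. Band boundaries follow the table.

MATURITY_LEVELS = [
    (20, "Level 1 - Initial"),
    (40, "Level 2 - Managed"),
    (60, "Level 3 - Defined"),
    (80, "Level 4 - Quantitatively Managed"),
    (100, "Level 5 - Optimizing"),
]

def maturity_level(score_pct: float) -> str:
    """Return the maturity level label for a 0-100% score."""
    for upper_bound, label in MATURITY_LEVELS:
        if score_pct <= upper_bound:
            return label
    raise ValueError("score_pct must be between 0 and 100")

print(maturity_level(35))  # Level 2 - Managed
```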

Assessment Profiles

We offer two primary assessment profiles, each focused on different organizational priorities:

Digital Sovereignty

7 Domains: Data Sovereignty, Technical Sovereignty, Operational Sovereignty, Assurance Sovereignty, Open Source, Executive Oversight, Managed Services

Focus: Organizational control and independence from external dependencies, particularly important for government, healthcare, finance, and organizations with strict data residency requirements.

Security

7 Domains: Secure Infrastructure, Secure Data, Secure Identity, Secure Application, Secure Network, Secure Recovery, Secure Operations

Focus: Comprehensive security posture across all layers of the technology stack, ideal for compliance-driven organizations and those with high security requirements.

Tip

Most organizations benefit from starting with Digital Sovereignty as it addresses strategic independence concerns. Security assessments can follow to provide deeper technical security insights.

Pre-Assessment Preparation

Scheduling the Assessment

Proper preparation is critical to a successful assessment. Consider the following when scheduling:

Time Requirements

  • Full Assessment: 2-4 hours depending on organization size and complexity
  • Quick Assessment: 1-2 hours (covering only Foundation tier questions)
  • Follow-up Session: 1 hour (for results review and roadmap planning)

Participant Selection

The assessment requires input from multiple stakeholders to ensure accurate ratings. Recommended participants:

Role | Why They're Needed | Essential?
CIO / CTO | Strategic oversight, budget authority, executive-level questions | Yes
CISO / Security Lead | Security controls, risk management, compliance frameworks | Yes
Cloud/Infrastructure Lead | Technical sovereignty, infrastructure control, vendor relationships | Yes
Compliance/Legal Officer | Data residency, jurisdictional control, regulatory requirements | Recommended
Operations Manager | Operational processes, disaster recovery, managed services | Recommended
Procurement Lead | Vendor management, supply chain, contract terms | Optional

Industry (LOB) Selection

Selecting the appropriate Line of Business (LOB) is crucial because it applies industry-specific weightings to the domain scores (a sketch of how the weightings combine appears after the profiles below). Guide your customer through this decision:

Finance

Best for: Banks, insurance companies, financial services, payment processors

Emphasized domains: Data Sovereignty (2.0×), Assurance Sovereignty (2.0×), Operational Sovereignty (1.5×)

Rationale: Financial institutions face stringent regulatory requirements (PCI DSS, SOX, DORA) demanding strong data protection, audit controls, and business continuity.

Healthcare

Best for: Hospitals, health systems, medical research, healthcare technology

Emphasized domains: Data Sovereignty (2.0×), Operational Sovereignty (2.0×)

Rationale: Healthcare organizations must protect sensitive patient data (HIPAA, GDPR) while maintaining 24/7 operational resilience for patient safety.

Government

Best for: Federal/state/local government, public sector, defense contractors

Emphasized domains: Data Sovereignty (2.0×), Assurance Sovereignty (2.0×), Executive Oversight (2.0×)

Rationale: Government entities handle sensitive citizen data and critical infrastructure with strict sovereignty requirements, transparency needs, and national security concerns.

Manufacturing

Best for: Industrial manufacturing, automotive, aerospace, discrete manufacturing

Emphasized domains: Operational Sovereignty (2.0×), Managed Services (2.0×)

Rationale: Manufacturers prioritize production uptime, OT/IT integration, and IP protection for proprietary designs and processes.

Telecommunications

Best for: Telecom providers, ISPs, mobile carriers, network infrastructure

Emphasized domains: Data Sovereignty (2.0×), Operational Sovereignty (2.0×), Assurance Sovereignty (2.0×)

Rationale: Telecom operators manage critical communications infrastructure with subscriber data protection requirements and strict regulatory compliance (NIS2).

Balanced / Other

Best for: Organizations without specific industry focus or those spanning multiple sectors

Emphasized domains: All domains equally weighted (1.0×)

Rationale: Provides an unbiased assessment across all domains without industry-specific emphasis.
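
To make the effect of these multipliers concrete, here is a hedged Python sketch of how LOB weightings might be applied when rolling domain scores up into an overall percentage. The multiplier values mirror the profiles above, but the exact formula used by the assessment tool may differ:

```python
# Hypothetical sketch: apply industry (LOB) multipliers when combining
# per-domain scores into an overall percentage. Multipliers mirror the
# profiles described above; unlisted domains default to 1.0.

LOB_WEIGHTS = {
    "Finance": {"Data Sovereignty": 2.0, "Assurance Sovereignty": 2.0,
                "Operational Sovereignty": 1.5},
    "Healthcare": {"Data Sovereignty": 2.0, "Operational Sovereignty": 2.0},
    "Balanced": {},  # every domain weighted equally at 1.0
}

def weighted_overall(domain_scores: dict[str, float], lob: str) -> float:
    """Weighted average of per-domain scores (each 0-100%)."""
    weights = LOB_WEIGHTS.get(lob, {})
    total = sum(domain_scores[d] * weights.get(d, 1.0) for d in domain_scores)
    weight_sum = sum(weights.get(d, 1.0) for d in domain_scores)
    return total / weight_sum

scores = {"Data Sovereignty": 40, "Technical Sovereignty": 55,
          "Operational Sovereignty": 30, "Assurance Sovereignty": 25,
          "Open Source": 60, "Executive Oversight": 50, "Managed Services": 45}
print(f"{weighted_overall(scores, 'Finance'):.1f}%")  # ~40.5%
```

With the Finance profile, weak Data and Assurance Sovereignty scores pull the overall result down roughly twice as hard as with the Balanced profile.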

Pre-Assessment Checklist

Send this checklist to participants at least 1 week before the assessment:

Pre-Assessment Information Needed

  • Current cloud infrastructure provider(s) and services used
  • List of critical business applications and their hosting locations
  • Existing compliance frameworks and certifications (ISO 27001, SOC 2, etc.)
  • Data classification policies and data residency requirements
  • Key vendor relationships and managed service providers
  • Recent security audits or risk assessments
  • Disaster recovery and business continuity documentation
  • Open source usage policies (if applicable)

Technical Setup

Before the session, ensure:

Facilitation Methodology

Workshop Structure

A well-structured session keeps participants engaged and ensures comprehensive coverage of all domains.

0:00-0:15 - Introduction & Context Setting (15 min)

Explain assessment purpose, maturity model, review agenda, confirm participants and roles

0:15-0:25 - Profile & Industry Selection (10 min)

Discuss and select appropriate assessment profile and industry weighting

0:25-2:25 - Domain Assessment (120 min)

Work through each domain systematically (~17 min per domain for 7 domains)

2:25-2:45 - Results Review (20 min)

Review spider chart, discuss scores, identify obvious gaps

2:45-3:00 - Next Steps & Wrap-up (15 min)

Discuss next steps, schedule follow-up, export results

Time Management

Sessions often run long as participants want to discuss their challenges. Build in buffer time or be prepared to schedule a continuation session. Consider breaking complex assessments into multiple shorter sessions.

Opening Script

Use this script to open your assessment session professionally:

Sample Opening

"Thank you all for joining today's Full Maturity Assessment. Over the next 2-3 hours, we'll be evaluating your organization's capabilities across [Digital Sovereignty / Security] domains using a proven 5-level maturity framework."

"This assessment is designed to be honest and constructive—not punitive. Most organizations score between levels 2-3 initially, and that's perfectly normal. The goal is to establish a baseline and identify priority areas for improvement."

"I'll be asking questions about your current capabilities and asking for evidence of implementation. Please be candid—overestimating maturity only hurts your own planning. If you're unsure about an answer, we can flag it for follow-up."

"Let's start by selecting your industry profile, which will adjust the weighting of domains based on your sector's specific needs..."

Question-by-Question Guidance

How to Score Each Question

Each question has multiple-choice answers corresponding to maturity levels. Guide participants through this process:

  1. Read the question aloud - Ensure everyone understands what's being asked
  2. Review all options - Read through all maturity levels before deciding
  3. Ask for evidence - "Can you show me documentation/tools/policies that demonstrate this?"
  4. Probe deeper - "Walk me through how this actually works in practice"
  5. Watch for inflation - Organizations often overestimate; look for concrete proof
  6. Seek consensus - If participants disagree, facilitate discussion to reach agreement
  7. Document notes - Use the notes field to capture important context

Evidence-Based Assessment

Always ask for evidence to support maturity claims. Here are examples of acceptable evidence:

Level | Acceptable Evidence Examples
Level 1 | Verbal confirmation, acknowledgment of gaps, plans to implement
Level 2 | Draft policies, project plans, pilot implementations, partial rollouts
Level 3 | Approved policies, documented standards, widespread implementation, training records
Level 4 | Metrics dashboards, KPI reports, audit logs, automated compliance reporting
Level 5 | Continuous improvement programs, innovation initiatives, industry leadership, published case studies

Handling Difficult Conversations

Scenario: Stakeholders Disagree on Rating

Example: The CIO believes they have Level 4 disaster recovery, but the Operations Manager says they've never successfully tested it.

Response: "I'm hearing different perspectives here. Let's focus on what we can verify. [Operations Manager], can you describe your most recent DR test? [CIO], what metrics are you using to assess DR maturity? Based on industry best practices, regular testing is required for Level 4. Without test evidence, we should consider Level 2 or 3."

Approach: Stay neutral, ask for evidence, refer to maturity definitions, help them reach consensus based on facts.

Scenario: Customer is Defensive About Low Scores

Example: After several Level 1-2 scores, the CISO becomes defensive: "We have excellent security! This assessment is unfair!"

Response: "I appreciate your commitment to security. These scores reflect maturity along a journey—they're not a judgment of your team's effort or capability. Many excellent organizations score at Level 2-3 initially. The assessment helps us identify where focused investment will have the most impact. Would it help to review the scoring criteria together?"

Approach: Validate their feelings, emphasize growth mindset, reframe scores as opportunities, avoid blame.

Scenario: "We Don't Know" Responses

Example: Multiple participants don't know the answer to questions about vendor contracts or key management.

Response: "That's valuable information in itself—if key stakeholders don't know, that typically indicates Level 1 or 2 maturity. Let's mark this for follow-up investigation and make a provisional rating of Level 1. You can update it later once you've verified."

Approach: Frame "don't know" as data, assign conservative rating, offer to revisit, ensure follow-up action item is captured.

Maintaining Momentum

Keep the assessment moving while ensuring thoroughness:

Domain Deep-Dives

This section provides detailed guidance for each Digital Sovereignty domain. Each domain contains 8 questions organized into three tiers: Foundation (Q1-Q3), Strategic (Q4-Q6), and Advanced (Q7-Q8).

Understanding Points

Each question is assigned points (1-8) reflecting its importance within the domain. Higher point values indicate more critical capabilities for achieving sovereignty. The assessment automatically calculates domain scores based on selected maturity levels and point values.
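
As an illustration, one plausible reading of that calculation is a point-weighted average, where each question contributes its selected level scaled by its point value. This Python sketch is an assumption about the formula, not a specification of the tool's internals:

```python
# One plausible interpretation of the point-weighted domain score: each
# question contributes its maturity level (1-5) scaled by its point value,
# normalized against the maximum possible. The actual tool may differ.

def domain_score(answers: list[tuple[int, int]]) -> float:
    """answers: (points, selected_level) pairs; returns a 0-100% score."""
    earned = sum(points * level for points, level in answers)
    maximum = sum(points * 5 for points, _ in answers)
    return 100 * earned / maximum

# Domain 1 point values (Q1-Q8) with example maturity selections:
answers = [(1, 3), (2, 3), (3, 2), (4, 2), (6, 1), (5, 1), (7, 2), (8, 2)]
print(f"{domain_score(answers):.1f}%")  # ~35.6% (Level 2 - Managed)
```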

Domain 1: Data Sovereignty

Domain Overview

This domain assesses an organization's ultimate control over its data, independent of external jurisdictions or political influences. It goes beyond basic data residency by focusing on legal control, access, and encryption management. Maturity here confirms that data location is actively governed by the organization's legal and business requirements, rather than dictated solely by a cloud provider or foreign law.

Key Concepts to Explain

Common Customer Misconceptions

Watch Out For

  • "We use a local cloud region, so we have data sovereignty" - Physical location alone doesn't guarantee sovereignty if the provider is subject to foreign law
  • "Encryption protects our sovereignty" - Not if the cloud provider controls the keys or can be compelled to decrypt
  • "GDPR compliance means we have data sovereignty" - Compliance is necessary but not sufficient for true sovereignty
  • "We don't have sensitive data" - Most organizations underestimate the sensitivity and value of their data assets

Domain 1 Question Guide

Q1: Data Residency & Location (1 point - Foundation)

What this measures: Whether the organization explicitly controls where data is stored based on legal requirements

Key questions to ask:

  • "Do you have a written data residency policy?"
  • "Can you show me which cloud regions your data is stored in?"
  • "How do you prevent data from being accidentally stored outside approved regions?"
  • "What happens if a cloud provider wants to move your data for operational reasons?"

Evidence to request: Data residency policy document, cloud provider contracts specifying regions, configuration screenshots showing geo-restrictions

Red flags: "We think it's in [region]", "The cloud provider handles that", "We haven't checked recently"

Q2: Data Protection & Privacy (2 points - Foundation)

What this measures: Compliance with data protection regulations and implementation of privacy controls

Key questions to ask:

  • "Which data protection regulations apply to you? (GDPR, CCPA, PIPL, etc.)"
  • "How do you handle data subject rights requests (access, deletion, portability)?"
  • "Do you have a Data Protection Officer or equivalent role?"
  • "How are cross-border data transfers authorized and tracked?"

Evidence to request: Privacy policies, consent management systems, GDPR compliance documentation, Privacy Impact Assessments

Red flags: Confusion about applicable regulations, no defined process for data subject requests, relying solely on vendor certifications

Q3: Data Classification and Inventory (3 points - Foundation)

What this measures: Whether the organization knows what data it has, where it is, and how sensitive it is

Key questions to ask:

  • "Do you have a complete inventory of your data assets?"
  • "What classification levels do you use? (Public, Internal, Confidential, Restricted)"
  • "How do you discover and classify new data automatically?"
  • "Who owns each data asset and is accountable for its protection?"

Evidence to request: Data inventory/catalog, classification framework document, data discovery tool demonstrations, data ownership registers

Red flags: "We're working on that", manual spreadsheet-based tracking, no data ownership assigned

Q4: Legal & Jurisdictional Control (4 points - Strategic)

What this measures: Ability to resist extra-territorial legal demands and maintain domestic legal control

Key questions to ask:

  • "What jurisdiction's law governs your cloud contracts?"
  • "How would you respond to a foreign government data access request?"
  • "Do you have contractual provisions requiring vendors to notify you of legal demands?"
  • "Have you assessed conflicts between foreign laws (CLOUD Act) and domestic requirements?"

Evidence to request: Vendor contracts showing governing law clauses, legal risk register, documented escalation procedures

Red flags: Contracts governed by foreign law, no notification provisions, unaware of jurisdictional conflicts

Q5: Cryptographic Key Management Control (6 points - Strategic)

What this measures: Whether the organization exclusively controls encryption keys, independent of cloud providers

Key questions to ask:

  • "Who generates and stores your encryption keys?"
  • "Can your cloud provider access your encryption keys?"
  • "Do you use HSMs (Hardware Security Modules)? Where are they located?"
  • "How frequently do you rotate encryption keys?"
  • "What would happen if your provider received a legal demand to decrypt your data?"

Evidence to request: Key management architecture diagrams, HSM procurement/contracts, key rotation policies, external key management (EKM) solution documentation

Red flags: Provider-managed keys, lack of HSMs, no key rotation schedule, unclear about who can access keys

Note: This is a 6-point question because key control is fundamental to data sovereignty. Organizations often struggle here.
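
For customers on AWS KMS, you can illustrate what evidence of key control looks like with a boto3 sketch that flags customer-managed keys without automatic rotation enabled. This is an example check, not a complete key-management audit:

```python
# Hedged sketch, assuming AWS KMS and boto3: list customer-managed keys
# and flag any without automatic rotation. Analogous checks exist for
# other providers' key-management services.
import boto3

kms = boto3.client("kms")
for key in kms.list_keys()["Keys"]:
    meta = kms.describe_key(KeyId=key["KeyId"])["KeyMetadata"]
    # Only customer-managed symmetric keys support automatic rotation
    if meta["KeyManager"] != "CUSTOMER" or meta["KeySpec"] != "SYMMETRIC_DEFAULT":
        continue
    status = kms.get_key_rotation_status(KeyId=key["KeyId"])
    if not status["KeyRotationEnabled"]:
        print(f"Rotation disabled for key {key['KeyId']}")
```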

Q6: Workload Data Protection & Privacy (5 points - Strategic)

What this measures: Protection of data during processing (data-in-use), not just storage and transit

Key questions to ask:

  • "How do you protect data while it's being processed in memory?"
  • "Are you using confidential computing or Trusted Execution Environments (TEEs)?"
  • "Can cloud administrators access data in memory during processing?"
  • "How do you ensure sensitive data isn't logged in plaintext?"

Evidence to request: Confidential computing implementations (Intel SGX, AMD SEV, AWS Nitro Enclaves), memory encryption configurations, log sanitization policies

Red flags: Unaware of data-in-use protection, relying only on at-rest and in-transit encryption, plaintext logging of sensitive data

Note: This is often Level 1-2 for most organizations; confidential computing is still emerging.
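
The plaintext-logging question is easy to demonstrate. Here is a minimal sketch using Python's standard logging module: a filter that redacts simple secret patterns before records are written. Production deployments would rely on vetted patterns and a central log pipeline, but the principle is the same:

```python
# Minimal log-sanitization sketch: redact secret-like patterns at the logger.
import logging
import re

SECRET_PATTERNS = [
    re.compile(r"\b\d{16}\b"),        # 16-digit card-like numbers
    re.compile(r"(?i)password=\S+"),  # password=... pairs
]

class RedactingFilter(logging.Filter):
    def filter(self, record: logging.LogRecord) -> bool:
        msg = record.getMessage()
        for pattern in SECRET_PATTERNS:
            msg = pattern.sub("[REDACTED]", msg)
        record.msg, record.args = msg, None
        return True

logging.basicConfig(level=logging.INFO)
logging.getLogger().addFilter(RedactingFilter())
logging.info("login attempt password=hunter2 card 4111111111111111")
# -> login attempt [REDACTED] card [REDACTED]
```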

Q7: Data Flow and Transfer Auditing (7 points - Advanced)

What this measures: Real-time monitoring and immutable logging of all data movements

Key questions to ask:

  • "Can you show me where your data flows across systems?"
  • "Do you have Data Loss Prevention (DLP) tools deployed?"
  • "How do you monitor and prevent unauthorized data transfers?"
  • "Are your audit logs immutable and stored sovereignly?"
  • "How quickly can you detect an unauthorized cross-border data transfer?"

Evidence to request: Data flow maps, DLP dashboards, audit log retention policies, SIEM integration, transfer blocking evidence

Red flags: No data flow visibility, reactive rather than preventive controls, logs stored with cloud provider
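
If it helps to make "detect an unauthorized transfer" tangible, the following sketch scans hypothetical transfer-log records for movements outside an approved-destination list. The field names are invented for the example; in practice this logic lives in a DLP tool or a SIEM rule:

```python
# Purely illustrative: flag data movements to non-approved destinations.
APPROVED_DESTINATIONS = {"eu-central-1", "eu-west-1"}

transfer_log = [  # hypothetical records, e.g. parsed from a SIEM export
    {"source": "crm-db", "destination_region": "eu-west-1", "bytes": 10_240},
    {"source": "crm-db", "destination_region": "us-east-1", "bytes": 52_428_800},
]

for event in transfer_log:
    if event["destination_region"] not in APPROVED_DESTINATIONS:
        print(f"ALERT: {event['source']} sent {event['bytes']} bytes "
              f"to {event['destination_region']}")
```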

Q8: Third-Party Data Access Policies (8 points - Advanced)

What this measures: Strict, audited, and revocable control over vendor and partner access to data

Key questions to ask:

  • "Which third parties have access to your data? Why?"
  • "Do you use Just-in-Time (JIT) access for vendor support?"
  • "How do you monitor and record third-party access sessions?"
  • "Can you immediately revoke vendor access in an emergency?"
  • "Where are vendor support personnel located geographically?"

Evidence to request: Third-party access policies, Privileged Access Management (PAM) systems, session recordings, vendor risk assessments

Red flags: Persistent vendor access, no session monitoring, vendors located in concerning jurisdictions, inability to quickly revoke access

Note: This is the highest point value question as third-party access is a primary sovereignty risk.
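
The Just-in-Time concept is worth illustrating: grants carry an explicit expiry and are checked on every use, so vendor access lapses by default. The Python sketch below is conceptual, not a particular PAM product's API:

```python
# Conceptual JIT access sketch: time-boxed, revocable vendor grants.
from datetime import datetime, timedelta, timezone

class AccessGrant:
    def __init__(self, vendor: str, scope: str, duration: timedelta):
        self.vendor, self.scope = vendor, scope
        self.expires_at = datetime.now(timezone.utc) + duration
        self.revoked = False

    def is_valid(self) -> bool:
        return not self.revoked and datetime.now(timezone.utc) < self.expires_at

grant = AccessGrant("support-vendor-a", "db-read-only", timedelta(hours=4))
assert grant.is_valid()
grant.revoked = True        # emergency revocation takes effect immediately
assert not grant.is_valid()
```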

Domain 2: Technical Sovereignty

Domain Overview

Technical Sovereignty evaluates the degree of control an organization maintains over the foundational components of its technology stack—from hardware and firmware to application source code and runtime environments. High maturity signifies deliberate reduction in reliance on proprietary interfaces and single-vendor ecosystems, ensuring the ability to rebuild or migrate critical functions if necessary.

Key Focus Areas: Technology stack ownership, vendor lock-in mitigation, standardized frameworks, interoperability, hardware provenance, self-hosted runtimes, IP control, future-proofing

Common Discussion Topics: Open source adoption, Kubernetes and containerization, multi-cloud strategies, escrow agreements, supply chain security

Domain 3: Operational Sovereignty

Domain Overview

This domain examines the organization's autonomy and independence in executing critical business and IT operations. It ensures that essential functions can be performed without reliance on external human expertise or infrastructure outside the organization's direct control or trusted sovereign borders.

Key Focus Areas: Process documentation, managed service dependencies, IAM, internal skills, disaster recovery, supply chain vetting, incident response, operational autonomy

Common Discussion Topics: Break-glass procedures, in-house vs. outsourced operations, business continuity planning, geopolitical isolation scenarios

Domain 4: Assurance Sovereignty

Domain Overview

Assurance Sovereignty addresses the right, capability, and transparency required to verify the security and compliance claims of both internal systems and external vendors. It's the mechanism by which trust is verified, not assumed, through independent audits and continuous technical validation.

Key Focus Areas: Audit rights, sovereign SIEM, compliance verification, transparency requirements, sovereign certifications, continuous monitoring, security testing, vulnerability management

Common Discussion Topics: Right to audit clauses, SOC 2 Type II, penetration testing, third-party attestations, domestic vs. foreign auditors

Domain 5: Open Source

Domain Overview

This domain assesses the organization's strategic use of open-source software to reduce proprietary dependencies, increase transparency, and build internal capabilities. Mature organizations actively contribute to and influence open-source projects critical to their sovereignty goals.

Key Focus Areas: Open source strategy, community participation, license compliance, vulnerability management, sovereign distributions, contribution policies, internal expertise, project governance

Common Discussion Topics: Red Hat Enterprise Linux, Kubernetes, Apache projects, InnerSource, security scanning, open source vs. commercial support

Domain 6: Executive Oversight

Domain Overview

Executive Oversight ensures that sovereignty concerns are understood, prioritized, and actively managed at the highest levels of the organization. This domain measures board and C-suite engagement, dedicated budgets, governance structures, and accountability for sovereignty outcomes.

Key Focus Areas: Board awareness, dedicated governance, budget allocation, sovereignty policies, risk management, accountability, strategic planning, regulatory engagement

Common Discussion Topics: Board reporting, sovereignty champions, dedicated budgets vs. embedded costs, KPIs and metrics, regulatory relationships

Domain 7: Managed Services

Domain Overview

This domain evaluates how the organization manages relationships with external managed service providers while maintaining sovereignty. It addresses vendor selection criteria, contractual controls, geographic restrictions, transition planning, and the balance between operational efficiency and sovereign control.

Key Focus Areas: Vendor selection criteria, contractual controls, geographic restrictions, data access limitations, performance monitoring, transition planning, alternatives evaluation, insourcing capabilities

Common Discussion Topics: Domestic vs. foreign MSPs, data center locations, support personnel jurisdictions, exit strategies, dual-source strategies

Post-Assessment Activities

Results Interpretation

After completing the assessment, guide the customer through understanding their results.

Understanding the Spider Chart

The spider/radar chart provides a visual representation of maturity across all domains:
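
If you need to reproduce the chart outside the tool, for example in a slide deck, a minimal matplotlib sketch with example scores looks like this:

```python
# Minimal radar-chart sketch of per-domain scores using matplotlib.
import matplotlib.pyplot as plt
import numpy as np

domains = ["Data", "Technical", "Operational", "Assurance",
           "Open Source", "Exec Oversight", "Managed Svcs"]
scores = [40, 55, 30, 25, 60, 50, 45]  # example percentages

angles = np.linspace(0, 2 * np.pi, len(domains), endpoint=False).tolist()
angles += angles[:1]          # repeat the first point to close the polygon
values = scores + scores[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.plot(angles, values)
ax.fill(angles, values, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(domains)
ax.set_ylim(0, 100)
plt.show()
```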

Typical Results

Most organizations on their first assessment score:

  • Overall average: 30-45% (Managed to Defined levels)
  • Strong domains: Often Executive Oversight, basic Data Protection
  • Weak domains: Often Cryptographic Key Management, Workload Protection, Operational Autonomy

Reassure customers that these results are normal starting points, not failures.

Score Discussion Points

Gap Analysis and Prioritization

Work with the customer to translate scores into actionable priorities:

Prioritization Framework

Priority | Criteria | Example
Critical (0-3 months) | Regulatory requirement, high industry weighting, Level 1 on high-point questions | Implementing HSM-based key management for healthcare patient data
High (3-6 months) | Significant sovereignty risk, medium weighting, Level 2 on strategic questions | Establishing sovereign audit rights with cloud providers
Medium (6-12 months) | Important capability, standard weighting, Level 2-3 on foundation questions | Implementing data classification and discovery tools
Low (12+ months) | Optimization, already at Level 3+, advanced questions | Establishing open source contribution programs
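
If the customer wants a numeric starting point for ranking gaps, the sketch below combines the framework's three inputs: question points, industry weighting, and distance from a target level. Treat the output as a conversation starter, not a formula the assessment tool itself uses:

```python
# Hypothetical gap-ranking heuristic built from the framework's inputs.
def gap_priority(points: int, weighting: float,
                 current_level: int, target_level: int = 3) -> float:
    return points * weighting * max(0, target_level - current_level)

gaps = {
    "Key management (Q5)": gap_priority(6, 2.0, current_level=1),
    "Third-party access (Q8)": gap_priority(8, 2.0, current_level=2),
    "Data residency (Q1)": gap_priority(1, 2.0, current_level=3),
}
for name, score in sorted(gaps.items(), key=lambda kv: -kv[1]):
    print(f"{score:5.1f}  {name}")
```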

Recommended Roadmap Structure

Phase 1: Foundation (0-6 months)

Policy development, basic controls, compliance alignment, data inventory, vendor assessment

Phase 2: Strategic (6-18 months)

Technical implementations, key management, vendor migrations, skills development, tooling deployment

Phase 3: Advanced (18-36 months)

Continuous monitoring, optimization, innovation, industry leadership, operational autonomy

Exporting and Sharing Results

Help customers export and distribute results appropriately:

Next Steps and Follow-up

Schedule follow-up activities to maintain momentum:

Recommended Follow-up Schedule

  • Week 1: Send detailed results summary and initial recommendations
  • Week 2-3: Schedule roadmap planning workshop (2 hours)
  • Month 2: Check-in on quick wins and foundation initiatives
  • Quarter 2: Progress review and assessment update for changed answers
  • Annual: Full reassessment to measure improvement

Engagement Opportunities

The assessment often reveals opportunities for further engagement:

Facilitator Tips & Best Practices

Do's

Don'ts

Remote Facilitation Tips

When conducting assessments remotely:

Dealing with Challenging Personalities

The Over-Confident Executive

Behavior: Claims high maturity without evidence, dismisses concerns, believes "we have the best security"

Approach: Acknowledge their confidence, then request specific evidence. Use data and industry benchmarks. Ask their technical team to verify claims. Frame lower scores as "industry-standard journey" rather than failures.

The "Too Busy" Participant

Behavior: Late to session, distracted, checking phone, wants to rush through

Approach: Respectfully emphasize the value of their time investment. Show early results to demonstrate value. Offer to reschedule if they can't focus. Break into shorter sessions if needed.

The Defensive CISO

Behavior: Takes low scores personally, explains why gaps aren't their fault, blames budget/management

Approach: Emphasize this is organizational assessment, not personal evaluation. Validate resource constraints. Position results as ammunition for budget requests. Frame gaps as opportunities to demonstrate need for investment.

The Technical Perfectionist

Behavior: Debates every nuance, wants to discuss technical details extensively, struggles to choose between maturity levels

Approach: Appreciate their thoroughness. Set time limits for each question. Offer to deep-dive on specific topics afterward. Remind that perfect accuracy is less important than directional understanding. Use "parking lot" for detailed technical discussions.

Appendix

Downloadable Templates

Ready-to-use templates are available to support your assessment delivery:

Access All Templates

Visit the Templates Library for:

  • Full-Day Workshop Agenda: Comprehensive one-day format with detailed schedule
  • Short Assessment Agenda: 2-hour focused assessment format
  • Email Templates: Pre-written emails for invitation, preparation, follow-up, and check-ins
  • Executive Summary Template: One-page results summary for C-suite presentation

Glossary of Key Terms

Term | Definition
BYOK | Bring Your Own Key - Customer-generated encryption keys imported to cloud provider
CLOUD Act | US law allowing government access to data held by US companies regardless of location
Confidential Computing | Protection of data during processing using hardware-based secure enclaves
Data Residency | Physical location where data is stored
Data Sovereignty | Legal and technical control over data, including ability to resist foreign access demands
DLP | Data Loss Prevention - Tools to monitor and prevent unauthorized data transfers
EKM | External Key Management - Encryption keys managed outside cloud provider infrastructure
HSM | Hardware Security Module - Dedicated cryptographic processor for key management
PAM | Privileged Access Management - System for controlling and monitoring administrative access
SCA | Software Composition Analysis - Scanning third-party code for vulnerabilities
TEE | Trusted Execution Environment - Secure area of processor for sensitive operations
Zero Trust | Security model assuming no implicit trust, requiring verification for all access

Reference Materials

Sample Email Templates

Pre-Assessment Email

Subject: Preparation for Digital Sovereignty Maturity Assessment - [Date]

Dear [Stakeholders],

Thank you for scheduling a Full Maturity Assessment. This session will evaluate your organization's Digital Sovereignty capabilities across 7 key domains using a proven 5-level maturity framework.

Session Details:
Date/Time: [Date/Time]
Location/Link: [Details]

Required Participants: CIO/CTO, CISO, Cloud/Infrastructure Lead, Compliance Officer

Please prepare:

  • List of cloud providers and services used
  • Current compliance frameworks and certifications
  • Data classification and residency policies
  • Key vendor relationships and contracts

Looking forward to our session.

Best regards,
[Your Name]

Post-Assessment Email

Subject: Digital Sovereignty Assessment Results and Next Steps

Dear [Stakeholders],

Thank you for participating in yesterday's maturity assessment. Your engagement and candor were excellent.

Key Findings:

  • Overall maturity: [X]% ([Maturity Level])
  • Strongest domain: [Domain] at [Y]%
  • Priority gap: [Domain] at [Z]%

Attached you'll find:

  • Detailed results export
  • Spider chart visualization
  • Initial recommendations summary

Recommended Next Steps:

  1. Review results with your teams (Week 1)
  2. Roadmap planning workshop (Week 2-3)
  3. Prioritize quick wins for immediate action

I'll follow up next week to schedule our roadmap session.

Best regards,
[Your Name]

Quick Reference: Maturity Level Indicators

Level | Key Indicators | Common Language
1 | No policy, ad-hoc, reactive, "we're planning to" | "We know we need to do this"
2 | Draft policies, pilots, project plans, some implementation | "We're working on it"
3 | Approved policies, widespread deployment, documented standards | "We have this in place"
4 | Metrics, dashboards, KPIs, regular reporting, measured outcomes | "We measure and optimize this"
5 | Continuous improvement, innovation, industry leadership | "We're leading the industry"