claude-skills-reference/c-level-advisor/cto-advisor/references/technology_evaluation_framework.md
Reza Rezvani 619f7be887 feat: add C-level advisor skills (CEO & CTO) and packaged skill archives
2025-10-19 06:06:54 +02:00


Technology Evaluation Framework

Evaluation Process

Phase 1: Requirements Gathering (Week 1)

Functional Requirements

  • Core features needed
  • Integration requirements
  • Performance requirements
  • Scalability needs
  • Security requirements

Non-Functional Requirements

  • Usability/Developer experience
  • Documentation quality
  • Community support
  • Vendor stability
  • Compliance needs

Constraints

  • Budget limitations
  • Timeline constraints
  • Team expertise
  • Existing technology stack
  • Regulatory requirements

Phase 2: Market Research (Weeks 1-2)

Identify Candidates

  1. Industry leaders (Gartner Magic Quadrant)
  2. Open-source alternatives
  3. Emerging solutions
  4. Build vs Buy analysis

Initial Filtering

  • Eliminate options not meeting hard requirements
  • Remove options outside budget
  • Focus on 3-5 top candidates
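The initial filtering step above can be sketched in Python. This is a minimal, hypothetical example, assuming each candidate is a dict with a feature set and an annual cost; the names and figures are illustrative, not real vendors:

```python
def shortlist(candidates, hard_requirements, budget):
    """Keep candidates that meet every hard requirement and fit the budget."""
    survivors = []
    for c in candidates:
        meets_all = all(req in c["features"] for req in hard_requirements)
        if meets_all and c["annual_cost"] <= budget:
            survivors.append(c)
    # Rank cheapest-first and cap the shortlist at five.
    return sorted(survivors, key=lambda c: c["annual_cost"])[:5]

# Illustrative candidate list (placeholder names and costs).
candidates = [
    {"name": "VendorA", "features": {"sso", "audit-log", "api"}, "annual_cost": 40_000},
    {"name": "VendorB", "features": {"api"}, "annual_cost": 10_000},
    {"name": "OSS-C", "features": {"sso", "audit-log", "api"}, "annual_cost": 15_000},
]
print([c["name"] for c in shortlist(candidates, {"sso", "audit-log"}, budget=50_000)])
# → ['OSS-C', 'VendorA']  (VendorB is eliminated: missing hard requirements)
```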

Phase 3: Deep Evaluation (Weeks 2-4)

Technical Evaluation

  • Proof of Concept (PoC)
  • Performance benchmarks
  • Security assessment
  • Integration testing
  • Scalability testing

Business Evaluation

  • Total Cost of Ownership (TCO)
  • Return on Investment (ROI)
  • Vendor assessment
  • Risk analysis
  • Exit strategy

Phase 4: Decision (Week 4)

Evaluation Criteria Matrix

Technical Criteria (40%)

| Criterion | Weight | Description | Scoring Guide |
|-----------|--------|-------------|---------------|
| Performance | 10% | Speed, throughput, latency | 5: Exceeds requirements; 3: Meets requirements; 1: Below requirements |
| Scalability | 10% | Ability to grow with needs | 5: Linear scalability; 3: Some limitations; 1: Hard limits |
| Reliability | 8% | Uptime, fault tolerance | 5: 99.99% SLA; 3: 99.9% SLA; 1: <99% SLA |
| Security | 8% | Security features, compliance | 5: Exceeds standards; 3: Meets standards; 1: Concerns exist |
| Integration | 4% | API quality, compatibility | 5: Native integration; 3: Good APIs; 1: Limited integration |

Business Criteria (30%)

| Criterion | Weight | Description | Scoring Guide |
|-----------|--------|-------------|---------------|
| Cost | 10% | TCO including licenses, operation | 5: Under budget by >20%; 3: Within budget; 1: Over budget |
| ROI | 8% | Value generation potential | 5: <6 month payback; 3: <12 month payback; 1: >24 month payback |
| Vendor Stability | 6% | Financial health, market position | 5: Market leader; 3: Established player; 1: Startup/uncertain |
| Support Quality | 6% | Support availability, SLAs | 5: 24/7 premium support; 3: Business hours; 1: Community only |

Operational Criteria (30%)

| Criterion | Weight | Description | Scoring Guide |
|-----------|--------|-------------|---------------|
| Ease of Use | 8% | Learning curve, UX | 5: Intuitive; 3: Moderate learning; 1: Steep curve |
| Documentation | 7% | Quality, completeness | 5: Excellent docs; 3: Adequate docs; 1: Poor docs |
| Community | 7% | Size, activity, resources | 5: Large, active; 3: Moderate; 1: Small/inactive |
| Maintenance | 8% | Operational overhead | 5: Fully managed; 3: Some maintenance; 1: High maintenance |
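The three criteria groups collapse into a single weighted score per candidate. A minimal sketch, using the weights from the matrices above (they sum to 100%) and the 1-5 scoring scale; the sample candidate is illustrative:

```python
# Weights taken from the criteria matrices: Technical 40%, Business 30%,
# Operational 30%.
WEIGHTS = {
    "performance": 0.10, "scalability": 0.10, "reliability": 0.08,
    "security": 0.08, "integration": 0.04,
    "cost": 0.10, "roi": 0.08, "vendor_stability": 0.06, "support_quality": 0.06,
    "ease_of_use": 0.08, "documentation": 0.07, "community": 0.07,
    "maintenance": 0.08,
}
assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights must total 100%

def weighted_score(scores):
    """Combine per-criterion 1-5 scores into one weighted 1-5 score."""
    return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

# A candidate that "meets requirements" (3) everywhere except performance (5).
candidate = {c: 3 for c in WEIGHTS}
candidate["performance"] = 5
print(round(weighted_score(candidate), 2))  # → 3.2
```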

Vendor Evaluation Template

Vendor Profile

  • Company Name:
  • Founded:
  • Headquarters:
  • Employees:
  • Revenue:
  • Funding (if applicable):
  • Key Customers:

Product Assessment

Strengths

  • Market leader position
  • Strong feature set
  • Good performance
  • Excellent support
  • Active development

Weaknesses

  • Price point
  • Learning curve
  • Limited customization
  • Vendor lock-in
  • Missing features

Opportunities

  • Roadmap alignment
  • Partnership potential
  • Training availability
  • Professional services

Threats

  • Competitive alternatives
  • Market changes
  • Technology shifts
  • Acquisition risk

Financial Analysis

Cost Breakdown

| Component | Year 1 | Year 2 | Year 3 | Total |
|-----------|--------|--------|--------|-------|
| Licensing | $ | $ | $ | $ |
| Implementation | $ | $ | $ | $ |
| Training | $ | $ | $ | $ |
| Support | $ | $ | $ | $ |
| Infrastructure | $ | $ | $ | $ |
| **Total** | $ | $ | $ | $ |

ROI Calculation

  • Cost Savings:
    • Reduced manual work: $/year
    • Efficiency gains: $/year
    • Error reduction: $/year
  • Revenue Impact:
    • New capabilities: $/year
    • Faster time to market: $/year
  • Payback Period: X months
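The payback calculation above reduces to total cost divided by monthly benefit, where annual benefit is the sum of the cost-savings and revenue-impact lines. A sketch with illustrative placeholder figures (all dollar amounts are assumptions, not template values):

```python
def payback_months(total_cost, annual_benefit):
    """Months until cumulative benefit covers the up-front cost."""
    monthly_benefit = annual_benefit / 12
    return total_cost / monthly_benefit

# Illustrative numbers only:
savings = 50_000 + 30_000 + 10_000   # reduced manual work + efficiency + error reduction
revenue = 40_000 + 20_000            # new capabilities + faster time to market
print(round(payback_months(total_cost=75_000, annual_benefit=savings + revenue), 1))
# → 6.0 (months)
```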

Risk Assessment

| Risk | Probability | Impact | Mitigation |
|------|-------------|--------|------------|
| Vendor goes out of business | Low/Med/High | Low/Med/High | Strategy |
| Technology becomes obsolete | | | |
| Integration difficulties | | | |
| Team adoption challenges | | | |
| Budget overrun | | | |
| Performance issues | | | |
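To prioritize mitigation effort, the Low/Med/High ratings above can be turned into a numeric risk score (probability × impact) and sorted. A minimal sketch; the sample ratings are illustrative:

```python
LEVELS = {"Low": 1, "Med": 2, "High": 3}

def risk_score(probability, impact):
    """Classic probability x impact score on a 1-9 scale."""
    return LEVELS[probability] * LEVELS[impact]

# Illustrative ratings for three of the risks in the table:
risks = [
    ("Vendor goes out of business", "Low", "High"),
    ("Integration difficulties", "Med", "Med"),
    ("Budget overrun", "High", "Med"),
]
for name, p, i in sorted(risks, key=lambda r: -risk_score(r[1], r[2])):
    print(f"{risk_score(p, i)}  {name}")
# → 6  Budget overrun
#   4  Integration difficulties
#   3  Vendor goes out of business
```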

Build vs Buy Decision Framework

When to Build

Advantages:

  • Full control over features
  • No vendor lock-in
  • Potential competitive advantage
  • Perfect fit for requirements
  • No licensing costs

Build when:

  • Core business differentiator
  • Unique requirements
  • Long-term investment
  • Have expertise in-house
  • No suitable solutions exist

Hidden Costs:

  • Development time
  • Maintenance burden
  • Security responsibility
  • Documentation needs
  • Training requirements

When to Buy

Advantages:

  • Faster time to market
  • Proven solution
  • Vendor support
  • Regular updates
  • Shared development costs

Buy when:

  • Commodity functionality
  • Standard requirements
  • Limited internal resources
  • Need quick solution
  • Good options available

Hidden Costs:

  • Customization limits
  • Vendor lock-in
  • Integration effort
  • Training needs
  • Scaling costs

When to Adopt Open Source

Advantages:

  • No licensing costs
  • Community support
  • Transparency
  • Customizable
  • No vendor lock-in

Adopt when:

  • Strong community exists
  • Standard solution needed
  • Have technical expertise
  • Can contribute back
  • Long-term stability needed

Hidden Costs:

  • Support costs
  • Security responsibility
  • Upgrade management
  • Integration effort
  • Potential consulting needs

Proof of Concept Guidelines

PoC Scope

  1. Duration: 2-4 weeks
  2. Team: 2-3 engineers
  3. Environment: Isolated/sandbox
  4. Data: Representative sample

Success Criteria

  • Core use cases demonstrated
  • Performance benchmarks met
  • Integration points tested
  • Security requirements validated
  • Team feedback positive

PoC Checklist

  • Environment setup documented
  • Test scenarios defined
  • Metrics collection automated
  • Team training completed
  • Results documented
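The "metrics collection automated" item can be as simple as a timing harness that reports latency percentiles against the targets in the report template. A minimal sketch; `operation` is a stand-in for the real PoC workload:

```python
import statistics
import time

def benchmark(operation, runs=100):
    """Time `operation` repeatedly and summarize latency in milliseconds."""
    latencies_ms = []
    for _ in range(runs):
        start = time.perf_counter()
        operation()
        latencies_ms.append((time.perf_counter() - start) * 1000)
    return {
        "p50_ms": statistics.median(latencies_ms),
        "p95_ms": statistics.quantiles(latencies_ms, n=20)[-1],
        "max_ms": max(latencies_ms),
    }

# Placeholder workload; replace with a representative PoC operation.
results = benchmark(lambda: sum(range(10_000)))
print({k: round(v, 3) for k, v in results.items()})
print("PASS" if results["p95_ms"] < 100 else "FAIL")  # target from the template: <100ms
```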

PoC Report Template

# PoC Report: [Technology Name]

## Executive Summary
- **Recommendation**: [Proceed/Stop/Investigate Further]
- **Confidence Level**: [High/Medium/Low]
- **Key Finding**: [One sentence summary]

## Test Results

### Functional Tests
| Test Case | Result | Notes |
|-----------|--------|-------|
| | Pass/Fail | |

### Performance Tests
| Metric | Target | Actual | Status |
|--------|--------|--------|---------|
| Response Time | <100ms | Xms | ✓/✗ |
| Throughput | >1000 req/s | X req/s | ✓/✗ |
| CPU Usage | <70% | X% | ✓/✗ |
| Memory Usage | <4GB | XGB | ✓/✗ |

### Integration Tests
| System | Status | Effort |
|--------|--------|--------|
| Database | ✓/✗ | Low/Med/High |
| API Gateway | ✓/✗ | Low/Med/High |
| Authentication | ✓/✗ | Low/Med/High |

## Team Feedback
- **Ease of Use**: [1-5 rating]
- **Documentation**: [1-5 rating]
- **Would Recommend**: [Yes/No]

## Risks Identified
1. [Risk and mitigation]
2. [Risk and mitigation]

## Next Steps
1. [Action item]
2. [Action item]

Technology Categories

Development Platforms

  • Languages: TypeScript, Python, Go, Rust, Java
  • Frameworks: React, Node.js, Spring, Django, FastAPI
  • Mobile: React Native, Flutter, Swift, Kotlin
  • Evaluation Focus: Developer productivity, ecosystem, performance

Databases

  • SQL: PostgreSQL, MySQL, SQL Server
  • NoSQL: MongoDB, Cassandra, DynamoDB
  • NewSQL: CockroachDB, Vitess, TiDB
  • Evaluation Focus: Performance, scalability, consistency, operations

Infrastructure

  • Cloud: AWS, GCP, Azure
  • Containers: Docker, Kubernetes, Nomad
  • Serverless: Lambda, Cloud Functions, Vercel
  • Evaluation Focus: Cost, scalability, vendor lock-in, operations

Monitoring & Observability

  • APM: DataDog, New Relic, AppDynamics
  • Logging: ELK Stack, Splunk, CloudWatch
  • Metrics: Prometheus, Grafana, CloudWatch
  • Evaluation Focus: Coverage, cost, integration, insights

Security

  • SAST: SonarQube, Checkmarx, Veracode
  • DAST: OWASP ZAP, Burp Suite
  • Secrets: Vault, AWS Secrets Manager
  • Evaluation Focus: Coverage, false positives, integration

DevOps Tools

  • CI/CD: Jenkins, GitLab CI, GitHub Actions
  • IaC: Terraform, CloudFormation, Pulumi
  • Configuration: Ansible, Chef, Puppet
  • Evaluation Focus: Flexibility, integration, learning curve

Continuous Evaluation

Quarterly Reviews

  • Technology landscape changes
  • Performance against expectations
  • Cost optimization opportunities
  • Team satisfaction
  • Market alternatives

Annual Assessment

  • Full technology stack review
  • Vendor relationship evaluation
  • Strategic alignment check
  • Technical debt assessment
  • Roadmap planning

Deprecation Planning

  • Migration strategy
  • Timeline definition
  • Risk assessment
  • Communication plan
  • Success metrics

Decision Documentation

Always document:

  1. Why the technology was chosen
  2. Who was involved in the decision
  3. When the decision was made
  4. What alternatives were considered
  5. How success will be measured

Use Architecture Decision Records (ADRs) for significant technology choices.