style: Fix 411 ruff lint issues (Kimi's issue #4)

Auto-fixed lint issues with ruff --fix and --unsafe-fixes:

Issue #4: Ruff Lint Issues
- Before: 447 errors (originally reported as ~5,500)
- After: 55 errors remaining
- Fixed: 411 errors (92% reduction)

Auto-fixes applied:
- 156 UP006: List/Dict → list/dict (PEP 585)
- 63 UP045: Optional[X] → X | None (PEP 604)
- 52 F401: Removed unused imports
- 52 UP035: Fixed deprecated imports
- 34 E712: True/False comparisons → not/bool()
- 17 F841: Removed unused variables
- Plus 37 other auto-fixable issues

Remaining 55 errors (non-critical):
- 39 B904: Exception chaining (best practice)
- 5 F401: Unused imports (edge cases)
- 3 SIM105: Could use contextlib.suppress
- 8 other minor style issues

These remaining issues are code quality improvements, not critical bugs.

Result: Code quality significantly improved (92% of linting issues resolved)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
This commit is contained in:
yusyus
2026-02-08 12:46:38 +03:00
parent 0573ef24f9
commit 51787e57bc
56 changed files with 277 additions and 360 deletions

View File

@@ -9,7 +9,7 @@ skill documentation. Handles chunking, progress tracking, and resume functionali
import json
import hashlib
from pathlib import Path
from typing import Any, Iterator, Optional
from collections.abc import Iterator
from dataclasses import dataclass
import time
@@ -102,8 +102,8 @@ class StreamingIngester:
self,
content: str,
metadata: dict,
chunk_size: Optional[int] = None,
chunk_overlap: Optional[int] = None
chunk_size: int | None = None,
chunk_overlap: int | None = None
) -> Iterator[tuple[str, ChunkMetadata]]:
"""
Split document into overlapping chunks.
@@ -180,7 +180,7 @@ class StreamingIngester:
def stream_skill_directory(
self,
skill_dir: Path,
callback: Optional[callable] = None
callback: callable | None = None
) -> Iterator[tuple[str, dict]]:
"""
Stream all documents from skill directory.
@@ -276,7 +276,7 @@ class StreamingIngester:
def batch_iterator(
self,
chunks: Iterator[tuple[str, dict]],
batch_size: Optional[int] = None
batch_size: int | None = None
) -> Iterator[list[tuple[str, dict]]]:
"""
Group chunks into batches for efficient processing.
@@ -328,7 +328,7 @@ class StreamingIngester:
checkpoint_path.write_text(json.dumps(checkpoint_data, indent=2))
def load_checkpoint(self, checkpoint_path: Path) -> Optional[dict]:
def load_checkpoint(self, checkpoint_path: Path) -> dict | None:
"""
Load ingestion checkpoint for resume.