# Running Notebooks in the Background

## Quick: Check Ray Tune Progress

**Current run:** PaddleOCR hyperparameter optimization via Ray Tune + Optuna.

- 64 trials searching for optimal detection/recognition thresholds
- 2 CPU workers running in parallel (Docker containers on ports 8001-8002)
- Notebook: `paddle_ocr_raytune_rest.ipynb` → `output_raytune.ipynb`
- Results saved to: `~/ray_results/trainable_paddle_ocr_2026-01-18_17-25-43/`

```bash
# Is it still running?
ps aux | grep papermill | grep -v grep

# View live log
tail -f papermill.log

# Count completed trials (64 total)
find ~/ray_results/trainable_paddle_ocr_2026-01-18_17-25-43/ -name "result.json" ! -empty | wc -l

# Check that both OCR workers are healthy
curl -s localhost:8001/health | jq -r '.status'
curl -s localhost:8002/health | jq -r '.status'
```

---

## Option 1: Papermill (Recommended)

Runs notebooks directly, without converting them to scripts.

```bash
pip install papermill
nohup papermill <notebook>.ipynb output.ipynb > papermill.log 2>&1 &
```

Monitor:

```bash
tail -f papermill.log
```

## Option 2: Convert to a Python Script

```bash
jupyter nbconvert --to script <notebook>.ipynb
nohup python <notebook>.py > output.log 2>&1 &
```

**Note:** `%pip install` magic commands must be removed by hand before the converted `.py` file will run outside IPython.

## Important Notes

- Ray Tune notebooks require the OCR service to be running first (Docker)
- For Ray workers, imports must live inside the trainable function (see the sketch after the example below)

## Example: Ray Tune PaddleOCR

```bash
# 1. Start the OCR service
cd src/paddle_ocr && docker compose up -d ocr-cpu

# 2. Run the notebook with papermill
cd src
nohup papermill paddle_ocr_raytune_rest.ipynb output_raytune.ipynb > papermill.log 2>&1 &

# 3. Monitor
tail -f papermill.log
```
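The "imports inside the trainable function" note above is easier to follow with code, so here is a minimal sketch assuming the Ray Tune + Optuna setup described in the quick-check section. The endpoint path, payload fields, search space, and metric name are illustrative assumptions, not taken from the actual notebook.

```python
from ray import tune
from ray.tune.search.optuna import OptunaSearch  # requires `pip install optuna`


def trainable_paddle_ocr(config):
    # Imports go inside the function: Ray runs each trial in a separate worker
    # process, so module-level imports from the notebook are not visible there.
    import requests

    # Hypothetical REST call -- the real service's endpoint and payload may differ.
    resp = requests.post(
        "http://localhost:8001/ocr/evaluate",
        json={
            "det_db_thresh": config["det_db_thresh"],
            "rec_score_thresh": config["rec_score_thresh"],
        },
        timeout=120,
    )
    resp.raise_for_status()
    # A function trainable may simply return a dict of final metrics.
    return {"accuracy": resp.json().get("accuracy", 0.0)}


tuner = tune.Tuner(
    trainable_paddle_ocr,
    param_space={
        "det_db_thresh": tune.uniform(0.1, 0.6),
        "rec_score_thresh": tune.uniform(0.3, 0.9),
    },
    tune_config=tune.TuneConfig(
        search_alg=OptunaSearch(),
        metric="accuracy",
        mode="max",
        num_samples=64,
    ),
)
results = tuner.fit()
```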
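Papermill can also be driven from Python instead of the shell, which is convenient when you want to parameterize a run. This is a sketch only: the `parameters` dict takes effect only if the notebook has a cell tagged `parameters`, and the parameter name used here is illustrative, not from the actual notebook.

```python
import papermill as pm

# Same run as the nohup/papermill shell command above, launched from Python.
pm.execute_notebook(
    "paddle_ocr_raytune_rest.ipynb",
    "output_raytune.ipynb",
    parameters={"num_samples": 64},  # illustrative parameter name
)
```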