Developing Custom Fine-Tuning Templates
Template Structure Overview
A custom fine-tuning template should include the essential configuration files and training scripts. For example, the YOLOv5 object detection fine-tuning template (finetune-object-detection) typically contains the following components:
- Core training script: Handles the model training logic.
- Utility scripts: Provide helper functions to interact with the platform.
- Configuration files: Specify the training environment and parameters.
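The components above might map onto a layout like the following. This is an illustrative sketch only: run.sh and util.sh are named in this guide, but the remaining file names (train.py, config.yaml) are assumptions, so match them to your own template.

```text
finetune-object-detection/
├── run.sh        # core training script (entry point, named in this guide)
├── util.sh       # platform utility functions (named in this guide)
├── train.py      # model training logic (assumed name)
└── config.yaml   # training environment and parameters (assumed name)
```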
Core Responsibilities and Script Requirements
Your main responsibility is to implement a custom fine-tuning training script (usually named run.sh). To ensure your script integrates smoothly with the Alauda AI platform's sub-tasks, follow these three key requirements:
1. Import Platform Utility Scripts
At the beginning of your main training script (e.g., run.sh), include the following commands to load platform-provided utility functions:
Purpose: The util.sh script provides standard platform functions such as parameter retrieval, path resolution, and logging. Refer to the provided examples to ensure your script uses the built-in parameters and control flow correctly.
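A minimal sketch of the first lines of run.sh is shown below. The relative path to util.sh is an assumption; adjust it to wherever the script sits in your template directory.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Sketch of the top of run.sh. The location of util.sh is an
# assumption — match your template's actual layout.
UTIL_SH="./util.sh"
if [ -f "$UTIL_SH" ]; then
  # Platform helpers: parameter retrieval, path resolution, logging.
  . "$UTIL_SH"
else
  echo "warning: $UTIL_SH not found (running outside the platform?)" >&2
fi
```

Sourcing (rather than executing) util.sh is what makes its functions and variables available in your script's own shell.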
2. Model Output Path Notification
Before the training function exits, you must execute the following command to pass the output path of the fine-tuned model to subsequent tasks (such as model upload):
Purpose: This mechanism allows the platform to identify and collect the final training outputs. Ensure the path is constructed correctly (base model path + relative output directory).
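The path construction could look like the sketch below. The variable names (BASE_MODEL_PATH, OUTPUT_DIR) and the helper notify_model_output are hypothetical stand-ins; use the variables and notification function actually provided by util.sh.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Hypothetical values — in a real run these come from the platform.
BASE_MODEL_PATH="/models/yolov5s"     # base model path (assumed variable name)
OUTPUT_DIR="runs/train/exp/weights"   # relative output directory (assumed)

# Construct the final output path: base model path + relative output dir.
MODEL_OUTPUT_PATH="${BASE_MODEL_PATH}/${OUTPUT_DIR}"

# Stand-in for the platform's notification call; util.sh supplies the
# real helper. Call it before the training function exits.
notify_model_output() { echo "model output path: $1"; }
notify_model_output "$MODEL_OUTPUT_PATH"
```

Whatever the real helper is named, the key point is that it must run before the training function returns, or the upload sub-task will not find the model.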
3. Script Execution Permissions
Before uploading your fine-tuning template to the GitLab model repository, make sure all Bash script files (especially run.sh and any dependent .sh files) have executable permissions.
Action: Set the permissions by running chmod +x *.sh or by specifying individual files.
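A quick way to verify the step above (demonstrated here in a scratch directory so it is safe to run anywhere):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Demo in a temporary directory; in practice, run chmod from the
# template's root before committing.
cd "$(mktemp -d)"
touch run.sh util.sh

# Grant executable permission; `chmod +x *.sh` covers all scripts at once.
chmod +x run.sh util.sh

# Confirm the bit is set.
[ -x run.sh ] && echo "run.sh is executable"
```

Git tracks the executable bit, so commit after running chmod; if the bit was already committed without it, `git update-index --chmod=+x run.sh` marks it executable in the index.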
Key Parameter Reference Table
When implementing your fine-tuning template, review the table below to understand the core parameters in the template directory and scripts, along with their meanings. These parameters define how the base model, dataset, and platform environment are connected.
Recommendation: Before writing your own template, study the official sample templates to understand how parameters are used in real training workflows.