added log_example_interval #1522

Open

agolajko wants to merge 4 commits into NovaSky-AI:main from agolajko:logging
Conversation

Contributor

@agolajko agolajko commented Apr 15, 2026

Re: Configuration flag(s) to disable log_example in training/eval loops #1497

A new `log_example_interval` config field controls how frequently training logs the prompt, response, and reward.

Train:

  • controls both on/off and logging frequency

Evaluate:

  • controls only on/off, since eval_interval already controls how often evaluation runs
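The two behaviours above can be sketched as follows (function and variable names are hypothetical illustrations, not the PR's actual code; the `> 0` guard mirrors the "-1 to disable" semantics in the docstring quoted below):

```python
def should_log_train_example(global_step: int, log_example_interval: int) -> bool:
    # Training: the interval controls both on/off and frequency.
    return log_example_interval > 0 and global_step % log_example_interval == 0


def should_log_eval_example(log_example_interval: int) -> bool:
    # Evaluation: only on/off, since eval_interval already decides
    # how often evaluation itself runs.
    return log_example_interval > 0
```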


gemini-code-assist[bot]

This comment was marked as resolved.


@devin-ai-integration devin-ai-integration bot left a comment


✅ Devin Review: No Issues Found

Devin Review analyzed this PR and found no potential bugs to report.

View in Devin Review to see 2 additional findings.


devin-ai-integration[bot]

This comment was marked as resolved.


@jamesbraza jamesbraza left a comment


LGTM, a few comments. Note I am unassociated with the SkyRL team.

rope_scaling: Optional[Dict[str, Any]] = None
rope_theta: Optional[float] = None
log_example_interval: int = 1
"""Log an example prompt every N training steps, ``-1`` to disable"""


Should we document the implication of setting it to 0? It seems 0 also silently disables it right now.
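One way to see the reviewer's point (a hypothetical guard consistent with the thread, not the repository's code): any non-positive interval fails a positivity check, so 0 behaves exactly like the documented -1.

```python
def logging_enabled(log_example_interval: int) -> bool:
    # A guard like this treats every non-positive value as "off",
    # so 0 disables example logging just like the documented -1.
    return log_example_interval > 0


print(logging_enabled(-1))  # False (documented)
print(logging_enabled(0))   # False (undocumented, per the review comment)
print(logging_enabled(1))   # True
```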

dump_eval_results: bool = True
rope_scaling: Optional[Dict[str, Any]] = None
rope_theta: Optional[float] = None
log_example_interval: int = 1


Hugging Face has a logging_first_step: bool that lets you control whether logs are emitted when the global step is 0. What do you think of adding that one too?

Contributor Author


Logs here are already emitted when global_step is 0, since we check global_step % log_interval == 0, so you see the first logs in every scenario unless logging is turned off. I'd probably not add that new config field.
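The reply above can be checked directly from the modulo condition it quotes: step 0 passes `global_step % interval == 0` for every positive interval, so the first step always logs when logging is enabled (a quick sanity check, not the project's code):

```python
# global_step starts at 0, and 0 % N == 0 for every positive N,
# so the very first training step always satisfies the logging check.
for interval in (1, 5, 100):
    assert 0 % interval == 0

print("step 0 logs for every positive interval")
```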

