
feat(batch-jobs): Add step to pass API server env vars to spark jobs #400

Conversation

deadlycoconuts
Contributor

Context

Similar to caraml-dev/merlin#624, this PR adds a small feature that allows the API server to pass certain pre-configured environment variable values from its own environment to the Spark drivers and executors (for batch prediction) that it spins up.

Modifications

  • api/turing/cluster/spark.go - Addition of a step that copies pre-configured API server environment variables into the Spark driver/executor manifest (see the sketch below)
  • api/turing/config/config.go - Addition of a new config field to specify the environment variables mentioned above
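
For illustration, below is a minimal sketch of the idea, assuming the new config field is a list of environment variable names and the driver/executor env lists use `corev1.EnvVar`. The identifiers `copyAPIServerEnvVars` and `APIServerEnvVars` are hypothetical and may not match the actual names used in the PR.

```go
package cluster

import (
	"os"

	corev1 "k8s.io/api/core/v1"
)

// copyAPIServerEnvVars (hypothetical name) appends the values of the listed
// environment variables, as seen by the API server process, to the env var
// slice that will be set on both the Spark driver and executor specs.
func copyAPIServerEnvVars(envVarNames []string, envVars []corev1.EnvVar) []corev1.EnvVar {
	for _, name := range envVarNames {
		// Only propagate variables that are actually set in the API server's environment.
		if value, ok := os.LookupEnv(name); ok {
			envVars = append(envVars, corev1.EnvVar{Name: name, Value: value})
		}
	}
	return envVars
}
```

Under this assumption, the new config field would simply be a `[]string` of variable names (e.g. `APIServerEnvVars`) that operators set when deploying the API server.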

@deadlycoconuts deadlycoconuts added the enhancement New feature or request label Jan 6, 2025
@deadlycoconuts deadlycoconuts self-assigned this Jan 6, 2025
@deadlycoconuts deadlycoconuts marked this pull request as ready for review January 6, 2025 06:36
Contributor

@vinoth-gojek vinoth-gojek left a comment

@deadlycoconuts
Contributor Author

Thanks a lot for the quick review! Merging this now! :D

@deadlycoconuts deadlycoconuts merged commit 26ea22a into caraml-dev:main Jan 7, 2025
12 checks passed