Adding a test use case for creating #33860
base: main
Conversation
Great effort. It looks like we're recreating packages that contain the same logic. Ideally, we'd want to use a range with opt-in logic when the version is 2.13. It also appears that this breaks the package for OpenJDK 8.
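Roughly what that could look like in the melange config, as a sketch only (the range name and the Scala-to-JDK mapping here are placeholders):

```yaml
# Hypothetical sketch: one data range keyed by Scala version, with the
# JDK each line builds against as the value. Subpackages can then opt
# into 2.13-specific logic by checking ${{range.key}}.
data:
  - name: scala-versions
    items:
      "2.12": openjdk-8
      "2.13": openjdk-17
```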
SPARK_USER: tester
HADOOP_USER_NAME: tester
SPARK_HOME: /usr/lib/spark
PATH: $PATH:${SPARK_HOME}/bin
We can't extend the environment here; this is a no-op. Move it to an export.
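A minimal sketch of the export approach in the test pipeline (the spark-submit call is only a placeholder for whatever the test actually runs):

```yaml
test:
  pipeline:
    - runs: |
        # Shell expansion only happens inside the pipeline step,
        # so extend PATH here rather than via environment:
        export SPARK_HOME=/usr/lib/spark
        export PATH="$PATH:$SPARK_HOME/bin"
        spark-submit --version
```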
- bash
- wolfi-base
- libstdc++-11
- libavcodec61
environment:
  LANG: en_US.UTF-8
  JAVA_HOME: /usr/lib/jvm/${{range.value}}
This is also a no-op; it has to be done in the pipeline.
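Same idea for JAVA_HOME, sketched assuming this step runs inside the ranged subpackage above:

```yaml
pipeline:
  - runs: |
      # Set JAVA_HOME in the step itself so the ${{range.value}}
      # substitution and the shell expansion both take effect.
      export JAVA_HOME=/usr/lib/jvm/${{range.value}}
      export PATH="$JAVA_HOME/bin:$PATH"
```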
- bash
- wolfi-base
- libstdc++-11
- libavcodec61
Why do we need these now?
@@ -21,7 +21,7 @@ environment:
- maven
- openjdk-11
- openjdk-17
# Only 8 is used during the build process
- openjdk-17-default-jvm
Since we explicitly set JAVA_HOME, we don't need this. It's also error-prone, as we build with two different Java versions and don't want to use the wrong one.
@@ -120,7 +131,7 @@ subpackages:
val sum = rdd.reduce(_ + _)
assert(sum == 15)
EOF
cat SimpleJob.scala | /usr/lib/spark/bin/spark-shell
cat SimpleJob.scala | /usr/lib/spark/bin/spark-shell --conf spark.jars.ivy=/tmp/.ivy
I'm not sure I understand why we need to pass --conf here.
dependencies:
  runtime:
    - openjdk-17-default-jvm
We build with 8 and 17. We should drop this
mkdir -p ${{targets.contextdir}}/usr/lib/spark/work-dir
mv bin/ ${{targets.contextdir}}/usr/lib/spark
mv sbin/ ${{targets.contextdir}}/usr/lib/spark
mv target ${{targets.contextdir}}/usr/lib/spark
mv assembly ${{targets.contextdir}}/usr/lib/spark
Why does this deviate from the build process for 2.12 above?
mv assembly ${{targets.contextdir}}/usr/lib/spark
cp resource-managers/kubernetes/docker/src/main/dockerfiles/spark/entrypoint.sh ${{targets.contextdir}}/usr/lib/spark/
- name: ${{package.name}}-scala-2.13-compat
Why do we need another compat package?
@EyeCantCU The entrypoint.sh is sourced from a different location for 2.12, which I suspect is a result of the build process.
If we loop over a range of supported configurations (Scala 2.12/2.13, JDK 8/17), we can accommodate that in the same subpackage definition without creating a new one with much of the same logic.
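A rough sketch of a single ranged subpackage covering both configurations; the name pattern and build commands are illustrative only, not the exact invocation this package needs:

```yaml
subpackages:
  - range: scala-versions
    name: ${{package.name}}-scala-${{range.key}}
    pipeline:
      - runs: |
          # Pick the JDK for this entry and switch the tree to the
          # matching Scala version before building.
          export JAVA_HOME=/usr/lib/jvm/${{range.value}}
          export PATH="$JAVA_HOME/bin:$PATH"
          ./dev/change-scala-version.sh ${{range.key}}
          ./build/mvn -DskipTests -Pscala-${{range.key}} clean package
```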
SPARK_LOCAL_IP: 127.0.0.1
SPARK_LOCAL_HOSTNAME: localhost
SPARK_HOME: /usr/lib/spark
PATH: $PATH:${SPARK_HOME}/bin
This doesn't work either; use an export.
Fixes:
Related:
Pre-review Checklist

For new package PRs only
- Upstream release/end-of-life status checked (endoflife.date)

For new version streams
- A compat package is generated if needed (e.g. name: ${{package.name}}-compat)
- provides: includes logical unversioned forms of the package (e.g. nodejs, nodejs-lts)

For package updates (renames) in the base images
When updating packages part of base images (i.e. cgr.dev/chainguard/wolfi-base or ghcr.io/wolfi-dev/sdk):
- apk upgrade --latest successfully upgrades packages or performs no actions

For security-related PRs

For version bump PRs
- The epoch field is reset to 0

For PRs that add patches