
Adding a test use case for creating #33860

Draft
wants to merge 8 commits into main
Conversation

@utieyin (Contributor) commented Nov 11, 2024

Fixes:

Related:

Pre-review Checklist

For new package PRs only

  • This PR is marked as fixing a pre-existing package request bug
    • Alternatively, the PR is marked as related to a pre-existing package request bug, such as a dependency
  • REQUIRED - The package is available under an OSI-approved or FSF-approved license
  • REQUIRED - The version of the package is still receiving security updates
  • This PR links to the upstream project's support policy (e.g. endoflife.date)

For new version streams

  • The upstream project actually supports multiple concurrent versions.
  • Any subpackages include the version string in their package name (e.g. name: ${{package.name}}-compat)
  • The package (and subpackages) provides: logical unversioned forms of the package (e.g. nodejs, nodejs-lts)
  • If non-streamed package names are no longer built, open a PR to withdraw them (see WITHDRAWING PACKAGES)

For package updates (renames) in the base images

When updating packages part of base images (i.e. cgr.dev/chainguard/wolfi-base or ghcr.io/wolfi-dev/sdk)

  • REQUIRED cgr.dev/chainguard/wolfi-base and ghcr.io/wolfi-dev/sdk images successfully build
  • REQUIRED cgr.dev/chainguard/wolfi-base and ghcr.io/wolfi-dev/sdk contain no obsolete (no longer built) packages
  • Upon launch, apk upgrade --latest successfully upgrades packages or performs no action

For security-related PRs

  • The security fix is recorded in the advisories repo

For version bump PRs

  • The epoch field is reset to 0

For PRs that add patches

  • Patch source is documented

test-spark.yaml: earlier review threads resolved (outdated)
@EyeCantCU (Member) left a comment

Great effort. It looks like we're recreating packages that contain the same logic. Ideally, we'd want to use a range with opt-in logic when the version is 2.13. It also appears that this breaks the package for OpenJDK 8.
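One way to express that in melange is to declare the Scala versions as data and loop a single subpackage over them. A rough sketch only, assuming a scala-versions data block and Spark's change-scala-version.sh / -Pscala-2.13 switches, none of which are taken from this PR:

data:
  - name: scala-versions
    items:
      2.12: ''              # default build, no extra Maven profile
      2.13: '-Pscala-2.13'  # opt-in 2.13 profile (placeholder value)

subpackages:
  - range: scala-versions
    name: ${{package.name}}-scala-${{range.key}}
    pipeline:
      - runs: |
          # One definition covers both Scala versions; 2.13-specific logic
          # hangs off the range key/value instead of a duplicated subpackage.
          ./dev/change-scala-version.sh ${{range.key}}
          ./build/mvn -DskipTests ${{range.value}} package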

SPARK_USER: tester
HADOOP_USER_NAME: tester
SPARK_HOME: /usr/lib/spark
PATH: $PATH:${SPARK_HOME}/bin
Member

We can't extend the environment here; this is a no-op. Move it to an export.
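For instance, a minimal sketch of the suggested export inside the test pipeline (the spark-shell --version smoke check is only a placeholder):

test:
  pipeline:
    - runs: |
        # $PATH is not expanded when set in the environment block above,
        # so extend it inside the test script itself.
        export SPARK_HOME=/usr/lib/spark
        export PATH="$PATH:$SPARK_HOME/bin"
        spark-shell --version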

- bash
- wolfi-base
- libstdc++-11
- libavcodec61
environment:
LANG: en_US.UTF-8
JAVA_HOME: /usr/lib/jvm/${{range.value}}
Member

This is also a no-op; it has to be done in the pipeline.
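A hedged sketch of moving it into the pipeline step, assuming ${{range.value}} still names the intended JDK directory under /usr/lib/jvm (the java -version check is a placeholder):

pipeline:
  - runs: |
      # Export JAVA_HOME from the step itself, per the review suggestion,
      # rather than relying on the environment block.
      export JAVA_HOME=/usr/lib/jvm/${{range.value}}
      export PATH="$JAVA_HOME/bin:$PATH"
      java -version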

Comment on lines +99 to +102
- bash
- wolfi-base
- libstdc++-11
- libavcodec61
Member

Why do we need these now?

@@ -21,7 +21,7 @@ environment:
- maven
- openjdk-11
- openjdk-17
# Only 8 is used during the build process
- openjdk-17-default-jvm
Member

Since we explicitly set JAVA_HOME, we don't need this. It's also error prone, as we build with two different Java versions and don't want to use the wrong one.

@@ -120,7 +131,7 @@ subpackages:
val sum = rdd.reduce(_ + _)
assert(sum == 15)
EOF
cat SimpleJob.scala | /usr/lib/spark/bin/spark-shell
cat SimpleJob.scala | /usr/lib/spark/bin/spark-shell --conf spark.jars.ivy=/tmp/.ivy
Member

I'm not sure I understand why we need to pass --conf here.

Comment on lines +249 to +251
dependencies:
runtime:
- openjdk-17-default-jvm
Member

We build with 8 and 17, so we should drop this.

Comment on lines +263 to +267
mkdir -p ${{targets.contextdir}}/usr/lib/spark/work-dir
mv bin/ ${{targets.contextdir}}/usr/lib/spark
mv sbin/ ${{targets.contextdir}}/usr/lib/spark
mv target ${{targets.contextdir}}/usr/lib/spark
mv assembly ${{targets.contextdir}}/usr/lib/spark
Member

Why does this deviate from the build process for 2.12 above?

mv assembly ${{targets.contextdir}}/usr/lib/spark
cp resource-managers/kubernetes/docker/src/main/dockerfiles/spark/entrypoint.sh ${{targets.contextdir}}/usr/lib/spark/
- name: ${{package.name}}-scala-2.13-compat
Member

Why do we need another compat package?

@utieyin (Contributor, PR author)

@EyeCantCU The entrypoint.sh is sourced from a different location for 2.12, which I suspect is a result of the build process.

Member

If we loop over a range of supported configurations (Scala 2.12-2.13, JDK 8-17), we can accommodate that in the same subpackage definition without creating a new one with much of the same logic.
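Sketched against the scala-versions range idea from the earlier comment; the non-2.13 entrypoint path is a placeholder, since this thread only says the 2.12 location differs:

subpackages:
  - range: scala-versions
    name: ${{package.name}}-scala-${{range.key}}
    pipeline:
      - runs: |
          mkdir -p ${{targets.contextdir}}/usr/lib/spark/work-dir
          mv bin sbin ${{targets.contextdir}}/usr/lib/spark
          # entrypoint.sh is sourced from a different place per Scala build,
          # so branch on the range key instead of duplicating the subpackage.
          case "${{range.key}}" in
            2.13) src=resource-managers/kubernetes/docker/src/main/dockerfiles/spark/entrypoint.sh ;;
            *)    src=path/to/2.12/entrypoint.sh ;;  # placeholder path
          esac
          cp "$src" ${{targets.contextdir}}/usr/lib/spark/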

SPARK_LOCAL_IP: 127.0.0.1
SPARK_LOCAL_HOSTNAME: localhost
SPARK_HOME: /usr/lib/spark
PATH: $PATH:${SPARK_HOME}/bin
Member

This doesn't work; use an export.
