# RHOAIENG-28583: Create Runtime Images for Python 3.12 #1333
> Caution: Review failed. Failed to post review comments.

## Walkthrough
This change introduces full support for Python 3.12 across multiple Jupyter runtime images, including minimal, datascience, pytorch (CUDA and ROCm), and tensorflow (CUDA and ROCm) variants. It adds new Dockerfiles, Pipfiles, Kubernetes manifests, utility scripts, and bootstrappers for these environments, and updates the Makefile to enable building and including these new images as build targets.
## Changes
| Files/Directories | Change Summary |
|-----------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------|
| Makefile | Enabled build targets for Python 3.12 runtime images (minimal, datascience, pytorch, tensorflow, rocm-pytorch, rocm-tensorflow) by uncommenting relevant lines and adding `jupyter/trustyai`. Removed `rocm-jupyter-pytorch` target from 3.12 all-images list. |
| runtimes/minimal/ubi9-python-3.12/ | Added Dockerfile.cpu, Pipfile, requirements.txt, kustomize configs (base/kustomization.yaml, base/pod.yaml), pip.conf, requirements-elyra.txt, and bootstrapper.py utility. |
| runtimes/datascience/ubi9-python-3.12/ | Added Dockerfile.cpu, Pipfile, kustomize configs (base/kustomization.yaml, base/pod.yaml), pip.conf, requirements-elyra.txt, utils/bootstrapper.py, and utility scripts including bootstrapper.py. |
| runtimes/pytorch/ubi9-python-3.12/ | Added Dockerfile.cuda, Pipfile, kustomize configs (base/kustomization.yaml, base/pod.yaml, components/accelerator/*, overlays/accelerator/cuda/*), pip.conf, requirements-elyra.txt, and bootstrapper.py utility. |
| runtimes/rocm-pytorch/ubi9-python-3.12/ | Added Dockerfile.rocm, Pipfile, de-vendor-torch.sh script, kustomize configs (base/kustomization.yaml, base/pod.yaml), pip.conf, requirements-elyra.txt, and bootstrapper.py utility. |
| runtimes/rocm-tensorflow/ubi9-python-3.12/ | Added Dockerfile.rocm, Pipfile, kustomize configs (base/kustomization.yaml, base/pod.yaml), pip.conf, requirements-elyra.txt, and bootstrapper.py utility. |
| runtimes/tensorflow/ubi9-python-3.12/ | Added Dockerfile.cuda, Pipfile, kustomize configs (base/kustomization.yaml, base/pod.yaml), pip.conf, requirements-elyra.txt, and bootstrapper.py utility. |
| manifests/base/ | Updated ImageStream YAML files for datascience, minimal, pytorch, rocm-pytorch, rocm-tensorflow, tensorflow to reflect Python version bump from 3.11 to 3.12 in metadata and image references. |
## Possibly related PRs
- [opendatahub-io/notebooks#1249](https://github.com/opendatahub-io/notebooks/pull/1249): Introduces a new ROCm PyTorch Python 3.12 Jupyter image with Dockerfile, Pipfile, Kubernetes manifests, and Makefile updates to build and push this image, closely related to enabling Python 3.12 ROCm PyTorch runtime images here.
- [opendatahub-io/notebooks#1247](https://github.com/opendatahub-io/notebooks/pull/1247): Modifies the Makefile to selectively include or exclude certain Python 3.12 images and adds related CI workflow and test adjustments, directly related to the Makefile changes enabling Python 3.12 runtime images.
- [opendatahub-io/notebooks#1306](https://github.com/opendatahub-io/notebooks/pull/1306): Adds the `jupyter/trustyai` Python 3.12 image Dockerfile and related resources, complementing this PR’s Makefile inclusion of the `jupyter/trustyai` image.
## Possibly related issues
- opendatahub-io/notebooks#1348: Addresses missing Python 3.12 version check in bootstrapper’s `determine_elyra_requirements` method across Python 3.12 runtime bootstrappers; this PR adds new bootstrappers for Python 3.12 runtimes and likely resolves this issue.
## Suggested reviewers
- dibryant
[APPROVALNOTIFIER] This PR is NOT APPROVED. This pull-request has not yet been approved by anyone; the full list of commands accepted by this bot can be found here.

Needs approval from an approver in each of these files. Approvers can indicate their approval by writing `/approve` in a comment.
Actionable comments posted: 23
🔭 Outside diff range comments (2)
runtimes/pytorch/ubi9-python-3.12/kustomize/overlays/accelerator/cuda/kustomization.yaml (1)

`1-10`: **Add missing apiVersion and kind to Kustomization overlay**

The `kustomization.yaml` in this overlay is missing the required headers, which newer Kustomize versions (v5+) enforce. Other overlays include these fields, so let's maintain consistency and prevent future parsing failures.

• File needing fix: `runtimes/pytorch/ubi9-python-3.12/kustomize/overlays/accelerator/cuda/kustomization.yaml`

Suggested patch:

```diff
+apiVersion: kustomize.config.k8s.io/v1beta1
+kind: Kustomization
 resources:
 - ../../../base
 components:
 - ../../../components/accelerator
 patches:
 - path: pod-patch.yaml
```

runtimes/pytorch/ubi9-python-3.12/kustomize/base/pod.yaml (1)

`8-23`: **Harden container securityContext**

Running without an explicit securityContext makes the pod fail OpenShift Pod Security admission on clusters with restricted policies (see issue #1212). Add basic non-root settings & disable privilege escalation.

```diff
 spec:
   containers:
     - name: runtime
       image: runtime-workbench
+      securityContext:
+        runAsNonRoot: true
+        allowPrivilegeEscalation: false
+        seccompProfile:
+          type: RuntimeDefault
       command: ["/bin/sh", "-c", "while true ; do date; sleep 1; done;"]
```
♻️ Duplicate comments (20)
runtimes/rocm-pytorch/ubi9-python-3.12/utils/pip.conf (1)
`1-4`: **Same hard-coded target directory observation as in pytorch image**

See previous comment; the advice applies here unchanged.

runtimes/rocm-tensorflow/ubi9-python-3.12/utils/requirements-elyra.txt (1)

`6-6`: **Same grammar/capitalisation issue as above**

Apply the identical diff here.

runtimes/minimal/ubi9-python-3.12/utils/requirements-elyra.txt (1)

`6-6`: **Same grammar/capitalisation issue as above**

Apply the identical diff here.
runtimes/tensorflow/ubi9-python-3.12/utils/pip.conf (1)
`1-4`: **Same PYTHONPATH-presence concern as raised in rocm-tensorflow pip.conf**

runtimes/minimal/ubi9-python-3.12/utils/pip.conf (1)

`1-4`: **Same PYTHONPATH-presence concern as raised in rocm-tensorflow pip.conf**

runtimes/datascience/ubi9-python-3.12/utils/pip.conf (1)

`1-4`: **Same PYTHONPATH-presence concern as raised in rocm-tensorflow pip.conf**

runtimes/pytorch/ubi9-python-3.12/utils/requirements-elyra.txt (1)

`4-7`: **Replicate the grammar fix here for consistency**

The same comment string appears; apply the "If details are needed …" wording to keep the placeholder files identical across runtimes.
runtimes/rocm-tensorflow/ubi9-python-3.12/kustomize/base/pod.yaml (1)
`8-22`: **Same security hardening as datascience pod is required**

The ROCm-TensorFlow pod has the identical root-container issue; apply the `securityContext` block suggested in the datascience review.

runtimes/minimal/ubi9-python-3.12/kustomize/base/pod.yaml (1)

`8-22`: **Same security hardening as datascience pod is required**

The minimal runtime pod repeats the same pattern; please add the `securityContext` block.

runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu (1)

`17-24`: **Same non-root `oc` installation issue as CUDA image**

The `tar` extraction happens while `USER 1001` and will likely fail. Apply the root-switch fix as shown in the PyTorch CUDA Dockerfile comment.

runtimes/minimal/ubi9-python-3.12/Dockerfile.cpu (3)
`4-4`: **Base image version pinning is tracked in issue #1242**

The use of the `:latest` tag is being addressed separately. Reference: issue #1242 "Improve Docker FROM image versioning by avoiding :latest tags".

`12-12`: **Architecture detection improvement is tracked in issue #1332**

The use of `$(uname -m)` for multi-architecture builds is being addressed separately. Reference: issue #1332 for replacing `$(uname -m)` with `${TARGETARCH}`.

`28-31`: **Binary download security is tracked in issue #1241**

The lack of checksum verification for the oc client download is being addressed separately. Reference: issue #1241 "Security: Add checksum verification for downloaded binaries in Python 3.12 images".
runtimes/rocm-pytorch/ubi9-python-3.12/Dockerfile.rocm (3)
`50-50`: **Hardcoded x86_64 architecture breaks multi-arch support**

Same issue as in the ROCm TensorFlow Dockerfile.

`47-47`: **Security: GPG verification is disabled for repositories**

Same security concern as in the ROCm TensorFlow Dockerfile.

Also applies to: 52-52

`82-82`: **Fix typo in comment**

Same typo as in previous Dockerfiles.

runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda (1)

`168-168`: **Fix typo in comment**

Same typo as in previous Dockerfiles.
runtimes/minimal/ubi9-python-3.12/utils/bootstrapper.py (1)
`1-769`: **Same issues apply to this identical bootstrapper file.**

This file is identical to `runtimes/tensorflow/ubi9-python-3.12/utils/bootstrapper.py` and has the same issues:

- Line 170: Typo "bootsrapper" → "bootstrapper"
- Lines 619-621: Missing Python 3.12 in version check
- Type hints should be modernized for Python 3.12
- Missing newline at end of file

runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py (1)

`1-769`: **Same issues apply to this identical bootstrapper file.**

This file is identical to `runtimes/tensorflow/ubi9-python-3.12/utils/bootstrapper.py` and has the same issues:

- Line 170: Typo "bootsrapper" → "bootstrapper"
- Lines 619-621: Missing Python 3.12 in version check
- Type hints should be modernized for Python 3.12
- Missing newline at end of file

runtimes/datascience/ubi9-python-3.12/utils/bootstrapper.py (1)

`1-769`: **Identical file with same issues**

This file is identical to `runtimes/rocm-tensorflow/ubi9-python-3.12/utils/bootstrapper.py`. All the review comments from that file apply here as well, particularly:

- Critical: Add Python 3.12 support in the `determine_elyra_requirements()` method (line 620); a sketch of this check follows after this comment block
- Modernize type annotations for Python 3.12
- Security improvements for subprocess usage
- Code quality improvements

Consider extracting this bootstrapper to a shared location to avoid duplication across runtime environments.
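To make the "missing Python 3.12" item concrete, here is a minimal, hypothetical sketch of the kind of guard the bootstrappers need. The class name `OpUtilSketch`, the exact set of supported minor versions, and the returned file name are illustrative assumptions for this example, not the actual Elyra implementation found in each runtime's `utils/bootstrapper.py`.

```python
import logging
import sys

logger = logging.getLogger("bootstrapper-sketch")


class OpUtilSketch:
    """Illustrative stand-in for the bootstrapper's utility class."""

    @classmethod
    def determine_elyra_requirements(cls) -> str | None:
        # Listing 3.12 here is the essence of the fix: without it the new
        # runtime images fall through to the unsupported-version branch below.
        if sys.version_info.major == 3 and sys.version_info.minor in (9, 10, 11, 12):
            return "requirements-elyra.txt"
        logger.error(
            "Python %d.%d is not supported by this bootstrapper",
            sys.version_info.major,
            sys.version_info.minor,
        )
        return None


if __name__ == "__main__":
    print(OpUtilSketch.determine_elyra_requirements())
```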
🧹 Nitpick comments (31)
runtimes/pytorch/ubi9-python-3.12/utils/pip.conf (1)
`1-4`: **Consider parameterising the pip target directory**

Hard-coding `/opt/app-root/src/jupyter-work-dir/python3/` in every image means any future path change requires touching many files. One option is to keep a single `PIP_TARGET` ENV in the Dockerfile and reference it here (pip supports env-substitution via `${PIP_TARGET}`), or drop the `pip.conf` entirely and rely on the ENV alone.

Benefit: single source of truth, easier maintenance, fewer image rebuilds.
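A minimal sketch of the "rely on the ENV alone" option: pip reads any `PIP_<OPTION>` environment variable as the default for the matching command-line flag, so `PIP_TARGET` stands in for `--target` without any `pip.conf`. The path is the one currently hard-coded in the files; the `requests` install is only a placeholder to show the effect.

```dockerfile
# Sketch: single source of truth for the pip install target.
# pip maps PIP_TARGET to the --target option, so no pip.conf is needed.
ENV PIP_TARGET=/opt/app-root/src/jupyter-work-dir/python3

# Any later pip install (at build time or at runtime) now lands in $PIP_TARGET.
RUN pip install --no-cache-dir requests
```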
runtimes/tensorflow/ubi9-python-3.12/utils/requirements-elyra.txt (1)
`6-6`: **Fix minor grammar & capitalisation**

```diff
-# in case the details are need please follow the comprehensive list of python dependencies present here:
+# in case the details are needed, please follow the comprehensive list of Python dependencies present here:
```

runtimes/rocm-tensorflow/ubi9-python-3.12/utils/pip.conf (1)

`1-1`: **Add SPDX-style license header instead of a "Copied from:" comment**

A single-line SPDX identifier (e.g. `# SPDX-License-Identifier: Apache-2.0`) is clearer for attribution/compliance tooling than a prose "Copied from" note and avoids stale URLs.

runtimes/pytorch/ubi9-python-3.12/kustomize/components/accelerator/kustomization.yaml (1)
`5-6`: **Fail fast if referenced patch file is missing**

Consider adding `missingFilePathPolicy: Error` to avoid a silent no-op when `pod-patch.yaml` is renamed or deleted.

runtimes/rocm-pytorch/ubi9-python-3.12/utils/requirements-elyra.txt (1)

`4-7`: **Minor grammar tweak in the explanatory comment**

"in case the details are need"

```diff
-# in case the details are need please follow the comprehensive list of python dependencies present here:
+# If details are needed, please refer to the comprehensive dependency list here:
```

runtimes/rocm-pytorch/ubi9-python-3.12/kustomize/base/kustomization.yaml (1)
`1-10`: **Consider pinning image by digest for reproducible deployments**

Tag-only references (`runtime-rocm-pytorch-ubi9-python-3.12`) are mutable. Re-pushes will roll out unintended code to existing clusters. Pinning by digest (or using Kustomize's `images:` with `newTag` + `digest:`) locks deployments while still letting CI update tags intentionally.
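For illustration, a hedged sketch of what a digest-pinned `images:` transformer could look like in this base kustomization; the registry path and digest value are placeholders, not the project's real coordinates.

```yaml
# kustomization.yaml (sketch): pin the runtime image by digest instead of a mutable tag
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
resources:
  - pod.yaml
images:
  - name: runtime-rocm-pytorch-ubi9-python-3.12
    newName: quay.io/example/runtime-rocm-pytorch-ubi9-python-3.12  # placeholder registry path
    digest: sha256:0000000000000000000000000000000000000000000000000000000000000000  # placeholder digest
```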
runtimes/rocm-pytorch/ubi9-python-3.12/kustomize/base/pod.yaml (1)

`12-13`: **Placeholder command hides real readiness behaviour**

`while true ; do date …` is useful for smoke tests but gives no indication that the actual runtime (Jupyter/Lab) is healthy. Add TODO or readiness probe before promoting this manifest beyond unit tests.
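As a sketch only: a readiness probe would look roughly like the fragment below, assuming the container eventually runs the real Jupyter runtime server on port 8888 rather than the placeholder date loop; the endpoint and timings are illustrative assumptions.

```yaml
# pod.yaml (sketch, fragment): readiness probe for the runtime container
spec:
  containers:
    - name: runtime
      image: runtime-rocm-pytorch-ubi9-python-3.12
      readinessProbe:
        httpGet:
          path: /api        # assumed health endpoint of the real runtime server
          port: 8888        # assumed Jupyter port; the current pod only runs a date loop
        initialDelaySeconds: 10
        periodSeconds: 15
        failureThreshold: 3
```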
runtimes/datascience/ubi9-python-3.12/utils/requirements-elyra.txt (1)

`4-7`: **Grammar tweak for clarity**

"in case the details are need" → "in case the details are needed".

```diff
-# in case the details are need please follow the comprehensive list of python dependencies present here:
+# in case the details are needed please follow the comprehensive list of python dependencies present here:
```

runtimes/minimal/ubi9-python-3.12/Pipfile (1)
`11-13`: **`ipython-genutils` is EOL – consider dropping the dependency**

The library has been archived since 2020 and none of the listed deps require it on modern Python. If nothing in the container imports it, remove the pin to shrink the image and cut CVE surface.
runtimes/tensorflow/ubi9-python-3.12/kustomize/base/pod.yaml (1)
`10-22`: **Infinite `date` loop spams logs & lacks SecurityContext**

The busy-loop prints one line per second forever, flooding cluster logging back-ends. If this pod is only for CI smoke-tests, consider a finite sleep or `exit 0`.

Add a `securityContext` similar to the PyTorch comment above to satisfy PodSecurity standards.

runtimes/rocm-tensorflow/ubi9-python-3.12/kustomize/base/kustomization.yaml (1)
`8-10`: **Consider pinning image by digest, not mutable tag**

Using the mutable tag `runtime-rocm-tensorflow-ubi9-python-3.12` can break reproducibility when the image is rebuilt. Kustomize supports specifying `digest:` alongside `newName`.

runtimes/pytorch/ubi9-python-3.12/Pipfile (1)
`38-56`: **Stale/unused dependencies inflate image size**

`ipython-genutils` is abandoned (EOL 2018) and not required by IPython 9+. The same goes for the `MarkupSafe` pinning – Jinja2 vendors an upper bound already. Dropping them trims several hundred KB and one CVE-prone package.

If they are truly needed by Elyra, please add a comment; otherwise delete.

runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh (1)
`13-36`: **Missing `set -o pipefail` & existence checks hide linking errors**

`ln -sf` will happily create dangling symlinks if either the ROCm or Torch library is absent, masking packaging problems. Add a quick guard:

```sh
for lib in libamdhip64.so.6 librocblas.so.4; do
  [ -e "${ROCMLIB}/${lib}" ] || { echo "Missing ${lib}" >&2; exit 1; }
done
```

Also enable `pipefail` in the shebang for consistency with `set -e`.
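Putting the two suggestions together, a hedged sketch of how the top of `de-vendor-torch.sh` could look; the `ROCMLIB` path is an assumption about the image layout, the library names come from the guard above, and the rest of the script is elided.

```sh
#!/bin/bash
# Sketch: fail on any error, unset variable, or failure inside a pipeline.
set -euo pipefail

# Assumed location of the system ROCm libraries; adjust to the image layout.
ROCMLIB="/opt/rocm/lib"

# Verify the libraries exist before creating symlinks, so a missing package
# surfaces as a build failure instead of a dangling link.
for lib in libamdhip64.so.6 librocblas.so.4; do
  [ -e "${ROCMLIB}/${lib}" ] || { echo "Missing ${lib}" >&2; exit 1; }
done

# ... remainder of the de-vendoring logic (the ln -sf calls) goes here ...
```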
runtimes/tensorflow/ubi9-python-3.12/Pipfile (1)

`6-10`: **`dev-packages` are not installed in the final image**

`tf2onnx` placed under `[dev-packages]` will be ignored because the Dockerfile runs `pip install --deploy`. If the converter is genuinely required at runtime (e.g., for KFP steps), move it to `[packages]`; otherwise drop it to keep the layer slim.
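A minimal sketch of the Pipfile change if `tf2onnx` really is needed at runtime; the version specifier is a placeholder and should match whatever the lock file actually resolves.

```toml
# Pipfile (sketch): install tf2onnx in the final image by listing it under [packages]
[packages]
tf2onnx = "~=1.16"   # placeholder version specifier

[dev-packages]
# tf2onnx removed from here; dev-packages are not installed by the image build
```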
runtimes/minimal/ubi9-python-3.12/requirements.txt (1)

`47-52`: **Version skew of `requests` across runtimes**

`requests==2.32.4` is pinned here, while the ROCm-PyTorch and ROCm-TF Pipfiles still use `~=2.32.3`. Pinning different patch versions unnecessarily fragments the CVE surface and cache layers. Recommend unifying to the same patch (latest) for all new 3.12 images.

Also applies to: 635-637
runtimes/rocm-pytorch/ubi9-python-3.12/Pipfile (1)
`60-63`: **Exact pinning of `setuptools`/`wheel` freezes security updates**

Locking build back-ends to equality pins usually isn't required inside runtime images (only build images). Consider `~=78.1` to pick up silent CVE back-ports automatically.

runtimes/rocm-tensorflow/ubi9-python-3.12/Pipfile (1)
`24-26`: **`skl2onnx`/`onnxconverter-common` pins still conflict with `protobuf>=5`**

If you drop `tf2onnx` (dev-packages section) the protobuf restriction is gone; you can move to `onnxconverter-common~=1.14`, which supports protobuf 5 and removes the manual pin here.

runtimes/pytorch/ubi9-python-3.12/Dockerfile.cuda (1)
`52-57`: **`yum upgrade -y` bloats image and hurts reproducibility**

Full distro upgrades inside container layers pull ~400 MB of RPMs and couple the image to latest mirror state. Prefer `yum update-minimal --security` or pin to specific package NEVRs to keep images slim and deterministic.
runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu (1)

`50-56`: **OpenShift + pip permissions hack is brittle**

`chmod -R g+w /opt/app-root/lib/...` touches ~15 k files every build (slow). Switch to `pip install --user` or set `PYTHONUSERBASE` to `/opt/app-root`, avoiding the recursive chmod.
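A hedged sketch of the `PYTHONUSERBASE` alternative; the path follows the comment above and the `requests` install is only a placeholder.

```dockerfile
# Sketch: install into a user site under /opt/app-root instead of chmod-ing
# thousands of files in the system site-packages.
ENV PYTHONUSERBASE=/opt/app-root

# --user places packages under $PYTHONUSERBASE/lib/python3.12/site-packages,
# a location the image can pre-create with group-writable permissions for
# OpenShift's arbitrary UIDs.
RUN pip install --no-cache-dir --user requests
```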
runtimes/minimal/ubi9-python-3.12/Dockerfile.cpu (1)

`54-54`: **Fix typo in comment**

```diff
-# Copy Elyra dependencies for air-gapped enviroment
+# Copy Elyra dependencies for air-gapped environment
```

runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm (1)
`82-82`: **Fix typo in comment**

```diff
-# Copy Elyra dependencies for air-gapped enviroment
+# Copy Elyra dependencies for air-gapped environment
```

runtimes/pytorch/ubi9-python-3.12/utils/bootstrapper.py (2)
`170-170`: **Fix typo in comment**

```diff
- # in the environment where the bootsrapper is running.
+ # in the environment where the bootstrapper is running.
```

`769-769`: **Add newline at end of file**

```diff
 if __name__ == "__main__":
-    main()
+    main()
+
```

runtimes/tensorflow/ubi9-python-3.12/utils/bootstrapper.py (3)
`170-170`: **Fix typo in comment.**

```diff
- # in the environment where the bootsrapper is running.
+ # in the environment where the bootstrapper is running.
```

`29-32`: **Modernize type hints for Python 3.12.**

Since this bootstrapper is for Python 3.12, consider using modern type syntax instead of deprecated typing imports.

Update the imports:

```diff
-from typing import Dict
-from typing import Optional
-from typing import Type
```

Then update type annotations throughout the file:

```diff
 # Line 62
-    def get_instance(cls: Type[F], **kwargs: Any) -> F:
+    def get_instance(cls: type[F], **kwargs: Any) -> F:
 # Line 301
-    def put_file_to_object_storage(self, file_to_upload: str, object_name: Optional[str] = None) -> None:
+    def put_file_to_object_storage(self, file_to_upload: str, object_name: str | None = None) -> None:
 # Line 338
-    def convert_param_str_to_dict(self, pipeline_parameters: Optional[str] = None) -> Dict[str, Any]:
+    def convert_param_str_to_dict(self, pipeline_parameters: str | None = None) -> dict[str, Any]:
 # Line 725
-    def log_operation_info(cls, action_clause: str, duration_secs: Optional[float] = None) -> None:
+    def log_operation_info(cls, action_clause: str, duration_secs: float | None = None) -> None:
```

Also applies to: 62-62, 301-301, 338-338, 725-725
`769-769`: **Add newline at end of file.**

Add a newline character after the last line to follow Python conventions.
runtimes/rocm-tensorflow/ubi9-python-3.12/utils/bootstrapper.py (5)
`29-31`: **Modernize type annotations for Python 3.12**

Since this is a new Python 3.12 runtime, use modern type annotations instead of deprecated typing module imports.

Replace deprecated typing module imports with built-in types:

```diff
-from typing import Dict
-from typing import Optional
-from typing import Type
+from typing import Any
+from typing import TypeVar
```

And update type annotations throughout:

```diff
-def get_instance(cls: Type[F], **kwargs: Any) -> F:
+def get_instance(cls: type[F], **kwargs: Any) -> F:
-def put_file_to_object_storage(self, file_to_upload: str, object_name: Optional[str] = None) -> None:
+def put_file_to_object_storage(self, file_to_upload: str, object_name: str | None = None) -> None:
-def convert_param_str_to_dict(self, pipeline_parameters: Optional[str] = None) -> Dict[str, Any]:
+def convert_param_str_to_dict(self, pipeline_parameters: str | None = None) -> dict[str, Any]:
-def log_operation_info(cls, action_clause: str, duration_secs: Optional[float] = None) -> None:
+def log_operation_info(cls, action_clause: str, duration_secs: float | None = None) -> None:
```

Also applies to: 62-62, 301-301, 338-338, 725-725
`76-77`: **Move imports to module level**

Move the minio imports to the top of the file for better code organization and static analysis:

```python
import minio
from minio.credentials import providers
```

`561-561`: **Remove explicit object inheritance**

In Python 3, classes inherit from object by default.

```diff
-class OpUtil(object):
+class OpUtil:
```
`474-474`: **Remove unused variables**

The variables `python_script_name` and `r_script_name` are assigned but never used.

```diff
 # In PythonFileOp.execute():
 python_script = os.path.basename(self.filepath)
-python_script_name = python_script.replace(".py", "")
 # In RFileOp.execute():
 r_script = os.path.basename(self.filepath)
-r_script_name = r_script.replace(".r", "")
```

Also applies to: 521-521

`769-769`: **Add newline at end of file**

Add a newline character at the end of the file to follow POSIX standards.
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (6)
- `runtimes/datascience/ubi9-python-3.12/Pipfile.lock` is excluded by `!**/*.lock`
- `runtimes/minimal/ubi9-python-3.12/Pipfile.lock` is excluded by `!**/*.lock`
- `runtimes/pytorch/ubi9-python-3.12/Pipfile.lock` is excluded by `!**/*.lock`
- `runtimes/rocm-pytorch/ubi9-python-3.12/Pipfile.lock` is excluded by `!**/*.lock`
- `runtimes/rocm-tensorflow/ubi9-python-3.12/Pipfile.lock` is excluded by `!**/*.lock`
- `runtimes/tensorflow/ubi9-python-3.12/Pipfile.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (49)
Makefile
(2 hunks)runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
(1 hunks)runtimes/datascience/ubi9-python-3.12/Pipfile
(1 hunks)runtimes/datascience/ubi9-python-3.12/kustomize/base/kustomization.yaml
(1 hunks)runtimes/datascience/ubi9-python-3.12/kustomize/base/pod.yaml
(1 hunks)runtimes/datascience/ubi9-python-3.12/utils/bootstrapper.py
(1 hunks)runtimes/datascience/ubi9-python-3.12/utils/pip.conf
(1 hunks)runtimes/datascience/ubi9-python-3.12/utils/requirements-elyra.txt
(1 hunks)runtimes/minimal/ubi9-python-3.12/Dockerfile.cpu
(1 hunks)runtimes/minimal/ubi9-python-3.12/Pipfile
(1 hunks)runtimes/minimal/ubi9-python-3.12/kustomize/base/kustomization.yaml
(1 hunks)runtimes/minimal/ubi9-python-3.12/kustomize/base/pod.yaml
(1 hunks)runtimes/minimal/ubi9-python-3.12/requirements.txt
(1 hunks)runtimes/minimal/ubi9-python-3.12/utils/bootstrapper.py
(1 hunks)runtimes/minimal/ubi9-python-3.12/utils/pip.conf
(1 hunks)runtimes/minimal/ubi9-python-3.12/utils/requirements-elyra.txt
(1 hunks)runtimes/pytorch/ubi9-python-3.12/Dockerfile.cuda
(1 hunks)runtimes/pytorch/ubi9-python-3.12/Pipfile
(1 hunks)runtimes/pytorch/ubi9-python-3.12/kustomize/base/kustomization.yaml
(1 hunks)runtimes/pytorch/ubi9-python-3.12/kustomize/base/pod.yaml
(1 hunks)runtimes/pytorch/ubi9-python-3.12/kustomize/components/accelerator/kustomization.yaml
(1 hunks)runtimes/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml
(1 hunks)runtimes/pytorch/ubi9-python-3.12/kustomize/overlays/accelerator/cuda/kustomization.yaml
(1 hunks)runtimes/pytorch/ubi9-python-3.12/kustomize/overlays/accelerator/cuda/pod-patch.yaml
(1 hunks)runtimes/pytorch/ubi9-python-3.12/utils/bootstrapper.py
(1 hunks)runtimes/pytorch/ubi9-python-3.12/utils/pip.conf
(1 hunks)runtimes/pytorch/ubi9-python-3.12/utils/requirements-elyra.txt
(1 hunks)runtimes/rocm-pytorch/ubi9-python-3.12/Dockerfile.rocm
(1 hunks)runtimes/rocm-pytorch/ubi9-python-3.12/Pipfile
(1 hunks)runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh
(1 hunks)runtimes/rocm-pytorch/ubi9-python-3.12/kustomize/base/kustomization.yaml
(1 hunks)runtimes/rocm-pytorch/ubi9-python-3.12/kustomize/base/pod.yaml
(1 hunks)runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py
(1 hunks)runtimes/rocm-pytorch/ubi9-python-3.12/utils/pip.conf
(1 hunks)runtimes/rocm-pytorch/ubi9-python-3.12/utils/requirements-elyra.txt
(1 hunks)runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm
(1 hunks)runtimes/rocm-tensorflow/ubi9-python-3.12/Pipfile
(1 hunks)runtimes/rocm-tensorflow/ubi9-python-3.12/kustomize/base/kustomization.yaml
(1 hunks)runtimes/rocm-tensorflow/ubi9-python-3.12/kustomize/base/pod.yaml
(1 hunks)runtimes/rocm-tensorflow/ubi9-python-3.12/utils/bootstrapper.py
(1 hunks)runtimes/rocm-tensorflow/ubi9-python-3.12/utils/pip.conf
(1 hunks)runtimes/rocm-tensorflow/ubi9-python-3.12/utils/requirements-elyra.txt
(1 hunks)runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda
(1 hunks)runtimes/tensorflow/ubi9-python-3.12/Pipfile
(1 hunks)runtimes/tensorflow/ubi9-python-3.12/kustomize/base/kustomization.yaml
(1 hunks)runtimes/tensorflow/ubi9-python-3.12/kustomize/base/pod.yaml
(1 hunks)runtimes/tensorflow/ubi9-python-3.12/utils/bootstrapper.py
(1 hunks)runtimes/tensorflow/ubi9-python-3.12/utils/pip.conf
(1 hunks)runtimes/tensorflow/ubi9-python-3.12/utils/requirements-elyra.txt
(1 hunks)
🧰 Additional context used
🧠 Learnings (49)
📓 Common learnings
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. A comprehensive issue was created covering race conditions, failure detection, process lifecycle coupling, and signal handling with detailed problem descriptions, multiple solution options, phased acceptance criteria, testing approach, and proper context linking, following the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. Issue #1298 was successfully created with comprehensive problem description covering race conditions, failure detection, orphaned processes, and signal handling, along with multiple solution options, phased acceptance criteria, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/api/kernels/access.cgi:6-6
Timestamp: 2025-07-03T16:17:05.475Z
Learning: jiridanek requested GitHub issue creation for CGI script health-check URL configurability and timeout improvement in codeserver/ubi9-python-3.12/nginx/api/kernels/access.cgi during PR #1269 review. The request follows the established pattern of systematic code quality improvements with comprehensive issue creation covering problem description, solution details, acceptance criteria, implementation guidance, and proper context linking.
Learnt from: atheo89
PR: opendatahub-io/notebooks#1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:32-32
Timestamp: 2025-07-07T11:08:48.524Z
Learning: atheo89 requested GitHub issue creation for multi-architecture Dockerfile improvements during PR #1258 review, specifically for enhancing structural consistency across Docker stages, replacing $(uname -m) with ${TARGETARCH} for cross-architecture builds, and adding OCI-compliant metadata labels. Issue #1332 was created with comprehensive problem description, phased implementation approach, detailed acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user:4-9
Timestamp: 2025-07-03T16:05:35.448Z
Learning: jiridanek requested GitHub issue creation for shell script error handling improvements in codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user during PR #1269 review. A comprehensive issue was created covering silent failures, unquoted variable expansions, missing template validation, and strict mode implementation with detailed problem descriptions, phased acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:1-2
Timestamp: 2025-07-03T16:08:47.251Z
Learning: jiridanek requested GitHub issue creation for shell strict mode improvement in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. Issue #1310 was created with comprehensive problem description covering silent failures, production risks, implementation guidance with code examples, acceptance criteria, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements for the codeserver image entrypoint script.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:71-88
Timestamp: 2025-07-04T06:05:30.580Z
Learning: jiridanek requested GitHub issue creation for TrustyAI test notebook URL configurability and network error handling improvements during PR #1306 review. Issue #1323 was created with ⚠️ emoji in title for visibility, comprehensive problem description covering incorrect hardcoded URLs (pointing to Python 3.11 instead of 3.12), missing network error handling, maintenance burden, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1247
File: .github/workflows/build-notebooks-TEMPLATE.yaml:50-53
Timestamp: 2025-07-01T14:36:52.852Z
Learning: In the opendatahub-io/notebooks repository, the test runner's Python version (configured in GitHub Actions UV setup) intentionally doesn't need to match the Python version of the container images being tested. jiridanek's team uses Python 3.12 for running tests while images may use different Python versions (like 3.11), and this approach works fine since the test code is separate from the application code running inside the containers.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow with runtime ROCm configuration is the recommended alternative approach.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. The tensorflow-rocm upstream project appears abandoned with the last release in 2019. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow 2.18+ with runtime ROCm configuration is the recommended and industry-standard approach, as modern TensorFlow automatically detects and utilizes ROCm when properly installed.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: TrustyAI's jupyter-bokeh was pinned to 3.0.5 due to compatibility requirements with TrustyAI's visualization components, but the actual deployed version in requirements.txt shows 3.0.7, indicating incremental testing. The upgrade to 4.0.5 in this PR represents the completion of a gradual migration strategy from the 3.x series after confirming compatibility with Bokeh 3.7.3.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: In the opendatahub-io/notebooks repository, TensorFlow packages with `extras = ["and-cuda"]` can cause build conflicts on macOS due to platform-specific CUDA packages. When the Dockerfile installs CUDA system-wide, removing the extras and letting TensorFlow find CUDA at runtime resolves these conflicts.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: The jupyter-bokeh package was previously pinned to version 3.0.5 in the TrustyAI notebook image due to compatibility requirements with TrustyAI components, as indicated by the comment "Should be pinned down to this version in order to be compatible with trustyai" that was removed in this update.
runtimes/rocm-tensorflow/ubi9-python-3.12/utils/pip.conf (9)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: In the opendatahub-io/notebooks repository, TensorFlow packages with `extras = ["and-cuda"]` can cause build conflicts on macOS due to platform-specific CUDA packages. When the Dockerfile installs CUDA system-wide, removing the extras and letting TensorFlow find CUDA at runtime resolves these conflicts.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/serverconf/proxy.conf.template:23-32
Timestamp: 2025-07-03T16:15:19.673Z
Learning: jiridanek requested GitHub issue creation for FastCGI configuration improvement in codeserver/ubi9-python-3.12/nginx/serverconf/proxy.conf.template during PR #1269 review, specifically for replacing hard-coded /opt/app-root path with $document_root variable to maintain DRY principle and improve maintainability. Issue #1311 was created with comprehensive architectural overview of FastCGI, NGINX, and supervisord integration for health checking and activity monitoring, designed for compatibility with JupyterHub idle culler and Kubeflow notebook controller culling systems.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.
runtimes/rocm-pytorch/ubi9-python-3.12/utils/pip.conf (7)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/serverconf/proxy.conf.template:23-32
Timestamp: 2025-07-03T16:15:19.673Z
Learning: jiridanek requested GitHub issue creation for FastCGI configuration improvement in codeserver/ubi9-python-3.12/nginx/serverconf/proxy.conf.template during PR #1269 review, specifically for replacing hard-coded /opt/app-root path with $document_root variable to maintain DRY principle and improve maintainability. Issue #1311 was created with comprehensive architectural overview of FastCGI, NGINX, and supervisord integration for health checking and activity monitoring, designed for compatibility with JupyterHub idle culler and Kubeflow notebook controller culling systems.
runtimes/rocm-pytorch/ubi9-python-3.12/utils/requirements-elyra.txt (6)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
runtimes/tensorflow/ubi9-python-3.12/utils/requirements-elyra.txt (5)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
runtimes/datascience/ubi9-python-3.12/utils/requirements-elyra.txt (6)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:71-76
Timestamp: 2025-07-04T06:04:43.085Z
Learning: jiridanek requested GitHub issue creation for duplicate CSV loading and validation problem in jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb during PR #1306 review. Issue #1322 was created with comprehensive problem description covering code redundancy, runtime failure risks, network inefficiency, and test reliability concerns, along with detailed solution including duplicate line removal, data validation implementation, repository-wide audit, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
runtimes/pytorch/ubi9-python-3.12/kustomize/components/accelerator/kustomization.yaml (2)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
runtimes/pytorch/ubi9-python-3.12/kustomize/overlays/accelerator/cuda/kustomization.yaml (5)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/rocm/pytorch/ubi9-python-3.11/requirements.txt:987-989
Timestamp: 2025-06-28T15:06:03.344Z
Learning: In the opendatahub-io/notebooks repository, checks for broken links and missing files in kustomization manifests are already performed by `ci/kustomize.sh`, which is invoked from `.github/workflows/code-quality.yaml` (lines 112–116). No additional pytest is needed for this purpose.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
runtimes/rocm-pytorch/ubi9-python-3.12/kustomize/base/kustomization.yaml (5)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/rocm/pytorch/ubi9-python-3.11/requirements.txt:987-989
Timestamp: 2025-06-28T15:06:03.344Z
Learning: In the opendatahub-io/notebooks repository, checks for broken links and missing files in kustomization manifests are already performed by `ci/kustomize.sh`, which is invoked from `.github/workflows/code-quality.yaml` (lines 112–116). No additional pytest is needed for this purpose.
runtimes/rocm-pytorch/ubi9-python-3.12/kustomize/base/pod.yaml (7)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
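A container-level `securityContext` that satisfies those PodSecurity constraints looks roughly like the sketch below; the field names follow the Kubernetes API, but the block is illustrative rather than copied from the manifests in this PR.

```yaml
# Sketch of the securityContext settings the learning says must be present
# for runtime pods to pass the restricted PodSecurity profile on OpenShift.
securityContext:
  allowPrivilegeEscalation: false
  runAsNonRoot: true
  capabilities:
    drop: ["ALL"]
  seccompProfile:
    type: RuntimeDefault
```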
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T07:11:44.903Z
Learning: Resource limits in StatefulSet manifests in opendatahub-io/notebooks are configured for testing purposes, not production deployments. This affects risk assessment when reviewing resource configurations like memory and CPU limits.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1154
File: manifests/base/jupyter-tensorflow-notebook-imagestream.yaml:45-45
Timestamp: 2025-06-13T08:34:01.300Z
Learning: When updating dependency versions in `manifests/base/*-imagestream.yaml`, the project convention is to modify only the newest tag (e.g., "2025.1") and intentionally leave earlier tags (e.g., "2024.2") unchanged.
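Schematically, that convention means touching only the newest tag block of the ImageStream; the sketch below uses placeholder names and digests to show the shape, not the repository's actual values.

```yaml
# Illustration of the "update only the newest tag" convention (placeholders).
apiVersion: image.openshift.io/v1
kind: ImageStream
metadata:
  name: jupyter-tensorflow-notebook          # placeholder name
spec:
  tags:
    - name: "2025.1"                         # newest tag: updated to the new image
      from:
        kind: DockerImage
        name: quay.io/example/tensorflow-notebook@sha256:<new-digest>
    - name: "2024.2"                         # earlier tag: intentionally left unchanged
      from:
        kind: DockerImage
        name: quay.io/example/tensorflow-notebook@sha256:<old-digest>
```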
runtimes/datascience/ubi9-python-3.12/kustomize/base/kustomization.yaml (6)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/rocm/pytorch/ubi9-python-3.11/requirements.txt:987-989
Timestamp: 2025-06-28T15:06:03.344Z
Learning: In the opendatahub-io/notebooks repository, checks for broken links and missing files in kustomization manifests are already performed by `ci/kustomize.sh`, which is invoked from `.github/workflows/code-quality.yaml` (lines 112–116). No additional pytest is needed for this purpose.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1151
File: jupyter/tensorflow/ubi9-python-3.12/kustomize/base/statefulset.yaml:11-17
Timestamp: 2025-07-01T06:50:37.115Z
Learning: jiridanek manages StatefulSet selector issues systematically across multiple images in opendatahub-io/notebooks. When the same configuration issue (empty spec.selector and template.metadata.labels) appears in different images like jupyter/minimal and jupyter/tensorflow, he tracks them under a single coordinated effort rather than creating duplicate issues for each affected image.
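The empty-selector issue tracked there comes down to keeping `spec.selector.matchLabels` and `template.metadata.labels` populated and in sync; a minimal sketch follows, with the label key and value as placeholders.

```yaml
# Sketch of a StatefulSet selector/labels pairing (placeholder label values).
spec:
  selector:
    matchLabels:
      app: notebook-runtime        # placeholder label
  template:
    metadata:
      labels:
        app: notebook-runtime      # must match the selector above
```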
runtimes/tensorflow/ubi9-python-3.12/kustomize/base/kustomization.yaml (6)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
runtimes/tensorflow/ubi9-python-3.12/utils/pip.conf (7)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: In the opendatahub-io/notebooks repository, TensorFlow packages with `extras = ["and-cuda"]` can cause build conflicts on macOS due to platform-specific CUDA packages. When the Dockerfile installs CUDA system-wide, removing the extras and letting TensorFlow find CUDA at runtime resolves these conflicts.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/serverconf/proxy.conf.template:23-32
Timestamp: 2025-07-03T16:15:19.673Z
Learning: jiridanek requested GitHub issue creation for FastCGI configuration improvement in codeserver/ubi9-python-3.12/nginx/serverconf/proxy.conf.template during PR #1269 review, specifically for replacing hard-coded /opt/app-root path with $document_root variable to maintain DRY principle and improve maintainability. Issue #1311 was created with comprehensive architectural overview of FastCGI, NGINX, and supervisord integration for health checking and activity monitoring, designed for compatibility with JupyterHub idle culler and Kubeflow notebook controller culling systems.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.
runtimes/pytorch/ubi9-python-3.12/utils/pip.conf (7)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/serverconf/proxy.conf.template:23-32
Timestamp: 2025-07-03T16:15:19.673Z
Learning: jiridanek requested GitHub issue creation for FastCGI configuration improvement in codeserver/ubi9-python-3.12/nginx/serverconf/proxy.conf.template during PR #1269 review, specifically for replacing hard-coded /opt/app-root path with $document_root variable to maintain DRY principle and improve maintainability. Issue #1311 was created with comprehensive architectural overview of FastCGI, NGINX, and supervisord integration for health checking and activity monitoring, designed for compatibility with JupyterHub idle culler and Kubeflow notebook controller culling systems.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/serverconf/proxy.conf.template:23-32
Timestamp: 2025-07-03T16:15:19.673Z
Learning: jiridanek requested GitHub issue creation for FastCGI configuration improvement in codeserver/ubi9-python-3.12/nginx/serverconf/proxy.conf.template during PR #1269 review, specifically for replacing hard-coded /opt/app-root path with $document_root variable to maintain DRY principle and improve maintainability. The issue includes comprehensive architectural overview of FastCGI, NGINX, and supervisord integration for health checking and activity monitoring, designed for compatibility with JupyterHub idle culler and Kubeflow notebook controller culling systems.
runtimes/minimal/ubi9-python-3.12/kustomize/base/pod.yaml (6)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T07:11:44.903Z
Learning: Resource limits in StatefulSet manifests in opendatahub-io/notebooks are configured for testing purposes, not production deployments. This affects risk assessment when reviewing resource configurations like memory and CPU limits.
runtimes/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml (4)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T07:11:44.903Z
Learning: Resource limits in StatefulSet manifests in opendatahub-io/notebooks are configured for testing purposes, not production deployments. This affects risk assessment when reviewing resource configurations like memory and CPU limits.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
runtimes/datascience/ubi9-python-3.12/utils/pip.conf (6)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/serverconf/proxy.conf.template:23-32
Timestamp: 2025-07-03T16:15:19.673Z
Learning: jiridanek requested GitHub issue creation for FastCGI configuration improvement in codeserver/ubi9-python-3.12/nginx/serverconf/proxy.conf.template during PR #1269 review, specifically for replacing hard-coded /opt/app-root path with $document_root variable to maintain DRY principle and improve maintainability. Issue #1311 was created with comprehensive architectural overview of FastCGI, NGINX, and supervisord integration for health checking and activity monitoring, designed for compatibility with JupyterHub idle culler and Kubeflow notebook controller culling systems.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
runtimes/minimal/ubi9-python-3.12/utils/pip.conf (5)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/serverconf/proxy.conf.template:23-32
Timestamp: 2025-07-03T16:15:19.673Z
Learning: jiridanek requested GitHub issue creation for FastCGI configuration improvement in codeserver/ubi9-python-3.12/nginx/serverconf/proxy.conf.template during PR #1269 review, specifically for replacing hard-coded /opt/app-root path with $document_root variable to maintain DRY principle and improve maintainability. Issue #1311 was created with comprehensive architectural overview of FastCGI, NGINX, and supervisord integration for health checking and activity monitoring, designed for compatibility with JupyterHub idle culler and Kubeflow notebook controller culling systems.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
runtimes/minimal/ubi9-python-3.12/Pipfile (10)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/datascience/ubi9-python-3.11/Pipfile:34-36
Timestamp: 2025-06-28T14:13:27.890Z
Learning: In the opendatahub-io/notebooks repository, the dependency pinning strategy follows a deliberate pattern: core `jupyterlab` package uses exact pinning (==) across all notebook images to ensure UI consistency, while JupyterLab extensions and all server components (jupyter-server, jupyter-server-proxy, jupyter-server-terminals) use compatible release (~=) pinning to allow automatic security updates and bug fixes while maintaining API compatibility.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: The jupyter-bokeh package was previously pinned to version 3.0.5 in the TrustyAI notebook image due to compatibility requirements with TrustyAI components, as indicated by the comment "Should be pinned down to this version in order to be compatible with trustyai" that was removed in this update.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1247
File: .github/workflows/build-notebooks-TEMPLATE.yaml:50-53
Timestamp: 2025-07-01T14:36:52.852Z
Learning: In the opendatahub-io/notebooks repository, the test runner's Python version (configured in GitHub Actions UV setup) intentionally doesn't need to match the Python version of the container images being tested. jiridanek's team uses Python 3.12 for running tests while images may use different Python versions (like 3.11), and this approach works fine since the test code is separate from the application code running inside the containers.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: The jupyter-bokeh pinning to 3.0.5 in TrustyAI notebook image was not due to TrustyAI code compatibility issues, but because the trustyai package itself explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency, causing pip dependency resolution conflicts when trying to upgrade to jupyter-bokeh 4.x.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI 0.6.1 (latest version as of June 2025) has a hard dependency constraint on jupyter-bokeh~=3.0.5, preventing upgrades to jupyter-bokeh 4.x in notebook images that include TrustyAI. This requires either waiting for TrustyAI to update their dependency or excluding TrustyAI from jupyter-bokeh upgrades.
runtimes/pytorch/ubi9-python-3.12/utils/requirements-elyra.txt (6)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:71-76
Timestamp: 2025-07-04T06:04:43.085Z
Learning: jiridanek requested GitHub issue creation for duplicate CSV loading and validation problem in jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb during PR #1306 review. Issue #1322 was created with comprehensive problem description covering code redundancy, runtime failure risks, network inefficiency, and test reliability concerns, along with detailed solution including duplicate line removal, data validation implementation, repository-wide audit, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
runtimes/pytorch/ubi9-python-3.12/kustomize/base/pod.yaml (6)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T07:11:44.903Z
Learning: Resource limits in StatefulSet manifests in opendatahub-io/notebooks are configured for testing purposes, not production deployments. This affects risk assessment when reviewing resource configurations like memory and CPU limits.
runtimes/minimal/ubi9-python-3.12/utils/requirements-elyra.txt (6)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:71-76
Timestamp: 2025-07-04T06:04:43.085Z
Learning: jiridanek requested GitHub issue creation for duplicate CSV loading and validation problem in jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb during PR #1306 review. Issue #1322 was created with comprehensive problem description covering code redundancy, runtime failure risks, network inefficiency, and test reliability concerns, along with detailed solution including duplicate line removal, data validation implementation, repository-wide audit, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
runtimes/pytorch/ubi9-python-3.12/kustomize/base/kustomization.yaml (8)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/rocm/pytorch/ubi9-python-3.11/requirements.txt:987-989
Timestamp: 2025-06-28T15:06:03.344Z
Learning: In the opendatahub-io/notebooks repository, checks for broken links and missing files in kustomization manifests are already performed by `ci/kustomize.sh`, which is invoked from `.github/workflows/code-quality.yaml` (lines 112–116). No additional pytest is needed for this purpose.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
runtimes/datascience/ubi9-python-3.12/Pipfile (10)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1247
File: .github/workflows/build-notebooks-TEMPLATE.yaml:50-53
Timestamp: 2025-07-01T14:36:52.852Z
Learning: In the opendatahub-io/notebooks repository, the test runner's Python version (configured in GitHub Actions UV setup) intentionally doesn't need to match the Python version of the container images being tested. jiridanek's team uses Python 3.12 for running tests while images may use different Python versions (like 3.11), and this approach works fine since the test code is separate from the application code running inside the containers.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/datascience/ubi9-python-3.11/Pipfile:34-36
Timestamp: 2025-06-28T14:13:27.890Z
Learning: In the opendatahub-io/notebooks repository, the dependency pinning strategy follows a deliberate pattern: core `jupyterlab` package uses exact pinning (==) across all notebook images to ensure UI consistency, while JupyterLab extensions and all server components (jupyter-server, jupyter-server-proxy, jupyter-server-terminals) use compatible release (~=) pinning to allow automatic security updates and bug fixes while maintaining API compatibility.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: The jupyter-bokeh package was previously pinned to version 3.0.5 in the TrustyAI notebook image due to compatibility requirements with TrustyAI components, as indicated by the comment "Should be pinned down to this version in order to be compatible with trustyai" that was removed in this update.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: The jupyter-bokeh pinning to 3.0.5 in TrustyAI notebook image was not due to TrustyAI code compatibility issues, but because the trustyai package itself explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency, causing pip dependency resolution conflicts when trying to upgrade to jupyter-bokeh 4.x.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI 0.6.1 (latest version as of June 2025) has a hard dependency constraint on jupyter-bokeh~=3.0.5, preventing upgrades to jupyter-bokeh 4.x in notebook images that include TrustyAI. This requires either waiting for TrustyAI to update their dependency or excluding TrustyAI from jupyter-bokeh upgrades.
runtimes/rocm-tensorflow/ubi9-python-3.12/kustomize/base/pod.yaml (6)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail to reach the ready state and time out after 300s on OpenShift due to PodSecurity policy violations.
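A quick way to catch this misconfiguration before it hits OpenShift is to check the rendered manifests for the restricted-profile securityContext fields. A minimal sketch, assuming PyYAML is available and using one of this PR's pod manifests as the example input:

```python
import yaml  # PyYAML

# Example input; any rendered pod/StatefulSet manifest can be checked the same way.
MANIFEST = "runtimes/rocm-tensorflow/ubi9-python-3.12/kustomize/base/pod.yaml"
REQUIRED = ("allowPrivilegeEscalation", "runAsNonRoot", "seccompProfile")

with open(MANIFEST) as f:
    pod = yaml.safe_load(f)

for container in pod["spec"]["containers"]:
    # Note: runAsNonRoot may also be set at pod level; this sketch only
    # inspects the container-level securityContext.
    ctx = container.get("securityContext", {})
    missing = [field for field in REQUIRED if field not in ctx]
    if missing:
        print(f"{container['name']}: missing securityContext fields: {missing}")
```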
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T07:11:44.903Z
Learning: Resource limits in StatefulSet manifests in opendatahub-io/notebooks are configured for testing purposes, not production deployments. This affects risk assessment when reviewing resource configurations like memory and CPU limits.
runtimes/tensorflow/ubi9-python-3.12/kustomize/base/pod.yaml (7)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail to reach the ready state and time out after 300s on OpenShift due to PodSecurity policy violations.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1154
File: manifests/base/jupyter-tensorflow-notebook-imagestream.yaml:45-45
Timestamp: 2025-06-13T08:34:01.300Z
Learning: When updating dependency versions in `manifests/base/*-imagestream.yaml`, the project convention is to modify only the newest tag (e.g., "2025.1") and intentionally leave earlier tags (e.g., "2024.2") unchanged.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T07:11:44.903Z
Learning: Resource limits in StatefulSet manifests in opendatahub-io/notebooks are configured for testing purposes, not production deployments. This affects risk assessment when reviewing resource configurations like memory and CPU limits.
runtimes/pytorch/ubi9-python-3.12/kustomize/overlays/accelerator/cuda/pod-patch.yaml (6)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T07:11:44.903Z
Learning: Resource limits in StatefulSet manifests in opendatahub-io/notebooks are configured for testing purposes, not production deployments. This affects risk assessment when reviewing resource configurations like memory and CPU limits.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail to reach the ready state and time out after 300s on OpenShift due to PodSecurity policy violations.
Makefile (11)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1247
File: .github/workflows/build-notebooks-TEMPLATE.yaml:50-53
Timestamp: 2025-07-01T14:36:52.852Z
Learning: In the opendatahub-io/notebooks repository, the test runner's Python version (configured in GitHub Actions UV setup) intentionally doesn't need to match the Python version of the container images being tested. jiridanek's team uses Python 3.12 for running tests while images may use different Python versions (like 3.11), and this approach works fine since the test code is separate from the application code running inside the containers.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.
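The wheel-availability problem is easy to confirm against PyPI's JSON API. A minimal sketch (network access assumed) that lists which CPython tags `tensorflow-rocm` ships wheels for:

```python
import json
import urllib.request

url = "https://pypi.org/pypi/tensorflow-rocm/json"
with urllib.request.urlopen(url) as resp:
    data = json.load(resp)

tags = set()
for files in data["releases"].values():
    for f in files:
        if f["filename"].endswith(".whl"):
            # Naive parse of pkg-version-cpXY-cpXY-platform.whl (assumes no build tag).
            tags.add(f["filename"].split("-")[2])

# Per the learning above, the output contains cp39/cp310/cp311 but no cp312.
print(sorted(tags))
```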
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1154
File: manifests/base/jupyter-tensorflow-notebook-imagestream.yaml:45-45
Timestamp: 2025-06-13T08:34:01.300Z
Learning: When updating dependency versions in `manifests/base/*-imagestream.yaml`, the project convention is to modify only the newest tag (e.g., "2025.1") and intentionally leave earlier tags (e.g., "2024.2") unchanged.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
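The containerized locking workflow above can be reproduced with a small wrapper. The builder image tag below is an assumption, and the team's extra Python-version argument is omitted; the essential parts are `--platform=linux/amd64` and running `pipenv lock` inside the UBI9 container rather than on the host.

```python
import subprocess

# Directory containing the Pipfile to re-lock (one of the new 3.12 runtimes).
pipfile_dir = "runtimes/tensorflow/ubi9-python-3.12"

subprocess.run(
    [
        "podman", "run", "--rm",
        "--platform=linux/amd64",          # lock for the target platform, not the host OS
        "-v", f"{pipfile_dir}:/work:Z",    # mount the directory holding the Pipfile
        "-w", "/work",
        "registry.access.redhat.com/ubi9/python-312:latest",  # assumed builder image
        "sh", "-c", "pip install pipenv && pipenv lock",
    ],
    check=True,
)
```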
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. The tensorflow-rocm upstream project appears abandoned with the last release in 2019. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow 2.18+ with runtime ROCm configuration is the recommended and industry-standard approach, as modern TensorFlow automatically detects and utilizes ROCm when properly installed.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow with runtime ROCm configuration is the recommended alternative approach.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: In the opendatahub-io/notebooks repository, TensorFlow packages with `extras = ["and-cuda"]` can cause build conflicts on macOS due to platform-specific CUDA packages. When the Dockerfile installs CUDA system-wide, removing the extras and letting TensorFlow find CUDA at runtime resolves these conflicts.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1154
File: manifests/base/jupyter-pytorch-notebook-imagestream.yaml:0-0
Timestamp: 2025-06-16T11:06:33.139Z
Learning: In the opendatahub-io/notebooks repository, N-1 versions of images in manifest files (like imagestream.yaml files) should not be updated regularly. The versions of packages like codeflare-sdk in N-1 images are frozen to what was released when the image was moved from N to N-1 version. N-1 images are only updated for security vulnerabilities of packages, not for regular version bumps. This is why the version of packages in N-1 images may be quite old compared to the latest N version.
runtimes/tensorflow/ubi9-python-3.12/Pipfile (10)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: In the opendatahub-io/notebooks repository, TensorFlow packages with `extras = ["and-cuda"]` can cause build conflicts on macOS due to platform-specific CUDA packages. When the Dockerfile installs CUDA system-wide, removing the extras and letting TensorFlow find CUDA at runtime resolves these conflicts.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow with runtime ROCm configuration is the recommended alternative approach.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. The tensorflow-rocm upstream project appears abandoned with the last release in 2019. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow 2.18+ with runtime ROCm configuration is the recommended and industry-standard approach, as modern TensorFlow automatically detects and utilizes ROCm when properly installed.
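Because the accelerator stack is supplied by the image rather than by a vendored `tensorflow_rocm` build, the practical check happens at runtime: if the ROCm (or CUDA) libraries are present and healthy, stock TensorFlow reports the GPUs. A minimal sanity check along those lines:

```python
import tensorflow as tf

# Stock TensorFlow discovers whichever accelerator runtime the image provides
# (CUDA or ROCm); an empty list here means it fell back to CPU.
gpus = tf.config.list_physical_devices("GPU")
print(f"TensorFlow {tf.__version__} sees {len(gpus)} GPU(s): {gpus}")

# Build metadata can help distinguish CUDA from ROCm builds; the exact keys
# present in this dict vary between TensorFlow builds.
print(tf.sysconfig.get_build_info())
```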
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/datascience/ubi9-python-3.11/Pipfile:34-36
Timestamp: 2025-06-28T14:13:27.890Z
Learning: In the opendatahub-io/notebooks repository, the dependency pinning strategy follows a deliberate pattern: core `jupyterlab` package uses exact pinning (==) across all notebook images to ensure UI consistency, while JupyterLab extensions and all server components (jupyter-server, jupyter-server-proxy, jupyter-server-terminals) use compatible release (~=) pinning to allow automatic security updates and bug fixes while maintaining API compatibility.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.
runtimes/rocm-pytorch/ubi9-python-3.12/Pipfile (9)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. The tensorflow-rocm upstream project appears abandoned with the last release in 2019. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow 2.18+ with runtime ROCm configuration is the recommended and industry-standard approach, as modern TensorFlow automatically detects and utilizes ROCm when properly installed.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow with runtime ROCm configuration is the recommended alternative approach.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/datascience/ubi9-python-3.11/Pipfile:34-36
Timestamp: 2025-06-28T14:13:27.890Z
Learning: In the opendatahub-io/notebooks repository, the dependency pinning strategy follows a deliberate pattern: core `jupyterlab` package uses exact pinning (==) across all notebook images to ensure UI consistency, while JupyterLab extensions and all server components (jupyter-server, jupyter-server-proxy, jupyter-server-terminals) use compatible release (~=) pinning to allow automatic security updates and bug fixes while maintaining API compatibility.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.
runtimes/rocm-tensorflow/ubi9-python-3.12/kustomize/base/kustomization.yaml (6)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/rocm/pytorch/ubi9-python-3.11/requirements.txt:987-989
Timestamp: 2025-06-28T15:06:03.344Z
Learning: In the opendatahub-io/notebooks repository, checks for broken links and missing files in kustomization manifests are already performed by `ci/kustomize.sh`, which is invoked from `.github/workflows/code-quality.yaml` (lines 112–116). No additional pytest is needed for this purpose.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail to reach the ready state and time out after 300s on OpenShift due to PodSecurity policy violations.
runtimes/pytorch/ubi9-python-3.12/Dockerfile.cuda (14)
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: In the opendatahub-io/notebooks repository, TensorFlow packages with `extras = ["and-cuda"]` can cause build conflicts on macOS due to platform-specific CUDA packages. When the Dockerfile installs CUDA system-wide, removing the extras and letting TensorFlow find CUDA at runtime resolves these conflicts.
Learnt from: atheo89
PR: opendatahub-io/notebooks#1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:32-32
Timestamp: 2025-07-07T11:08:48.524Z
Learning: atheo89 requested GitHub issue creation for multi-architecture Dockerfile improvements during PR #1258 review, specifically for enhancing structural consistency across Docker stages, replacing $(uname -m) with ${TARGETARCH} for cross-architecture builds, and adding OCI-compliant metadata labels. Issue #1332 was created with comprehensive problem description, phased implementation approach, detailed acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:40-42
Timestamp: 2025-07-04T17:08:02.399Z
Learning: In the opendatahub-io/notebooks repository, when using multi-architecture Dockerfiles with BuildKit, the implicit build argument TARGETARCH is automatically available in the global scope for FROM instructions without explicit declaration. However, if TARGETARCH is used within a build stage, it must be declared explicitly within that stage. The current placement pattern (declaring ARG TARGETARCH after FROM instructions that use it) is correct for modern Docker/Podman/Buildah environments and does not require compatibility with older Docker versions.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1247
File: .github/workflows/build-notebooks-TEMPLATE.yaml:50-53
Timestamp: 2025-07-01T14:36:52.852Z
Learning: In the opendatahub-io/notebooks repository, the test runner's Python version (configured in GitHub Actions UV setup) intentionally doesn't need to match the Python version of the container images being tested. jiridanek's team uses Python 3.12 for running tests while images may use different Python versions (like 3.11), and this approach works fine since the test code is separate from the application code running inside the containers.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: jupyter/minimal/ubi9-python-3.11/Dockerfile.cuda:29-38
Timestamp: 2025-07-04T17:07:52.656Z
Learning: In the opendatahub-io/notebooks repository, modern Docker with BuildKit automatically provides build arguments like TARGETARCH in the global scope for FROM instructions, but these arguments must be explicitly declared with ARG statements inside build stages where they will be used. The ARG declaration should be placed within the stage that uses it, not moved to the global scope, as this is the correct pattern for modern Docker/Podman/Buildah environments.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:4-4
Timestamp: 2025-07-04T05:49:10.314Z
Learning: jiridanek directs base image pinning security concerns to existing comprehensive issue #1242 "Improve Docker FROM image versioning by avoiding :latest tags" rather than addressing them in individual PRs, continuing the established pattern of systematic security and quality tracking in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:35:34.805Z
Learning: In the opendatahub-io/notebooks repository, mounting emptyDir volumes over /opt/app-root/src is intentional behavior that matches production deployment patterns where odh-dashboard mounts empty PVCs at this location (the $HOME directory). This mounting is expected to hide base image content.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm:34-37
Timestamp: 2025-07-02T18:19:23.024Z
Learning: In the opendatahub-io/notebooks repository, issue #1241 "Security: Add checksum verification for downloaded binaries in Python 3.12 images" covers the security concern about verifying downloaded binaries including the oc client. Red Hat does not provide signed RPM packages for the OpenShift CLI (oc) in UBI9 official repositories, making manual binary download with checksum verification the recommended approach.
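The checksum verification tracked in issue #1241 comes down to comparing the digest of the downloaded archive against a published value before unpacking it. A minimal stdlib sketch; the file name and expected digest below are placeholders:

```python
import hashlib
from pathlib import Path

def verify_sha256(path: str, expected: str) -> None:
    """Fail loudly if the downloaded file does not match its published digest."""
    digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
    if digest != expected.lower():
        raise SystemExit(f"Checksum mismatch for {path}: got {digest}, expected {expected}")
    print(f"{path}: sha256 OK")

# Placeholder values - the real digest comes from the vendor's published sha256sum file.
verify_sha256("openshift-client-linux.tar.gz", "0" * 64)
```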
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail to reach the ready state and time out after 300s on OpenShift due to PodSecurity policy violations.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: OpenShift CI infrastructure issues in opendatahub-io/notebooks can manifest as "ContainerFailed one or more containers exited" errors in release steps, or as "Entrypoint received interrupt: terminated" messages when pods are killed during CI runs. These are typically infrastructure-level issues rather than code problems.
runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda (14)
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: In the opendatahub-io/notebooks repository, TensorFlow packages with `extras = ["and-cuda"]` can cause build conflicts on macOS due to platform-specific CUDA packages. When the Dockerfile installs CUDA system-wide, removing the extras and letting TensorFlow find CUDA at runtime resolves these conflicts.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: atheo89
PR: opendatahub-io/notebooks#1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:32-32
Timestamp: 2025-07-07T11:08:48.524Z
Learning: atheo89 requested GitHub issue creation for multi-architecture Dockerfile improvements during PR #1258 review, specifically for enhancing structural consistency across Docker stages, replacing $(uname -m) with ${TARGETARCH} for cross-architecture builds, and adding OCI-compliant metadata labels. Issue #1332 was created with comprehensive problem description, phased implementation approach, detailed acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:40-42
Timestamp: 2025-07-04T17:08:02.399Z
Learning: In the opendatahub-io/notebooks repository, when using multi-architecture Dockerfiles with BuildKit, the implicit build argument TARGETARCH is automatically available in the global scope for FROM instructions without explicit declaration. However, if TARGETARCH is used within a build stage, it must be declared explicitly within that stage. The current placement pattern (declaring ARG TARGETARCH after FROM instructions that use it) is correct for modern Docker/Podman/Buildah environments and does not require compatibility with older Docker versions.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:4-4
Timestamp: 2025-07-04T05:49:10.314Z
Learning: jiridanek directs base image pinning security concerns to existing comprehensive issue #1242 "Improve Docker FROM image versioning by avoiding :latest tags" rather than addressing them in individual PRs, continuing the established pattern of systematic security and quality tracking in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: jupyter/minimal/ubi9-python-3.11/Dockerfile.cuda:29-38
Timestamp: 2025-07-04T17:07:52.656Z
Learning: In the opendatahub-io/notebooks repository, modern Docker with BuildKit automatically provides build arguments like TARGETARCH in the global scope for FROM instructions, but these arguments must be explicitly declared with ARG statements inside build stages where they will be used. The ARG declaration should be placed within the stage that uses it, not moved to the global scope, as this is the correct pattern for modern Docker/Podman/Buildah environments.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:135-136
Timestamp: 2025-07-04T05:52:49.464Z
Learning: jiridanek requested GitHub issue creation for improving fragile sed-based Jupyter kernel display_name modification in jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu during PR #1306 review. Issue #1321 was created with comprehensive problem description covering JSON corruption risks, greedy regex patterns, maintenance burden, and proposed Python-based JSON parsing solution with detailed acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.
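The Python-based replacement proposed in issue #1321 amounts to editing the kernelspec as structured JSON instead of pattern-matching it with sed. A minimal sketch; the kernelspec path is the typical location in these images but should be treated as an assumption:

```python
import json
from pathlib import Path

# Typical kernelspec location in the UBI9 notebook images (an assumption here).
kernel_json = Path("/opt/app-root/share/jupyter/kernels/python3/kernel.json")

spec = json.loads(kernel_json.read_text())
spec["display_name"] = "Python 3.12"          # no regex, so no risk of corrupting the JSON
kernel_json.write_text(json.dumps(spec, indent=1) + "\n")
```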
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail to reach the ready state and time out after 300s on OpenShift due to PodSecurity policy violations.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm:34-37
Timestamp: 2025-07-02T18:19:23.024Z
Learning: In the opendatahub-io/notebooks repository, issue #1241 "Security: Add checksum verification for downloaded binaries in Python 3.12 images" covers the security concern about verifying downloaded binaries including the oc client. Red Hat does not provide signed RPM packages for the OpenShift CLI (oc) in UBI9 official repositories, making manual binary download with checksum verification the recommended approach.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1154
File: manifests/base/jupyter-pytorch-notebook-imagestream.yaml:0-0
Timestamp: 2025-06-16T11:06:33.139Z
Learning: In the opendatahub-io/notebooks repository, N-1 versions of images in manifest files (like imagestream.yaml files) should not be updated regularly. The versions of packages like codeflare-sdk in N-1 images are frozen to what was released when the image was moved from N to N-1 version. N-1 images are only updated for security vulnerabilities of packages, not for regular version bumps. This is why the version of packages in N-1 images may be quite old compared to the latest N version.
runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh (5)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. The tensorflow-rocm upstream project appears abandoned with the last release in 2019. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow 2.18+ with runtime ROCm configuration is the recommended and industry-standard approach, as modern TensorFlow automatically detects and utilizes ROCm when properly installed.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow with runtime ROCm configuration is the recommended alternative approach.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.
runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm (13)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. The tensorflow-rocm upstream project appears abandoned with the last release in 2019. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow 2.18+ with runtime ROCm configuration is the recommended and industry-standard approach, as modern TensorFlow automatically detects and utilizes ROCm when properly installed.
Learnt from: atheo89
PR: opendatahub-io/notebooks#1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:32-32
Timestamp: 2025-07-07T11:08:48.524Z
Learning: atheo89 requested GitHub issue creation for multi-architecture Dockerfile improvements during PR #1258 review, specifically for enhancing structural consistency across Docker stages, replacing $(uname -m) with ${TARGETARCH} for cross-architecture builds, and adding OCI-compliant metadata labels. Issue #1332 was created with comprehensive problem description, phased implementation approach, detailed acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow with runtime ROCm configuration is the recommended alternative approach.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: In the opendatahub-io/notebooks repository, TensorFlow packages with `extras = ["and-cuda"]` can cause build conflicts on macOS due to platform-specific CUDA packages. When the Dockerfile installs CUDA system-wide, removing the extras and letting TensorFlow find CUDA at runtime resolves these conflicts.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:4-4
Timestamp: 2025-07-04T05:49:10.314Z
Learning: jiridanek directs base image pinning security concerns to existing comprehensive issue #1242 "Improve Docker FROM image versioning by avoiding :latest tags" rather than addressing them in individual PRs, continuing the established pattern of systematic security and quality tracking in opendatahub-io/notebooks.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:40-42
Timestamp: 2025-07-04T17:08:02.399Z
Learning: In the opendatahub-io/notebooks repository, when using multi-architecture Dockerfiles with BuildKit, the implicit build argument TARGETARCH is automatically available in the global scope for FROM instructions without explicit declaration. However, if TARGETARCH is used within a build stage, it must be declared explicitly within that stage. The current placement pattern (declaring ARG TARGETARCH after FROM instructions that use it) is correct for modern Docker/Podman/Buildah environments and does not require compatibility with older Docker versions.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm:34-37
Timestamp: 2025-07-02T18:19:23.024Z
Learning: In the opendatahub-io/notebooks repository, issue #1241 "Security: Add checksum verification for downloaded binaries in Python 3.12 images" covers the security concern about verifying downloaded binaries including the oc client. Red Hat does not provide signed RPM packages for the OpenShift CLI (oc) in UBI9 official repositories, making manual binary download with checksum verification the recommended approach.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail to reach the ready state and time out after 300s on OpenShift due to PodSecurity policy violations.
runtimes/rocm-tensorflow/ubi9-python-3.12/utils/requirements-elyra.txt (8)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. The tensorflow-rocm upstream project appears abandoned with the last release in 2019. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow 2.18+ with runtime ROCm configuration is the recommended and industry-standard approach, as modern TensorFlow automatically detects and utilizes ROCm when properly installed.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
runtimes/rocm-tensorflow/ubi9-python-3.12/Pipfile (9)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. The tensorflow-rocm upstream project appears abandoned with the last release in 2019. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow 2.18+ with runtime ROCm configuration is the recommended and industry-standard approach, as modern TensorFlow automatically detects and utilizes ROCm when properly installed.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow with runtime ROCm configuration is the recommended alternative approach.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: In the opendatahub-io/notebooks repository, TensorFlow packages with `extras = ["and-cuda"]` can cause build conflicts on macOS due to platform-specific CUDA packages. When the Dockerfile installs CUDA system-wide, removing the extras and letting TensorFlow find CUDA at runtime resolves these conflicts.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/datascience/ubi9-python-3.11/Pipfile:34-36
Timestamp: 2025-06-28T14:13:27.890Z
Learning: In the opendatahub-io/notebooks repository, the dependency pinning strategy follows a deliberate pattern: core `jupyterlab` package uses exact pinning (==) across all notebook images to ensure UI consistency, while JupyterLab extensions and all server components (jupyter-server, jupyter-server-proxy, jupyter-server-terminals) use compatible release (~=) pinning to allow automatic security updates and bug fixes while maintaining API compatibility.
runtimes/rocm-pytorch/ubi9-python-3.12/Dockerfile.rocm (13)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: atheo89
PR: opendatahub-io/notebooks#1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:32-32
Timestamp: 2025-07-07T11:08:48.524Z
Learning: atheo89 requested GitHub issue creation for multi-architecture Dockerfile improvements during PR #1258 review, specifically for enhancing structural consistency across Docker stages, replacing $(uname -m) with ${TARGETARCH} for cross-architecture builds, and adding OCI-compliant metadata labels. Issue #1332 was created with comprehensive problem description, phased implementation approach, detailed acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:4-4
Timestamp: 2025-07-04T05:49:10.314Z
Learning: jiridanek directs base image pinning security concerns to existing comprehensive issue #1242 "Improve Docker FROM image versioning by avoiding :latest tags" rather than addressing them in individual PRs, continuing the established pattern of systematic security and quality tracking in opendatahub-io/notebooks.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:40-42
Timestamp: 2025-07-04T17:08:02.399Z
Learning: In the opendatahub-io/notebooks repository, when using multi-architecture Dockerfiles with BuildKit, the implicit build argument TARGETARCH is automatically available in the global scope for FROM instructions without explicit declaration. However, if TARGETARCH is used within a build stage, it must be declared explicitly within that stage. The current placement pattern (declaring ARG TARGETARCH after FROM instructions that use it) is correct for modern Docker/Podman/Buildah environments and does not require compatibility with older Docker versions.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. The tensorflow-rocm upstream project appears abandoned with the last release in 2019. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow 2.18+ with runtime ROCm configuration is the recommended and industry-standard approach, as modern TensorFlow automatically detects and utilizes ROCm when properly installed.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: jupyter/minimal/ubi9-python-3.11/Dockerfile.cuda:29-38
Timestamp: 2025-07-04T17:07:52.656Z
Learning: In the opendatahub-io/notebooks repository, modern Docker with BuildKit automatically provides build arguments like TARGETARCH in the global scope for FROM instructions, but these arguments must be explicitly declared with ARG statements inside build stages where they will be used. The ARG declaration should be placed within the stage that uses it, not moved to the global scope, as this is the correct pattern for modern Docker/Podman/Buildah environments.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow with runtime ROCm configuration is the recommended alternative approach.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm:34-37
Timestamp: 2025-07-02T18:19:23.024Z
Learning: In the opendatahub-io/notebooks repository, issue #1241 "Security: Add checksum verification for downloaded binaries in Python 3.12 images" covers the security concern about verifying downloaded binaries including the oc client. Red Hat does not provide signed RPM packages for the OpenShift CLI (oc) in UBI9 official repositories, making manual binary download with checksum verification the recommended approach.
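As a hedged sketch of what checksum verification of such a downloaded binary can look like (file name and digest are placeholders, not the repository's actual download steps):

```python
import hashlib
from pathlib import Path

# Verify a downloaded archive (for example the oc client tarball) against a
# published sha256 digest before unpacking it. File names here are placeholders.
def verify_sha256(archive: str, expected_digest: str) -> None:
    digest = hashlib.sha256(Path(archive).read_bytes()).hexdigest()
    if digest != expected_digest:
        raise RuntimeError(f"checksum mismatch for {archive}: got {digest}")

# verify_sha256("openshift-client-linux.tar.gz", "<digest from sha256sum.txt>")
```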
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:35:34.805Z
Learning: In the opendatahub-io/notebooks repository, mounting emptyDir volumes over /opt/app-root/src is intentional behavior that matches production deployment patterns where odh-dashboard mounts empty PVCs at this location (the $HOME directory). This mounting is expected to hide base image content.
runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu (16)
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.
Learnt from: atheo89
PR: opendatahub-io/notebooks#1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:32-32
Timestamp: 2025-07-07T11:08:48.524Z
Learning: atheo89 requested GitHub issue creation for multi-architecture Dockerfile improvements during PR #1258 review, specifically for enhancing structural consistency across Docker stages, replacing $(uname -m) with ${TARGETARCH} for cross-architecture builds, and adding OCI-compliant metadata labels. Issue #1332 was created with comprehensive problem description, phased implementation approach, detailed acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:4-4
Timestamp: 2025-07-04T05:49:10.314Z
Learning: jiridanek directs base image pinning security concerns to existing comprehensive issue #1242 "Improve Docker FROM image versioning by avoiding :latest tags" rather than addressing them in individual PRs, continuing the established pattern of systematic security and quality tracking in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1247
File: .github/workflows/build-notebooks-TEMPLATE.yaml:50-53
Timestamp: 2025-07-01T14:36:52.852Z
Learning: In the opendatahub-io/notebooks repository, the test runner's Python version (configured in GitHub Actions UV setup) intentionally doesn't need to match the Python version of the container images being tested. jiridanek's team uses Python 3.12 for running tests while images may use different Python versions (like 3.11), and this approach works fine since the test code is separate from the application code running inside the containers.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:135-136
Timestamp: 2025-07-04T05:52:49.464Z
Learning: jiridanek requested GitHub issue creation for improving fragile sed-based Jupyter kernel display_name modification in jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu during PR #1306 review. Issue #1321 was created with comprehensive problem description covering JSON corruption risks, greedy regex patterns, maintenance burden, and proposed Python-based JSON parsing solution with detailed acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.
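A minimal sketch of the Python-based JSON approach proposed there (the kernelspec path and display name are assumed for illustration; this is not the merged implementation):

```python
import json
from pathlib import Path

# Rewrite the kernelspec display_name by editing kernel.json as structured JSON
# rather than regex-replacing it with sed. Path and new name are assumptions.
kernel_json = Path("/opt/app-root/share/jupyter/kernels/python3/kernel.json")
spec = json.loads(kernel_json.read_text())
spec["display_name"] = "Python 3.12"
kernel_json.write_text(json.dumps(spec, indent=1) + "\n")
```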
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:40-42
Timestamp: 2025-07-04T17:08:02.399Z
Learning: In the opendatahub-io/notebooks repository, when using multi-architecture Dockerfiles with BuildKit, the implicit build argument TARGETARCH is automatically available in the global scope for FROM instructions without explicit declaration. However, if TARGETARCH is used within a build stage, it must be declared explicitly within that stage. The current placement pattern (declaring ARG TARGETARCH after FROM instructions that use it) is correct for modern Docker/Podman/Buildah environments and does not require compatibility with older Docker versions.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: jupyter/minimal/ubi9-python-3.11/Dockerfile.cuda:29-38
Timestamp: 2025-07-04T17:07:52.656Z
Learning: In the opendatahub-io/notebooks repository, modern Docker with BuildKit automatically provides build arguments like TARGETARCH in the global scope for FROM instructions, but these arguments must be explicitly declared with ARG statements inside build stages where they will be used. The ARG declaration should be placed within the stage that uses it, not moved to the global scope, as this is the correct pattern for modern Docker/Podman/Buildah environments.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm:34-37
Timestamp: 2025-07-02T18:19:23.024Z
Learning: In the opendatahub-io/notebooks repository, issue #1241 "Security: Add checksum verification for downloaded binaries in Python 3.12 images" covers the security concern about verifying downloaded binaries including the oc client. Red Hat does not provide signed RPM packages for the OpenShift CLI (oc) in UBI9 official repositories, making manual binary download with checksum verification the recommended approach.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:35:34.805Z
Learning: In the opendatahub-io/notebooks repository, mounting emptyDir volumes over /opt/app-root/src is intentional behavior that matches production deployment patterns where odh-dashboard mounts empty PVCs at this location (the $HOME directory). This mounting is expected to hide base image content.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: OpenShift CI infrastructure issues in opendatahub-io/notebooks can manifest as "ContainerFailed one or more containers exited" errors in release steps, or as "Entrypoint received interrupt: terminated" messages when pods are killed during CI runs. These are typically infrastructure-level issues rather than code problems.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1154
File: manifests/base/jupyter-pytorch-notebook-imagestream.yaml:0-0
Timestamp: 2025-06-16T11:06:33.139Z
Learning: In the opendatahub-io/notebooks repository, N-1 versions of images in manifest files (like imagestream.yaml files) should not be updated regularly. The versions of packages like codeflare-sdk in N-1 images are frozen to what was released when the image was moved from N to N-1 version. N-1 images are only updated for security vulnerabilities of packages, not for regular version bumps. This is why the version of packages in N-1 images may be quite old compared to the latest N version.
runtimes/datascience/ubi9-python-3.12/kustomize/base/pod.yaml (6)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T07:11:44.903Z
Learning: Resource limits in StatefulSet manifests in opendatahub-io/notebooks are configured for testing purposes, not production deployments. This affects risk assessment when reviewing resource configurations like memory and CPU limits.
runtimes/minimal/ubi9-python-3.12/kustomize/base/kustomization.yaml (3)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
runtimes/minimal/ubi9-python-3.12/requirements.txt (4)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1123
File: runtimes/tensorflow/ubi9-python-3.11/Pipfile:56-56
Timestamp: 2025-06-26T11:39:13.498Z
Learning: pip installing packages from PyPI typically downloads pre-built wheels that just get extracted - setuptools is not involved in this process. setuptools is primarily used for building packages from source (setup.py, pyproject.toml), creating entry points, and package discovery during build time, not for installing pre-built wheels.
runtimes/minimal/ubi9-python-3.12/Dockerfile.cpu (15)
Learnt from: atheo89
PR: opendatahub-io/notebooks#1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:32-32
Timestamp: 2025-07-07T11:08:48.524Z
Learning: atheo89 requested GitHub issue creation for multi-architecture Dockerfile improvements during PR #1258 review, specifically for enhancing structural consistency across Docker stages, replacing $(uname -m) with ${TARGETARCH} for cross-architecture builds, and adding OCI-compliant metadata labels. Issue #1332 was created with comprehensive problem description, phased implementation approach, detailed acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:135-136
Timestamp: 2025-07-04T05:52:49.464Z
Learning: jiridanek requested GitHub issue creation for improving fragile sed-based Jupyter kernel display_name modification in jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu during PR #1306 review. Issue #1321 was created with comprehensive problem description covering JSON corruption risks, greedy regex patterns, maintenance burden, and proposed Python-based JSON parsing solution with detailed acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1247
File: .github/workflows/build-notebooks-TEMPLATE.yaml:50-53
Timestamp: 2025-07-01T14:36:52.852Z
Learning: In the opendatahub-io/notebooks repository, the test runner's Python version (configured in GitHub Actions UV setup) intentionally doesn't need to match the Python version of the container images being tested. jiridanek's team uses Python 3.12 for running tests while images may use different Python versions (like 3.11), and this approach works fine since the test code is separate from the application code running inside the containers.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. A comprehensive issue was created covering race conditions, failure detection, process lifecycle coupling, and signal handling with detailed problem descriptions, multiple solution options, phased acceptance criteria, testing approach, and proper context linking, following the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:4-4
Timestamp: 2025-07-04T05:49:10.314Z
Learning: jiridanek directs base image pinning security concerns to existing comprehensive issue #1242 "Improve Docker FROM image versioning by avoiding :latest tags" rather than addressing them in individual PRs, continuing the established pattern of systematic security and quality tracking in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. Issue #1298 was successfully created with comprehensive problem description covering race conditions, failure detection, orphaned processes, and signal handling, along with multiple solution options, phased acceptance criteria, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm:34-37
Timestamp: 2025-07-02T18:19:23.024Z
Learning: In the opendatahub-io/notebooks repository, issue #1241 "Security: Add checksum verification for downloaded binaries in Python 3.12 images" covers the security concern about verifying downloaded binaries including the oc client. Red Hat does not provide signed RPM packages for the OpenShift CLI (oc) in UBI9 official repositories, making manual binary download with checksum verification the recommended approach.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:35:34.805Z
Learning: In the opendatahub-io/notebooks repository, mounting emptyDir volumes over /opt/app-root/src is intentional behavior that matches production deployment patterns where odh-dashboard mounts empty PVCs at this location (the $HOME directory). This mounting is expected to hide base image content.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: OpenShift CI infrastructure issues in opendatahub-io/notebooks can manifest as "ContainerFailed one or more containers exited" errors in release steps, or as "Entrypoint received interrupt: terminated" messages when pods are killed during CI runs. These are typically infrastructure-level issues rather than code problems.
runtimes/pytorch/ubi9-python-3.12/utils/bootstrapper.py (1)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
runtimes/tensorflow/ubi9-python-3.12/utils/bootstrapper.py (1)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
runtimes/rocm-tensorflow/ubi9-python-3.12/utils/bootstrapper.py (2)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
runtimes/pytorch/ubi9-python-3.12/Pipfile (10)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow with runtime ROCm configuration is the recommended alternative approach.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1247
File: .github/workflows/build-notebooks-TEMPLATE.yaml:50-53
Timestamp: 2025-07-01T14:36:52.852Z
Learning: In the opendatahub-io/notebooks repository, the test runner's Python version (configured in GitHub Actions UV setup) intentionally doesn't need to match the Python version of the container images being tested. jiridanek's team uses Python 3.12 for running tests while images may use different Python versions (like 3.11), and this approach works fine since the test code is separate from the application code running inside the containers.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/datascience/ubi9-python-3.11/Pipfile:34-36
Timestamp: 2025-06-28T14:13:27.890Z
Learning: In the opendatahub-io/notebooks repository, the dependency pinning strategy follows a deliberate pattern: core `jupyterlab` package uses exact pinning (==) across all notebook images to ensure UI consistency, while JupyterLab extensions and all server components (jupyter-server, jupyter-server-proxy, jupyter-server-terminals) use compatible release (~=) pinning to allow automatic security updates and bug fixes while maintaining API compatibility.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI 0.6.1 (latest version as of June 2025) has a hard dependency constraint on jupyter-bokeh~=3.0.5, preventing upgrades to jupyter-bokeh 4.x in notebook images that include TrustyAI. This requires either waiting for TrustyAI to update their dependency or excluding TrustyAI from jupyter-bokeh upgrades.
runtimes/datascience/ubi9-python-3.12/utils/bootstrapper.py (1)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py (2)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
🧬 Code Graph Analysis (3)
runtimes/pytorch/ubi9-python-3.12/utils/bootstrapper.py (3)
runtimes/tensorflow/ubi9-python-3.12/utils/bootstrapper.py (24)
FileOpBase (54-355)
get_instance (62-72)
NotebookFileOp (358-465)
PythonFileOp (468-512)
RFileOp (515-558)
convert_param_str_to_dict (338-348)
process_dependencies (114-137)
OpUtil (561-741)
log_operation_info (725-741)
get_file_from_object_storage (287-299)
process_outputs (139-155)
process_output_file (324-336)
process_metrics_and_metadata (157-277)
get_object_storage_filename (279-285)
put_file_to_object_storage (301-318)
has_wildcard (320-322)
set_parameters_in_env (350-355)
find_best_kernel (422-465)
convert_notebook_to_html (398-419)
package_install (565-615)
determine_elyra_requirements (618-626)
package_list_to_dict (629-653)
parse_arguments (656-722)
main (744-765)
ci/cached-builds/make_test.py (1)
call (123-124)
tests/containers/socket_proxy.py (1)
run (156-168)
runtimes/minimal/ubi9-python-3.12/utils/bootstrapper.py (1)
runtimes/tensorflow/ubi9-python-3.12/utils/bootstrapper.py (7)
FileOpBase (54-355)
get_instance (62-72)
NotebookFileOp (358-465)
PythonFileOp (468-512)
RFileOp (515-558)
OpUtil (561-741)
main (744-765)
runtimes/datascience/ubi9-python-3.12/utils/bootstrapper.py (3)
runtimes/tensorflow/ubi9-python-3.12/utils/bootstrapper.py (20)
FileOpBase (54-355)
get_instance (62-72)
NotebookFileOp (358-465)
PythonFileOp (468-512)
RFileOp (515-558)
convert_param_str_to_dict (338-348)
process_dependencies (114-137)
OpUtil (561-741)
log_operation_info (725-741)
get_file_from_object_storage (287-299)
process_outputs (139-155)
process_output_file (324-336)
process_metrics_and_metadata (157-277)
put_file_to_object_storage (301-318)
set_parameters_in_env (350-355)
find_best_kernel (422-465)
convert_notebook_to_html (398-419)
package_install (565-615)
parse_arguments (656-722)
main (744-765)
ci/cached-builds/make_test.py (1)
call (123-124)
tests/containers/socket_proxy.py (1)
run (156-168)
🪛 LanguageTool
runtimes/rocm-pytorch/ubi9-python-3.12/utils/requirements-elyra.txt
[grammar] ~6-~6: Consider using either the past participle “needed” or the present participle “needing” here.
Context: ...kfp.py#L1099, # in case the details are need please follow the comprehensive list of...
(BEEN_PART_AGREEMENT)
runtimes/tensorflow/ubi9-python-3.12/utils/requirements-elyra.txt
[grammar] ~6-~6: Consider using either the past participle “needed” or the present participle “needing” here.
Context: ...kfp.py#L1099, # in case the details are need please follow the comprehensive list of...
(BEEN_PART_AGREEMENT)
runtimes/datascience/ubi9-python-3.12/utils/requirements-elyra.txt
[grammar] ~6-~6: Consider using either the past participle “needed” or the present participle “needing” here.
Context: ...kfp.py#L1099, # in case the details are need please follow the comprehensive list of...
(BEEN_PART_AGREEMENT)
runtimes/pytorch/ubi9-python-3.12/utils/requirements-elyra.txt
[grammar] ~6-~6: Consider using either the past participle “needed” or the present participle “needing” here.
Context: ...kfp.py#L1099, # in case the details are need please follow the comprehensive list of...
(BEEN_PART_AGREEMENT)
runtimes/minimal/ubi9-python-3.12/utils/requirements-elyra.txt
[grammar] ~6-~6: Consider using either the past participle “needed” or the present participle “needing” here.
Context: ...kfp.py#L1099, # in case the details are need please follow the comprehensive list of...
(BEEN_PART_AGREEMENT)
runtimes/rocm-tensorflow/ubi9-python-3.12/utils/requirements-elyra.txt
[grammar] ~6-~6: Consider using either the past participle “needed” or the present participle “needing” here.
Context: ...kfp.py#L1099, # in case the details are need please follow the comprehensive list of...
(BEEN_PART_AGREEMENT)
🪛 Checkov (3.2.334)
runtimes/rocm-pytorch/ubi9-python-3.12/kustomize/base/pod.yaml
[MEDIUM] 2-22: Containers should not run with allowPrivilegeEscalation
(CKV_K8S_20)
[MEDIUM] 2-22: Minimize the admission of root containers
(CKV_K8S_23)
runtimes/minimal/ubi9-python-3.12/kustomize/base/pod.yaml
[MEDIUM] 2-22: Containers should not run with allowPrivilegeEscalation
(CKV_K8S_20)
[MEDIUM] 2-22: Minimize the admission of root containers
(CKV_K8S_23)
runtimes/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml
[MEDIUM] 2-20: Containers should not run with allowPrivilegeEscalation
(CKV_K8S_20)
[MEDIUM] 2-20: Minimize the admission of root containers
(CKV_K8S_23)
runtimes/pytorch/ubi9-python-3.12/kustomize/base/pod.yaml
[MEDIUM] 2-22: Containers should not run with allowPrivilegeEscalation
(CKV_K8S_20)
[MEDIUM] 2-22: Minimize the admission of root containers
(CKV_K8S_23)
runtimes/rocm-tensorflow/ubi9-python-3.12/kustomize/base/pod.yaml
[MEDIUM] 2-22: Containers should not run with allowPrivilegeEscalation
(CKV_K8S_20)
[MEDIUM] 2-22: Minimize the admission of root containers
(CKV_K8S_23)
runtimes/tensorflow/ubi9-python-3.12/kustomize/base/pod.yaml
[MEDIUM] 2-22: Containers should not run with allowPrivilegeEscalation
(CKV_K8S_20)
[MEDIUM] 2-22: Minimize the admission of root containers
(CKV_K8S_23)
runtimes/pytorch/ubi9-python-3.12/kustomize/overlays/accelerator/cuda/pod-patch.yaml
[MEDIUM] 2-13: Containers should not run with allowPrivilegeEscalation
(CKV_K8S_20)
[MEDIUM] 2-13: Minimize the admission of root containers
(CKV_K8S_23)
runtimes/datascience/ubi9-python-3.12/kustomize/base/pod.yaml
[MEDIUM] 2-22: Containers should not run with allowPrivilegeEscalation
(CKV_K8S_20)
[MEDIUM] 2-22: Minimize the admission of root containers
(CKV_K8S_23)
🪛 Gitleaks (8.26.0)
runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda
45-45: Detected a Generic API Key, potentially exposing access to various services and sensitive operations.
(generic-api-key)
🪛 Ruff (0.11.9)
runtimes/pytorch/ubi9-python-3.12/utils/bootstrapper.py
29-29: typing.Dict is deprecated, use dict instead (UP035)
31-31: typing.Type is deprecated, use type instead (UP035)
62-62: Use type instead of Type for type annotation; replace with type (UP006)
76-76: import should be at the top-level of a file (PLC0415)
77-77: import should be at the top-level of a file (PLC0415)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
301-301: Use X | None for type annotations; convert to X | None (UP045)
338-338: Use X | None for type annotations; convert to X | None (UP045)
338-338: Use dict instead of Dict for type annotation; replace with dict (UP006)
378-378: import should be at the top-level of a file (PLC0415)
405-405: import should be at the top-level of a file (PLC0415)
406-406: import should be at the top-level of a file (PLC0415)
412-412: Unpacked variable resources is never used; prefix it with an underscore or any other dummy variable pattern (RUF059)
432-432: import should be at the top-level of a file (PLC0415)
433-433: import should be at the top-level of a file (PLC0415)
448-448: Loop control variable name not used within loop body; rename unused name to _name (B007)
448-448: When using only the values of a dict, use the values() method; replace .items() with .values() (PERF102)
474-474: Local variable python_script_name is assigned to but never used; remove assignment to unused variable python_script_name (F841)
493-493: Single quotes found but double quotes preferred; replace single quotes with double quotes (Q000)
521-521: Local variable r_script_name is assigned to but never used; remove assignment to unused variable r_script_name (F841)
538-538: Single quotes found but double quotes preferred; replace single quotes with double quotes (Q000)
561-561: Class OpUtil inherits from object; remove object inheritance (UP004)
608-608: Consider iterable unpacking instead of concatenation; replace with iterable unpacking (RUF005)
613-613: subprocess.run without explicit check argument; add explicit check=False (PLW1510)
640-640: Call startswith once with a tuple; merge into a single startswith call (PIE810)
657-657: import should be at the top-level of a file (PLC0415)
659-659: Using the global statement to update pipeline_name is discouraged (PLW0603)
659-659: Using the global statement to update operation_name is discouraged (PLW0603)
725-725: Use X | None for type annotations; convert to X | None (UP045)
738-738: Using global for pipeline_name but no assignment is done (PLW0602)
738-738: Using global for operation_name but no assignment is done (PLW0602)
769-769: No newline at end of file; add trailing newline (W292)
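Several of these rules recur across the vendored bootstrapper copies; a condensed before/after sketch of a few of them (UP035/UP006, UP045, PLW1510), using illustrative snippets rather than lines from bootstrapper.py:

```python
import subprocess

# UP035 / UP006: prefer builtin generics over typing.Dict / typing.Type.
# before: from typing import Dict, Type; ops: Dict[str, Type[object]]
def register(ops: dict[str, type]) -> None:
    ...

# UP045: prefer `X | None` over `Optional[X]`.
# before: from typing import Optional; def find(name: Optional[str]) -> Optional[str]
def find(name: str | None) -> str | None:
    return name

# PLW1510: pass an explicit check= so a failing subprocess is not silently ignored.
subprocess.run(["python3", "-m", "pip", "--version"], check=True)
```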
runtimes/tensorflow/ubi9-python-3.12/utils/bootstrapper.py
29-29: typing.Dict is deprecated, use dict instead (UP035)
31-31: typing.Type is deprecated, use type instead (UP035)
62-62: Use type instead of Type for type annotation; replace with type (UP006)
76-76: import should be at the top-level of a file (PLC0415)
77-77: import should be at the top-level of a file (PLC0415)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
301-301: Use X | None for type annotations; convert to X | None (UP045)
338-338: Use X | None for type annotations; convert to X | None (UP045)
338-338: Use dict instead of Dict for type annotation; replace with dict (UP006)
378-378: import should be at the top-level of a file (PLC0415)
405-405: import should be at the top-level of a file (PLC0415)
406-406: import should be at the top-level of a file (PLC0415)
412-412: Unpacked variable resources is never used; prefix it with an underscore or any other dummy variable pattern (RUF059)
432-432: import should be at the top-level of a file (PLC0415)
433-433: import should be at the top-level of a file (PLC0415)
448-448: Loop control variable name not used within loop body; rename unused name to _name (B007)
448-448: When using only the values of a dict, use the values() method; replace .items() with .values() (PERF102)
474-474: Local variable python_script_name is assigned to but never used; remove assignment to unused variable python_script_name (F841)
493-493: Single quotes found but double quotes preferred; replace single quotes with double quotes (Q000)
521-521: Local variable r_script_name is assigned to but never used; remove assignment to unused variable r_script_name (F841)
538-538: Single quotes found but double quotes preferred; replace single quotes with double quotes (Q000)
561-561: Class OpUtil inherits from object; remove object inheritance (UP004)
608-608: Consider iterable unpacking instead of concatenation; replace with iterable unpacking (RUF005)
613-613: subprocess.run without explicit check argument; add explicit check=False (PLW1510)
640-640: Call startswith once with a tuple; merge into a single startswith call (PIE810)
657-657: import should be at the top-level of a file (PLC0415)
659-659: Using the global statement to update pipeline_name is discouraged (PLW0603)
659-659: Using the global statement to update operation_name is discouraged (PLW0603)
725-725: Use X | None for type annotations; convert to X | None (UP045)
738-738: Using global for pipeline_name but no assignment is done (PLW0602)
738-738: Using global for operation_name but no assignment is done (PLW0602)
769-769: No newline at end of file; add trailing newline (W292)
runtimes/rocm-tensorflow/ubi9-python-3.12/utils/bootstrapper.py
29-29: typing.Dict is deprecated, use dict instead (UP035)
31-31: typing.Type is deprecated, use type instead (UP035)
62-62: Use type instead of Type for type annotation; replace with type (UP006)
76-76: import should be at the top-level of a file (PLC0415)
77-77: import should be at the top-level of a file (PLC0415)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
301-301: Use X | None for type annotations; convert to X | None (UP045)
338-338: Use X | None for type annotations; convert to X | None (UP045)
338-338: Use dict instead of Dict for type annotation; replace with dict (UP006)
378-378: import should be at the top-level of a file (PLC0415)
405-405: import should be at the top-level of a file (PLC0415)
406-406: import should be at the top-level of a file (PLC0415)
412-412: Unpacked variable resources is never used; prefix it with an underscore or any other dummy variable pattern (RUF059)
432-432: import should be at the top-level of a file (PLC0415)
433-433: import should be at the top-level of a file (PLC0415)
448-448: Loop control variable name not used within loop body; rename unused name to _name (B007)
448-448: When using only the values of a dict, use the values() method; replace .items() with .values() (PERF102)
474-474: Local variable python_script_name is assigned to but never used; remove assignment to unused variable python_script_name (F841)
493-493: Single quotes found but double quotes preferred; replace single quotes with double quotes (Q000)
521-521: Local variable r_script_name is assigned to but never used; remove assignment to unused variable r_script_name (F841)
538-538: Single quotes found but double quotes preferred; replace single quotes with double quotes (Q000)
561-561: Class OpUtil inherits from object; remove object inheritance (UP004)
608-608: Consider iterable unpacking instead of concatenation; replace with iterable unpacking (RUF005)
613-613: subprocess.run without explicit check argument; add explicit check=False (PLW1510)
640-640: Call startswith once with a tuple; merge into a single startswith call (PIE810)
657-657: import should be at the top-level of a file (PLC0415)
659-659: Using the global statement to update pipeline_name is discouraged (PLW0603)
659-659: Using the global statement to update operation_name is discouraged (PLW0603)
725-725: Use X | None for type annotations; convert to X | None (UP045)
738-738: Using global for pipeline_name but no assignment is done (PLW0602)
738-738: Using global for operation_name but no assignment is done (PLW0602)
769-769: No newline at end of file; add trailing newline (W292)
runtimes/minimal/ubi9-python-3.12/utils/bootstrapper.py
29-29: typing.Dict is deprecated, use dict instead (UP035)
31-31: typing.Type is deprecated, use type instead (UP035)
62-62: Use type instead of Type for type annotation (UP006)
76-76: import should be at the top-level of a file (PLC0415)
77-77: import should be at the top-level of a file (PLC0415)
228-228: Use explicit conversion flag (RUF010)
228-228: Use explicit conversion flag (RUF010)
232-232: Use explicit conversion flag (RUF010)
232-232: Use explicit conversion flag (RUF010)
301-301: Use X | None for type annotations (UP045)
338-338: Use X | None for type annotations (UP045)
338-338: Use dict instead of Dict for type annotation (UP006)
378-378: import should be at the top-level of a file (PLC0415)
405-405: import should be at the top-level of a file (PLC0415)
406-406: import should be at the top-level of a file (PLC0415)
412-412: Unpacked variable resources is never used; prefix it with an underscore or another dummy variable pattern (RUF059)
432-432: import should be at the top-level of a file (PLC0415)
433-433: import should be at the top-level of a file (PLC0415)
448-448: Loop control variable name not used within loop body; rename unused name to _name (B007)
448-448: When using only the values of a dict, use the values() method; replace .items() with .values() (PERF102)
474-474: Local variable python_script_name is assigned to but never used; remove the assignment (F841)
493-493: Single quotes found but double quotes preferred (Q000)
521-521: Local variable r_script_name is assigned to but never used; remove the assignment (F841)
538-538: Single quotes found but double quotes preferred (Q000)
561-561: Class OpUtil inherits from object; remove the object inheritance (UP004)
608-608: Consider iterable unpacking instead of concatenation (RUF005)
613-613: subprocess.run without explicit check argument; add explicit check=False (PLW1510)
640-640: Call startswith once with a tuple; merge into a single startswith call (PIE810)
657-657: import should be at the top-level of a file (PLC0415)
659-659: Using the global statement to update pipeline_name is discouraged (PLW0603)
659-659: Using the global statement to update operation_name is discouraged (PLW0603)
725-725: Use X | None for type annotations (UP045)
738-738: Using global for pipeline_name but no assignment is done (PLW0602)
738-738: Using global for operation_name but no assignment is done (PLW0602)
769-769: No newline at end of file; add trailing newline (W292)
runtimes/datascience/ubi9-python-3.12/utils/bootstrapper.py
29-29: typing.Dict is deprecated, use dict instead (UP035)
31-31: typing.Type is deprecated, use type instead (UP035)
62-62: Use type instead of Type for type annotation (UP006)
76-76: import should be at the top-level of a file (PLC0415)
77-77: import should be at the top-level of a file (PLC0415)
228-228: Use explicit conversion flag (RUF010)
228-228: Use explicit conversion flag (RUF010)
232-232: Use explicit conversion flag (RUF010)
232-232: Use explicit conversion flag (RUF010)
301-301: Use X | None for type annotations (UP045)
338-338: Use X | None for type annotations (UP045)
338-338: Use dict instead of Dict for type annotation (UP006)
378-378: import should be at the top-level of a file (PLC0415)
405-405: import should be at the top-level of a file (PLC0415)
406-406: import should be at the top-level of a file (PLC0415)
412-412: Unpacked variable resources is never used; prefix it with an underscore or another dummy variable pattern (RUF059)
432-432: import should be at the top-level of a file (PLC0415)
433-433: import should be at the top-level of a file (PLC0415)
448-448: Loop control variable name not used within loop body; rename unused name to _name (B007)
448-448: When using only the values of a dict, use the values() method; replace .items() with .values() (PERF102)
474-474: Local variable python_script_name is assigned to but never used; remove the assignment (F841)
493-493: Single quotes found but double quotes preferred (Q000)
521-521: Local variable r_script_name is assigned to but never used; remove the assignment (F841)
538-538: Single quotes found but double quotes preferred (Q000)
561-561: Class OpUtil inherits from object; remove the object inheritance (UP004)
608-608: Consider iterable unpacking instead of concatenation (RUF005)
613-613: subprocess.run without explicit check argument; add explicit check=False (PLW1510)
640-640: Call startswith once with a tuple; merge into a single startswith call (PIE810)
657-657: import should be at the top-level of a file (PLC0415)
659-659: Using the global statement to update pipeline_name is discouraged (PLW0603)
659-659: Using the global statement to update operation_name is discouraged (PLW0603)
725-725: Use X | None for type annotations (UP045)
738-738: Using global for pipeline_name but no assignment is done (PLW0602)
738-738: Using global for operation_name but no assignment is done (PLW0602)
769-769: No newline at end of file; add trailing newline (W292)
runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py
29-29: typing.Dict is deprecated, use dict instead (UP035)
31-31: typing.Type is deprecated, use type instead (UP035)
62-62: Use type instead of Type for type annotation (UP006)
76-76: import should be at the top-level of a file (PLC0415)
77-77: import should be at the top-level of a file (PLC0415)
228-228: Use explicit conversion flag (RUF010)
228-228: Use explicit conversion flag (RUF010)
232-232: Use explicit conversion flag (RUF010)
232-232: Use explicit conversion flag (RUF010)
301-301: Use X | None for type annotations (UP045)
338-338: Use X | None for type annotations (UP045)
338-338: Use dict instead of Dict for type annotation (UP006)
378-378: import should be at the top-level of a file (PLC0415)
405-405: import should be at the top-level of a file (PLC0415)
406-406: import should be at the top-level of a file (PLC0415)
412-412: Unpacked variable resources is never used; prefix it with an underscore or another dummy variable pattern (RUF059)
432-432: import should be at the top-level of a file (PLC0415)
433-433: import should be at the top-level of a file (PLC0415)
448-448: Loop control variable name not used within loop body; rename unused name to _name (B007)
448-448: When using only the values of a dict, use the values() method; replace .items() with .values() (PERF102)
474-474: Local variable python_script_name is assigned to but never used; remove the assignment (F841)
493-493: Single quotes found but double quotes preferred (Q000)
521-521: Local variable r_script_name is assigned to but never used; remove the assignment (F841)
538-538: Single quotes found but double quotes preferred (Q000)
561-561: Class OpUtil inherits from object; remove the object inheritance (UP004)
608-608: Consider iterable unpacking instead of concatenation (RUF005)
613-613: subprocess.run without explicit check argument; add explicit check=False (PLW1510)
640-640: Call startswith once with a tuple; merge into a single startswith call (PIE810)
657-657: import should be at the top-level of a file (PLC0415)
659-659: Using the global statement to update pipeline_name is discouraged (PLW0603)
659-659: Using the global statement to update operation_name is discouraged (PLW0603)
725-725: Use X | None for type annotations (UP045)
738-738: Using global for pipeline_name but no assignment is done (PLW0602)
738-738: Using global for operation_name but no assignment is done (PLW0602)
769-769: No newline at end of file; add trailing newline (W292)
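Most of the findings above are mechanical modernizations rather than behavioural changes. As a rough illustration only (this is not the bootstrapper source; the class members shown here are invented for the example), the recurring patterns look roughly like this once fixed:

```python
# Illustrative sketch of the Ruff fixes requested above (UP004, UP006/UP035,
# UP045, PLW1510, PIE810, PERF102). Not the actual bootstrapper code.
import subprocess
import sys


class OpUtil:  # UP004: no explicit `object` base class needed
    @staticmethod
    def package_list_to_dict(packages: list[str]) -> dict[str, str | None]:
        # UP006/UP035/UP045: builtin generics and `X | None` instead of
        # typing.Dict / typing.Optional
        result: dict[str, str | None] = {}
        for entry in packages:
            name, _, pinned = entry.partition("==")
            result[name] = pinned or None
        return result

    @staticmethod
    def log_installed_packages() -> None:
        # PLW1510: make the "don't raise on non-zero exit" behaviour explicit
        subprocess.run([sys.executable, "-m", "pip", "freeze"], check=False)

    @staticmethod
    def is_url(reference: str) -> bool:
        # PIE810: a single startswith() call with a tuple instead of two calls
        return reference.startswith(("http://", "https://"))

    @staticmethod
    def pinned_versions(packages: dict[str, str | None]) -> list[str]:
        # PERF102: iterate .values() when the keys are unused (also avoids B007)
        return [v for v in packages.values() if v is not None]


if __name__ == "__main__":
    pkgs = OpUtil.package_list_to_dict(["requests==2.32.4", "urllib3"])
    print(pkgs, OpUtil.pinned_versions(pkgs), OpUtil.is_url("https://example.com"))
```

Since the same bootstrapper.py is vendored into every runtime image, a single cleanup applied to the shared template would clear the findings for all six copies at once.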
🔇 Additional comments (13)
runtimes/pytorch/ubi9-python-3.12/kustomize/components/accelerator/kustomization.yaml (1)
2-3: apiVersion still in v1alpha1 – confirm compatibility with the kustomize version used in CI
kustomize.config.k8s.io/v1alpha1 for Components was deprecated in kustomize v5 in favour of v1beta1. Builds that vendor a newer binary will error out.
runtimes/pytorch/ubi9-python-3.12/kustomize/overlays/accelerator/cuda/kustomization.yaml (1)
5-6: Ensure your build pipeline supports the alpha components feature
The components field is still flagged "alpha" in upstream kustomize. Older clusters or CI images (e.g. oc kustomize) ignore it, resulting in a silently incomplete manifest. Confirm that the CI image & OpenShift builders are ≥ v4.5 with --enable_alpha_plugins support.
runtimes/pytorch/ubi9-python-3.12/kustomize/base/kustomization.yaml (1)
8-10: LGTM – Image substitution looks correct
runtime-workbench will be rewritten to quay.io/opendatahub/workbench-images:runtime-pytorch-ubi9-python-3.12, which matches the project's tagging convention.
runtimes/datascience/ubi9-python-3.12/kustomize/base/kustomization.yaml (1)
1-10: Looks good – file is minimal and consistent with existing patterns
No issues found.
runtimes/pytorch/ubi9-python-3.12/kustomize/base/pod.yaml (1)
12-13: Placeholder command requires follow-up
An endless date loop is fine for CI smoke tests but unsuitable for production. Confirm it's only referenced by test overlays and not shipped to users.
runtimes/tensorflow/ubi9-python-3.12/kustomize/base/kustomization.yaml (1)
1-10: Consistent with other kustomizations – no action needed
File matches established structure; image override looks correct.
runtimes/minimal/ubi9-python-3.12/kustomize/base/kustomization.yaml (1)
1-10: Consistent and minimal – LGTM
No issues detected.
runtimes/pytorch/ubi9-python-3.12/Pipfile (1)
14-18: Verify wheel availability for Torch 2.6 + CUDA 12.6 on cp312 before pinning
torch==2.6.0+cu126 & torchvision==0.21.0+cu126 are future releases. At the time of writing there are no official cp312 wheels on either PyPI or the PyTorch nightly index for that combination. The build will die inside the container. Consider:
-torch = {version = "==2.6.0+cu126", index = "pytorch"}
-torchvision = {version = "==0.21.0+cu126", index = "pytorch"}
+# Prefer the latest released series that ships cp312 wheels
+torch = {version = "==2.3.1+cu124", index = "pytorch"}
+torchvision = {version = "==0.18.1+cu124", index = "pytorch"}
(or drop the strict pin and let the Dockerfile install the matching pytorch extra).
runtimes/rocm-pytorch/ubi9-python-3.12/Pipfile (2)
17-20: Confirm wheel availability for torch==2.6.0+rocm6.2.4
PyTorch ROCm wheels occasionally lag the source release; verify that cp312-*-linux_x86_64.whl exists in the rocm6.2.4 index for all architectures you plan to multi-arch build (amd64 + arm64).
55-56: Older requests patch level
See comment in requirements.txt; align to 2.32.4 or later for the CVE-2024-37245 fix.
runtimes/minimal/ubi9-python-3.12/Dockerfile.cpu (1)
14-19: LGTM! Architecture-specific package handling for s390x.
The conditional installation of development tools for the s390x architecture is appropriate.
runtimes/rocm-pytorch/ubi9-python-3.12/Dockerfile.rocm (1)
84-91: LGTM! De-vendoring ROCm libraries for PyTorch.
The de-vendor-torch.sh script properly replaces PyTorch's vendored ROCm libraries with system links, ensuring compatibility with the installed ROCm runtime.
runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda (1)
45-47: LGTM! Proper GPG key verification.
The SHA256 checksum verification for the NVIDIA GPG key is a good security practice. The static analysis warning about an API key is a false positive.
jupyter-client==8.6.3; python_version >= '3.8' \
    --hash=sha256:35b3a0947c4a6e9d589eb97d7d4cd5e90f910ee73101611f01283732bd6d9419 \
    --hash=sha256:e8a19cc986cc45905ac3362915f410f3af85424b4c0905e94fa5f2cb08e8f23f
jupyter-core==5.7.2; python_version >= '3.8' \
Check failure
Code scanning / Trivy
Jupyter Core on Windows Has Uncontrolled Search Path Element Local Privilege Escalation Vulnerability (High severity, library)
Installed Version: 5.7.2
Vulnerability CVE-2025-30167
Severity: HIGH
Fixed Version: 5.8.1
Link: CVE-2025-30167
we don't care too much about Windows, but it's probably a good idea to look into possible updates
Actionable comments posted: 7
♻️ Duplicate comments (16)
runtimes/rocm-pytorch/ubi9-python-3.12/kustomize/base/pod.yaml (1)
8-22: Add securityContext to prevent OpenShift deployment failures.
The Pod lacks a securityContext, which will cause failures on OpenShift due to PodSecurity policy violations. This is the same issue identified in previous reviews for similar runtime manifests.
  resources:
    limits:
      cpu: 500m
      memory: 500Mi
    requests:
      cpu: 500m
      memory: 500Mi
+ securityContext:
+   runAsNonRoot: true
+   allowPrivilegeEscalation: false
+   seccompProfile:
+     type: RuntimeDefault
+   capabilities:
+     drop: ["ALL"]
runtimes/pytorch/ubi9-python-3.12/kustomize/overlays/accelerator/cuda/pod-patch.yaml (1)
7-13: Missing securityContext, toleration, and GPU requests
Previous review already highlighted that the CUDA overlay should include
‒ a tolerations entry for nvidia.com/gpu,
‒ matching GPU requests, and
‒ a securityContext (runAsNonRoot, no privilege escalation, RuntimeDefault seccomp).
Please incorporate that patch here.
runtimes/datascience/ubi9-python-3.12/kustomize/base/pod.yaml (1)
9-22: Harden container with securityContext; avoid PSA/SCC violations
The container still runs without a securityContext, which violates the restricted Pod Security standard. Add the same securityContext block (runAsNonRoot = true, allowPrivilegeEscalation = false, seccompProfile = RuntimeDefault) previously suggested.
runtimes/minimal/ubi9-python-3.12/kustomize/base/pod.yaml (1)
8-22: Same SCC/securityContext concern as in rocm-tensorflow pod
Please apply the securityContext block suggested in the rocm-tensorflow review for parity.
runtimes/pytorch/ubi9-python-3.12/kustomize/base/pod.yaml (1)
8-22: Same SCC/securityContext concern as in rocm-tensorflow pod
Please apply the securityContext block suggested in the rocm-tensorflow review for parity.
runtimes/minimal/ubi9-python-3.12/utils/requirements-elyra.txt (1)
4-7: Mirror the grammar fix from datascience variant
Apply the same "are needed" wording update for consistency.
runtimes/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml (1)
8-20: Security context & resource requests still missing
The container spec still has no securityContext (runAsNonRoot, allowPrivilegeEscalation: false, seccompProfile) and CPU requests/limits are absent. OpenShift clusters running the restricted-v2 SCC will reject the pod, and schedulers may refuse pods without CPU requests.
The previous review already pointed this out; please address it or explain why it is being deferred.
runtimes/minimal/ubi9-python-3.12/requirements.txt (1)
782-787: six==1.17.0 still pinned – package does not exist
This exact version is not published on PyPI, so image builds will fail. The latest release is 1.16.0.
-six==1.17.0; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2'
+six==1.16.0; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2'
Don't forget to regenerate the hash pins afterwards (pip hash … or rerun the lock script).
runtimes/pytorch/ubi9-python-3.12/Dockerfile.cuda (1)
20-24: oc client extraction runs as non-root – will fail on UBI images
/opt/app-root/bin is owned by root (0755). Extracting the tarball while the effective UID is 1001 cannot create oc, leading to permission-denied or an unexecutable binary.
Move the download/extract block under USER 0, then switch back:
-USER 1001
-# Install micropipenv ...
-RUN pip install --no-cache-dir -U "micropipenv[toml]"
-
-# Install the oc client
-RUN curl -L ... && tar -xzvf ... oc && rm -f ...
+USER 1001
+# Install micropipenv
+RUN pip install --no-cache-dir -U "micropipenv[toml]"
+
+# Install the oc client (needs root for /opt/app-root/bin)
+USER 0
+RUN curl -L https://mirror.openshift.com/pub/openshift-v4/$(uname -m)/clients/ocp/stable/openshift-client-linux.tar.gz \
+    -o /tmp/oc.tar.gz && \
+    tar -xzvf /tmp/oc.tar.gz oc -C /opt/app-root/bin && \
+    chmod 0755 /opt/app-root/bin/oc && \
+    rm -f /tmp/oc.tar.gz
+USER 1001
Also consider checksum verification (tracked in issue #1241) to harden the download.
runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda (1)
38-38: Support multi-architecture builds for NVARCH
The NVARCH environment variable is hardcoded to x86_64, which restricts the image to that architecture and prevents multi-architecture builds.
runtimes/pytorch/ubi9-python-3.12/utils/bootstrapper.py (1)
619-621: Add Python 3.12 support to version check
The version check doesn't include Python 3.12, which is the target version for this runtime image. This will cause the bootstrapper to fail when determining Elyra requirements.
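For context, a minimal sketch of what extending that gate to 3.12 could look like. The actual code at lines 619-621 of bootstrapper.py is not shown here; the function name elyra_requirements_file and the SUPPORTED_PYTHON tuple are invented for illustration only:

```python
import sys

# Hypothetical sketch of the bootstrapper's Python-version gate; the real
# implementation may differ. The point is simply that 3.12 must be accepted.
SUPPORTED_PYTHON = ("3.9", "3.10", "3.11", "3.12")  # 3.12 added for these images


def elyra_requirements_file() -> str:
    current = f"{sys.version_info.major}.{sys.version_info.minor}"
    if current not in SUPPORTED_PYTHON:
        raise RuntimeError(
            f"Python {current} is not supported by this bootstrapper; "
            f"supported versions: {', '.join(SUPPORTED_PYTHON)}"
        )
    # All supported versions currently resolve to the same Elyra requirements file.
    return "requirements-elyra.txt"


if __name__ == "__main__":
    print(elyra_requirements_file())
```

The same extension applies to every bootstrapper copy flagged below, since they are identical files.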
runtimes/tensorflow/ubi9-python-3.12/utils/bootstrapper.py (1)
619-621: Add Python 3.12 to supported versions
The bootstrapper is being added for a Python 3.12 runtime but the version check doesn't include Python 3.12. This will cause the bootstrapper to fail with an error message.
runtimes/rocm-tensorflow/ubi9-python-3.12/utils/bootstrapper.py (2)
135-135: Use subprocess.run with explicit arguments
This issue was previously identified and remains unaddressed.
617-626: Critical: Add Python 3.12 support
This critical issue was previously identified and remains unaddressed. The bootstrapper will fail in the Python 3.12 runtime environment.
runtimes/datascience/ubi9-python-3.12/utils/bootstrapper.py (2)
135-135: Use subprocess.run with explicit arguments
This issue was previously identified and remains unaddressed.
617-626: Critical: Add Python 3.12 support
This critical issue was previously identified and remains unaddressed. The bootstrapper will fail in the Python 3.12 runtime environment.
🧹 Nitpick comments (12)
runtimes/rocm-pytorch/ubi9-python-3.12/utils/requirements-elyra.txt (1)
6-6: Fix grammatical error.
The phrase "are need" should be "are needed" for proper grammar.
-# in case the details are need please follow the comprehensive list of python dependencies present here:
+# in case the details are needed please follow the comprehensive list of python dependencies present here:
runtimes/tensorflow/ubi9-python-3.12/kustomize/base/pod.yaml (1)
11-11: Consider using a more descriptive placeholder image name.
The image name "runtime-workbench" could be more self-documenting. Based on similar patterns in the codebase, consider using a naming convention that clearly indicates this is a placeholder.
Would you like me to create an issue to establish consistent placeholder naming conventions across runtime manifests?
runtimes/minimal/ubi9-python-3.12/utils/bootstrapper.py (1)
76-77: Move imports to top-level for better code organization.
The minio imports are placed inside the __init__ method, which goes against Python best practices and static analysis recommendations.
+import minio
+from minio.credentials import providers
 from packaging import version

 # Inputs and Outputs separator character. If updated,
 # same-named variable in _notebook_op.py must be updated!
 INOUT_SEPARATOR = ";"
Then update the constructor:
 def __init__(self, **kwargs: Any) -> None:
     """Initializes the FileOpBase instance"""
-    import minio
-    from minio.credentials import providers
runtimes/tensorflow/ubi9-python-3.12/utils/requirements-elyra.txt (1)
4-7: Fix minor grammar in explanatory comment
-# in case the details are need please follow the comprehensive list of python dependencies present here:
+# If details are needed, please refer to the comprehensive list of Python dependencies here:
runtimes/rocm-tensorflow/ubi9-python-3.12/utils/requirements-elyra.txt (1)
4-7: Fix minor grammar in explanatory comment
-# in case the details are need please follow the comprehensive list of python dependencies present here:
+# If details are needed, please refer to the comprehensive list of Python dependencies here:
runtimes/pytorch/ubi9-python-3.12/utils/requirements-elyra.txt (1)
4-7: Fix minor grammar in explanatory comment
-# in case the details are need please follow the comprehensive list of python dependencies present here:
+# If details are needed, please refer to the comprehensive list of Python dependencies here:
runtimes/rocm-tensorflow/ubi9-python-3.12/kustomize/base/pod.yaml (1)
11-13: Replace placeholder image with kustomize-var or comment
runtime-workbench gives no clue which image is expected and has already caused confusion in other images (see Issue #1313). Either:
- Use a meaningful default tag, e.g. quay.io/opendatahub/workbench-images:runtime-rocm-tensorflow-ubi9-python-3.12, or
- Annotate that this will be overridden by kustomize edit set image ….
runtimes/datascience/ubi9-python-3.12/utils/requirements-elyra.txt (1)
4-7: Grammar nit: "are need" → "are needed"
-# in case the details are need please follow the comprehensive list of python dependencies present here:
+# in case the details are needed, please follow the comprehensive list of Python dependencies here:
runtimes/rocm-tensorflow/ubi9-python-3.12/utils/bootstrapper.py (2)
613-613: Add explicit check parameter for clarity
For consistency and best practices, explicitly specify the check parameter.
- subprocess.run([sys.executable, "-m", "pip", "freeze"])
+ subprocess.run([sys.executable, "-m", "pip", "freeze"], check=False)
769-769: Add trailing newline
Add a newline at the end of the file for consistency with Python standards.
 if __name__ == "__main__":
     main()
+
runtimes/datascience/ubi9-python-3.12/utils/bootstrapper.py (2)
613-613: Add explicit check parameter for clarity
For consistency and best practices, explicitly specify the check parameter.
- subprocess.run([sys.executable, "-m", "pip", "freeze"])
+ subprocess.run([sys.executable, "-m", "pip", "freeze"], check=False)
769-769: Add trailing newline
Add a newline at the end of the file for consistency with Python standards.
 if __name__ == "__main__":
     main()
+
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (6)
runtimes/datascience/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
runtimes/minimal/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
runtimes/pytorch/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
runtimes/rocm-pytorch/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
runtimes/rocm-tensorflow/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
runtimes/tensorflow/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
📒 Files selected for processing (49)
Makefile (2 hunks)
runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu (1 hunks)
runtimes/datascience/ubi9-python-3.12/Pipfile (1 hunks)
runtimes/datascience/ubi9-python-3.12/kustomize/base/kustomization.yaml (1 hunks)
runtimes/datascience/ubi9-python-3.12/kustomize/base/pod.yaml (1 hunks)
runtimes/datascience/ubi9-python-3.12/utils/bootstrapper.py (1 hunks)
runtimes/datascience/ubi9-python-3.12/utils/pip.conf (1 hunks)
runtimes/datascience/ubi9-python-3.12/utils/requirements-elyra.txt (1 hunks)
runtimes/minimal/ubi9-python-3.12/Dockerfile.cpu (1 hunks)
runtimes/minimal/ubi9-python-3.12/Pipfile (1 hunks)
runtimes/minimal/ubi9-python-3.12/kustomize/base/kustomization.yaml (1 hunks)
runtimes/minimal/ubi9-python-3.12/kustomize/base/pod.yaml (1 hunks)
runtimes/minimal/ubi9-python-3.12/requirements.txt (1 hunks)
runtimes/minimal/ubi9-python-3.12/utils/bootstrapper.py (1 hunks)
runtimes/minimal/ubi9-python-3.12/utils/pip.conf (1 hunks)
runtimes/minimal/ubi9-python-3.12/utils/requirements-elyra.txt (1 hunks)
runtimes/pytorch/ubi9-python-3.12/Dockerfile.cuda (1 hunks)
runtimes/pytorch/ubi9-python-3.12/Pipfile (1 hunks)
runtimes/pytorch/ubi9-python-3.12/kustomize/base/kustomization.yaml (1 hunks)
runtimes/pytorch/ubi9-python-3.12/kustomize/base/pod.yaml (1 hunks)
runtimes/pytorch/ubi9-python-3.12/kustomize/components/accelerator/kustomization.yaml (1 hunks)
runtimes/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml (1 hunks)
runtimes/pytorch/ubi9-python-3.12/kustomize/overlays/accelerator/cuda/kustomization.yaml (1 hunks)
runtimes/pytorch/ubi9-python-3.12/kustomize/overlays/accelerator/cuda/pod-patch.yaml (1 hunks)
runtimes/pytorch/ubi9-python-3.12/utils/bootstrapper.py (1 hunks)
runtimes/pytorch/ubi9-python-3.12/utils/pip.conf (1 hunks)
runtimes/pytorch/ubi9-python-3.12/utils/requirements-elyra.txt (1 hunks)
runtimes/rocm-pytorch/ubi9-python-3.12/Dockerfile.rocm (1 hunks)
runtimes/rocm-pytorch/ubi9-python-3.12/Pipfile (1 hunks)
runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh (1 hunks)
runtimes/rocm-pytorch/ubi9-python-3.12/kustomize/base/kustomization.yaml (1 hunks)
runtimes/rocm-pytorch/ubi9-python-3.12/kustomize/base/pod.yaml (1 hunks)
runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py (1 hunks)
runtimes/rocm-pytorch/ubi9-python-3.12/utils/pip.conf (1 hunks)
runtimes/rocm-pytorch/ubi9-python-3.12/utils/requirements-elyra.txt (1 hunks)
runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm (1 hunks)
runtimes/rocm-tensorflow/ubi9-python-3.12/Pipfile (1 hunks)
runtimes/rocm-tensorflow/ubi9-python-3.12/kustomize/base/kustomization.yaml (1 hunks)
runtimes/rocm-tensorflow/ubi9-python-3.12/kustomize/base/pod.yaml (1 hunks)
runtimes/rocm-tensorflow/ubi9-python-3.12/utils/bootstrapper.py (1 hunks)
runtimes/rocm-tensorflow/ubi9-python-3.12/utils/pip.conf (1 hunks)
runtimes/rocm-tensorflow/ubi9-python-3.12/utils/requirements-elyra.txt (1 hunks)
runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda (1 hunks)
runtimes/tensorflow/ubi9-python-3.12/Pipfile (1 hunks)
runtimes/tensorflow/ubi9-python-3.12/kustomize/base/kustomization.yaml (1 hunks)
runtimes/tensorflow/ubi9-python-3.12/kustomize/base/pod.yaml (1 hunks)
runtimes/tensorflow/ubi9-python-3.12/utils/bootstrapper.py (1 hunks)
runtimes/tensorflow/ubi9-python-3.12/utils/pip.conf (1 hunks)
runtimes/tensorflow/ubi9-python-3.12/utils/requirements-elyra.txt (1 hunks)
✅ Files skipped from review due to trivial changes (4)
- runtimes/rocm-pytorch/ubi9-python-3.12/kustomize/base/kustomization.yaml
- runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh
- runtimes/rocm-tensorflow/ubi9-python-3.12/Pipfile
- runtimes/rocm-pytorch/ubi9-python-3.12/Pipfile
🚧 Files skipped from review as they are similar to previous changes (22)
- runtimes/pytorch/ubi9-python-3.12/kustomize/components/accelerator/kustomization.yaml
- runtimes/minimal/ubi9-python-3.12/utils/pip.conf
- runtimes/pytorch/ubi9-python-3.12/kustomize/overlays/accelerator/cuda/kustomization.yaml
- runtimes/rocm-tensorflow/ubi9-python-3.12/utils/pip.conf
- runtimes/tensorflow/ubi9-python-3.12/utils/pip.conf
- runtimes/pytorch/ubi9-python-3.12/kustomize/base/kustomization.yaml
- runtimes/pytorch/ubi9-python-3.12/utils/pip.conf
- runtimes/minimal/ubi9-python-3.12/Pipfile
- runtimes/datascience/ubi9-python-3.12/utils/pip.conf
- runtimes/rocm-pytorch/ubi9-python-3.12/utils/pip.conf
- runtimes/rocm-tensorflow/ubi9-python-3.12/kustomize/base/kustomization.yaml
- runtimes/datascience/ubi9-python-3.12/kustomize/base/kustomization.yaml
- runtimes/tensorflow/ubi9-python-3.12/kustomize/base/kustomization.yaml
- runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
- runtimes/pytorch/ubi9-python-3.12/Pipfile
- runtimes/minimal/ubi9-python-3.12/kustomize/base/kustomization.yaml
- runtimes/datascience/ubi9-python-3.12/Pipfile
- runtimes/minimal/ubi9-python-3.12/Dockerfile.cpu
- runtimes/tensorflow/ubi9-python-3.12/Pipfile
- Makefile
- runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm
- runtimes/rocm-pytorch/ubi9-python-3.12/Dockerfile.rocm
🧰 Additional context used
🧠 Learnings (23)
📓 Common learnings
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. A comprehensive issue was created covering race conditions, failure detection, process lifecycle coupling, and signal handling with detailed problem descriptions, multiple solution options, phased acceptance criteria, testing approach, and proper context linking, following the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. Issue #1298 was successfully created with comprehensive problem description covering race conditions, failure detection, orphaned processes, and signal handling, along with multiple solution options, phased acceptance criteria, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user:4-9
Timestamp: 2025-07-03T16:05:35.448Z
Learning: jiridanek requested GitHub issue creation for shell script error handling improvements in codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user during PR #1269 review. A comprehensive issue was created covering silent failures, unquoted variable expansions, missing template validation, and strict mode implementation with detailed problem descriptions, phased acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/api/kernels/access.cgi:6-6
Timestamp: 2025-07-03T16:17:05.475Z
Learning: jiridanek requested GitHub issue creation for CGI script health-check URL configurability and timeout improvement in codeserver/ubi9-python-3.12/nginx/api/kernels/access.cgi during PR #1269 review. The request follows the established pattern of systematic code quality improvements with comprehensive issue creation covering problem description, solution details, acceptance criteria, implementation guidance, and proper context linking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:1-2
Timestamp: 2025-07-03T16:08:47.251Z
Learning: jiridanek requested GitHub issue creation for shell strict mode improvement in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. Issue #1310 was created with comprehensive problem description covering silent failures, production risks, implementation guidance with code examples, acceptance criteria, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements for the codeserver image entrypoint script.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu:21-24
Timestamp: 2025-07-01T06:48:13.154Z
Learning: jiridanek creates comprehensive follow-up issues from review comments that expand scope appropriately, include clear acceptance criteria, proper backlinks, and structured implementation guidance. Issue #1241 demonstrates this by turning a specific oc client checksum concern into a thorough security enhancement plan covering all downloaded binaries across the Python 3.12 implementation.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/api/kernels/access.cgi:6-6
Timestamp: 2025-07-03T16:17:05.475Z
Learning: jiridanek requested GitHub issue creation for CGI script health-check URL configurability and timeout improvement in codeserver/ubi9-python-3.12/nginx/api/kernels/access.cgi during PR #1269 review. Issue #1312 was successfully created with comprehensive problem description covering hard-coded URL limitations, timeout protection, error handling, acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:71-88
Timestamp: 2025-07-04T06:05:30.580Z
Learning: jiridanek requested GitHub issue creation for TrustyAI test notebook URL configurability and network error handling improvements during PR #1306 review. Issue #1323 was created with ⚠️ emoji in title for visibility, comprehensive problem description covering incorrect hardcoded URLs (pointing to Python 3.11 instead of 3.12), missing network error handling, maintenance burden, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1247
File: .github/workflows/build-notebooks-TEMPLATE.yaml:50-53
Timestamp: 2025-07-01T14:36:52.852Z
Learning: In the opendatahub-io/notebooks repository, the test runner's Python version (configured in GitHub Actions UV setup) intentionally doesn't need to match the Python version of the container images being tested. jiridanek's team uses Python 3.12 for running tests while images may use different Python versions (like 3.11), and this approach works fine since the test code is separate from the application code running inside the containers.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow with runtime ROCm configuration is the recommended alternative approach.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. The tensorflow-rocm upstream project appears abandoned with the last release in 2019. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow 2.18+ with runtime ROCm configuration is the recommended and industry-standard approach, as modern TensorFlow automatically detects and utilizes ROCm when properly installed.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: TrustyAI's jupyter-bokeh was pinned to 3.0.5 due to compatibility requirements with TrustyAI's visualization components, but the actual deployed version in requirements.txt shows 3.0.7, indicating incremental testing. The upgrade to 4.0.5 in this PR represents the completion of a gradual migration strategy from the 3.x series after confirming compatibility with Bokeh 3.7.3.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: In the opendatahub-io/notebooks repository, TensorFlow packages with `extras = ["and-cuda"]` can cause build conflicts on macOS due to platform-specific CUDA packages. When the Dockerfile installs CUDA system-wide, removing the extras and letting TensorFlow find CUDA at runtime resolves these conflicts.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: The jupyter-bokeh package was previously pinned to version 3.0.5 in the TrustyAI notebook image due to compatibility requirements with TrustyAI components, as indicated by the comment "Should be pinned down to this version in order to be compatible with trustyai" that was removed in this update.
runtimes/pytorch/ubi9-python-3.12/Dockerfile.cuda (19)
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: In the opendatahub-io/notebooks repository, TensorFlow packages with `extras = ["and-cuda"]` can cause build conflicts on macOS due to platform-specific CUDA packages. When the Dockerfile installs CUDA system-wide, removing the extras and letting TensorFlow find CUDA at runtime resolves these conflicts.
Learnt from: atheo89
PR: opendatahub-io/notebooks#1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:32-32
Timestamp: 2025-07-07T11:08:48.524Z
Learning: atheo89 requested GitHub issue creation for multi-architecture Dockerfile improvements during PR #1258 review, specifically for enhancing structural consistency across Docker stages, replacing $(uname -m) with ${TARGETARCH} for cross-architecture builds, and adding OCI-compliant metadata labels. Issue #1332 was created with comprehensive problem description, phased implementation approach, detailed acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1247
File: .github/workflows/build-notebooks-TEMPLATE.yaml:50-53
Timestamp: 2025-07-01T14:36:52.852Z
Learning: In the opendatahub-io/notebooks repository, the test runner's Python version (configured in GitHub Actions UV setup) intentionally doesn't need to match the Python version of the container images being tested. jiridanek's team uses Python 3.12 for running tests while images may use different Python versions (like 3.11), and this approach works fine since the test code is separate from the application code running inside the containers.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:40-42
Timestamp: 2025-07-04T17:08:02.399Z
Learning: In the opendatahub-io/notebooks repository, when using multi-architecture Dockerfiles with BuildKit, the implicit build argument TARGETARCH is automatically available in the global scope for FROM instructions without explicit declaration. However, if TARGETARCH is used within a build stage, it must be declared explicitly within that stage. The current placement pattern (declaring ARG TARGETARCH after FROM instructions that use it) is correct for modern Docker/Podman/Buildah environments and does not require compatibility with older Docker versions.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:4-4
Timestamp: 2025-07-04T05:49:10.314Z
Learning: jiridanek directs base image pinning security concerns to existing comprehensive issue #1242 "Improve Docker FROM image versioning by avoiding :latest tags" rather than addressing them in individual PRs, continuing the established pattern of systematic security and quality tracking in opendatahub-io/notebooks.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: jupyter/minimal/ubi9-python-3.11/Dockerfile.cuda:29-38
Timestamp: 2025-07-04T17:07:52.656Z
Learning: In the opendatahub-io/notebooks repository, modern Docker with BuildKit automatically provides build arguments like TARGETARCH in the global scope for FROM instructions, but these arguments must be explicitly declared with ARG statements inside build stages where they will be used. The ARG declaration should be placed within the stage that uses it, not moved to the global scope, as this is the correct pattern for modern Docker/Podman/Buildah environments.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm:34-37
Timestamp: 2025-07-02T18:19:23.024Z
Learning: In the opendatahub-io/notebooks repository, issue #1241 "Security: Add checksum verification for downloaded binaries in Python 3.12 images" covers the security concern about verifying downloaded binaries including the oc client. Red Hat does not provide signed RPM packages for the OpenShift CLI (oc) in UBI9 official repositories, making manual binary download with checksum verification the recommended approach.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/passwd.template:14-14
Timestamp: 2025-07-03T16:28:55.757Z
Learning: jiridanek requested GitHub issue creation for passwd template validation in codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/passwd.template during PR #1269 review. Issue #1318 was created with comprehensive analysis of nss_wrapper approach advantages over OpenShift's native user management, including consistent user identity, application compatibility, and debugging benefits. The current approach works with OpenShift's random UID assignment by dynamically setting USER_ID and GROUP_ID to actual runtime values while providing enhanced user attributes through nss_wrapper.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user:4-9
Timestamp: 2025-07-03T16:05:35.448Z
Learning: jiridanek requested GitHub issue creation for shell script error handling improvements in codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user during PR #1269 review. A comprehensive issue was created covering silent failures, unquoted variable expansions, missing template validation, and strict mode implementation with detailed problem descriptions, phased acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-nginx.sh:18-23
Timestamp: 2025-07-03T16:17:23.065Z
Learning: jiridanek requested GitHub issue creation for shell script variable quoting security concern in codeserver/ubi9-python-3.12/run-nginx.sh during PR #1269 review. The issue covers unquoted variables NB_PREFIX, NOTEBOOK_ARGS, and BASE_URL that pose security risks including command injection, word-splitting vulnerabilities, and globbing issues. A comprehensive issue was created with detailed problem description, security concerns, solution with code examples, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh:4-11
Timestamp: 2025-07-03T12:08:47.691Z
Learning: jiridanek requests GitHub issue creation for shell script quality improvements identified during PR #1269 review, specifically for POSIX compliance and security issues in codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh. Issue #1275 was created with comprehensive problem descriptions, acceptance criteria, implementation guidance, and proper context linking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:35:34.805Z
Learning: In the opendatahub-io/notebooks repository, mounting emptyDir volumes over /opt/app-root/src is intentional behavior that matches production deployment patterns where odh-dashboard mounts empty PVCs at this location (the $HOME directory). This mounting is expected to hide base image content.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: OpenShift CI infrastructure issues in opendatahub-io/notebooks can manifest as "ContainerFailed one or more containers exited" errors in release steps, or as "Entrypoint received interrupt: terminated" messages when pods are killed during CI runs. These are typically infrastructure-level issues rather than code problems.
runtimes/tensorflow/ubi9-python-3.12/utils/bootstrapper.py (9)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1247
File: .github/workflows/build-notebooks-TEMPLATE.yaml:50-53
Timestamp: 2025-07-01T14:36:52.852Z
Learning: In the opendatahub-io/notebooks repository, the test runner's Python version (configured in GitHub Actions UV setup) intentionally doesn't need to match the Python version of the container images being tested. jiridanek's team uses Python 3.12 for running tests while images may use different Python versions (like 3.11), and this approach works fine since the test code is separate from the application code running inside the containers.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow with runtime ROCm configuration is the recommended alternative approach.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:71-88
Timestamp: 2025-07-04T06:05:30.580Z
Learning: jiridanek requested GitHub issue creation for TrustyAI test notebook URL configurability and network error handling improvements during PR #1306 review. Issue #1323 was created with ⚠️ emoji in title for visibility, comprehensive problem description covering incorrect hardcoded URLs (pointing to Python 3.11 instead of 3.12), missing network error handling, maintenance burden, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. The tensorflow-rocm upstream project appears abandoned with the last release in 2019. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow 2.18+ with runtime ROCm configuration is the recommended and industry-standard approach, as modern TensorFlow automatically detects and utilizes ROCm when properly installed.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:1-2
Timestamp: 2025-07-03T16:08:47.251Z
Learning: jiridanek requested GitHub issue creation for shell strict mode improvement in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. Issue #1310 was created with comprehensive problem description covering silent failures, production risks, implementation guidance with code examples, acceptance criteria, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements for the codeserver image entrypoint script.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/utils/process.sh:1-2
Timestamp: 2025-07-03T16:00:46.191Z
Learning: jiridanek requested GitHub issue creation for shell script strict-mode improvement in codeserver/ubi9-python-3.12/utils/process.sh during PR #1269 review. Issue #1303 was created with comprehensive problem description covering silent failures and production risks, phased acceptance criteria for basic strict-mode implementation and enhanced error handling, implementation guidance with code examples and flag explanations, benefits section, and proper context linking, continuing the established pattern of systematic code quality improvements.
runtimes/datascience/ubi9-python-3.12/kustomize/base/pod.yaml (12)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
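For context, a minimal sketch of the kind of emptyDir `sizeLimit` configuration these test-only pod patches discuss; the volume name, mount path, and size value below are illustrative, not taken from the repository:

```yaml
# Hypothetical test-only pod patch: bounds the scratch volume so a runaway test
# cannot exhaust the node's ephemeral storage. Names and sizes are examples.
apiVersion: v1
kind: Pod
metadata:
  name: runtime-pod
spec:
  containers:
    - name: runtime
      volumeMounts:
        - name: scratch
          mountPath: /opt/app-root/src
  volumes:
    - name: scratch
      emptyDir:
        sizeLimit: 2Gi  # kubelet evicts the pod if usage exceeds this limit
```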
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
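As a reference point, a minimal sketch of the securityContext fields named above, with values that satisfy the restricted PodSecurity profile; the container name is an assumption and the real manifests may set additional fields:

```yaml
# Sketch only: securityContext settings whose absence triggered the PodSecurity
# violations described in issue #1212. Container name is illustrative.
spec:
  containers:
    - name: runtime
      securityContext:
        allowPrivilegeEscalation: false
        runAsNonRoot: true
        seccompProfile:
          type: RuntimeDefault
        capabilities:
          drop:
            - ALL
```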
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
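The missing file is a small kustomize entry point; a hedged sketch of what such a base typically contains is shown below (the listed resource is an assumption modelled on the sibling runtimes, not the actual content of the fix in PR #1015):

```yaml
# Hypothetical content for runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
resources:
  - pod.yaml
```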
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. A comprehensive issue was created covering race conditions, failure detection, process lifecycle coupling, and signal handling with detailed problem descriptions, multiple solution options, phased acceptance criteria, testing approach, and proper context linking, following the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh:4-11
Timestamp: 2025-07-03T12:08:47.691Z
Learning: jiridanek requested GitHub issue creation for shell script quality improvements identified during PR #1269 review, specifically for POSIX compliance and security issues in codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh. Issue #1275 was created with comprehensive problem descriptions, acceptance criteria, implementation guidance, and proper context linking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-nginx.sh:18-23
Timestamp: 2025-07-03T16:17:23.065Z
Learning: jiridanek requested GitHub issue creation for shell script variable quoting security concern in codeserver/ubi9-python-3.12/run-nginx.sh during PR #1269 review. The issue covers unquoted variables NB_PREFIX, NOTEBOOK_ARGS, and BASE_URL that pose security risks including command injection, word-splitting vulnerabilities, and globbing issues. A comprehensive issue was created with detailed problem description, security concerns, solution with code examples, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user:4-9
Timestamp: 2025-07-03T16:05:35.448Z
Learning: jiridanek requested GitHub issue creation for shell script error handling improvements in codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user during PR #1269 review. A comprehensive issue was created covering silent failures, unquoted variable expansions, missing template validation, and strict mode implementation with detailed problem descriptions, phased acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. Issue #1298 was successfully created with comprehensive problem description covering race conditions, failure detection, orphaned processes, and signal handling, along with multiple solution options, phased acceptance criteria, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh:4-11
Timestamp: 2025-07-03T16:04:22.695Z
Learning: jiridanek requested GitHub issue creation for shell script quality improvements in codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh during PR #1269 review. Issue #1307 was created with comprehensive problem description covering variable scoping issues, POSIX compliance concerns, multiple solution options, acceptance criteria, implementation guidance with code examples, testing approaches, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/minimal/ubi9-python-3.12/Dockerfile.rocm:43-55
Timestamp: 2025-07-01T06:48:21.070Z
Learning: When security concerns are raised during PR reviews in opendatahub-io/notebooks, comprehensive follow-up issues are created (often by CodeRabbit) to track all related security enhancements with clear acceptance criteria and implementation guidance. This ensures security improvements are systematically addressed in dedicated efforts rather than blocking current deliverables.
runtimes/datascience/ubi9-python-3.12/utils/bootstrapper.py (1)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
runtimes/datascience/ubi9-python-3.12/utils/requirements-elyra.txt (6)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:71-76
Timestamp: 2025-07-04T06:04:43.085Z
Learning: jiridanek requested GitHub issue creation for duplicate CSV loading and validation problem in jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb during PR #1306 review. Issue #1322 was created with comprehensive problem description covering code redundancy, runtime failure risks, network inefficiency, and test reliability concerns, along with detailed solution including duplicate line removal, data validation implementation, repository-wide audit, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
runtimes/minimal/ubi9-python-3.12/kustomize/base/pod.yaml (5)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
runtimes/minimal/ubi9-python-3.12/requirements.txt (13)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: TrustyAI's jupyter-bokeh was pinned to 3.0.5 due to compatibility requirements with TrustyAI's visualization components, but the actual deployed version in requirements.txt shows 3.0.7, indicating incremental testing. The upgrade to 4.0.5 in this PR represents the completion of a gradual migration strategy from the 3.x series after confirming compatibility with Bokeh 3.7.3.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: The jupyter-bokeh pinning to 3.0.5 in TrustyAI notebook image was not due to TrustyAI code compatibility issues, but because the trustyai package itself explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency, causing pip dependency resolution conflicts when trying to upgrade to jupyter-bokeh 4.x.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: The jupyter-bokeh package was previously pinned to version 3.0.5 in the TrustyAI notebook image due to compatibility requirements with TrustyAI components, as indicated by the comment "Should be pinned down to this version in order to be compatible with trustyai" that was removed in this update.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/requirements.txt:435-444
Timestamp: 2025-07-03T13:59:55.040Z
Learning: jiridanek requested GitHub issue creation for numpy/scipy compatibility investigation in PR #1269, specifically for cases where theoretical version conflicts don't manifest as actual build failures. Issue #1297 was created with comprehensive investigation framework, acceptance criteria, runtime testing approach, and proper context linking, demonstrating the pattern of creating investigation-type issues for apparent but non-blocking technical concerns that require deeper understanding.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI 0.6.1 (latest version as of June 2025) has a hard dependency constraint on jupyter-bokeh~=3.0.5, preventing upgrades to jupyter-bokeh 4.x in notebook images that include TrustyAI. This requires either waiting for TrustyAI to update their dependency or excluding TrustyAI from jupyter-bokeh upgrades.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:4-4
Timestamp: 2025-07-04T05:49:10.314Z
Learning: jiridanek directs base image pinning security concerns to existing comprehensive issue #1242 "Improve Docker FROM image versioning by avoiding :latest tags" rather than addressing them in individual PRs, continuing the established pattern of systematic security and quality tracking in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-26T16:21:12.994Z
Learning: When Playwright Docker image version in `.github/workflows/build-notebooks-TEMPLATE.yaml` doesn't match the `@playwright/test` version in `tests/browser/package.json`, it results in a runtime error: "Executable doesn't exist at /ms-playwright/webkit-2182/pw_run.sh" with a helpful message box showing current vs required versions (e.g., "current: mcr.microsoft.com/playwright:v1.52.0-noble" vs "required: mcr.microsoft.com/playwright:v1.53.1-noble").
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:71-88
Timestamp: 2025-07-04T06:05:30.580Z
Learning: jiridanek requested GitHub issue creation for TrustyAI test notebook URL configurability and network error handling improvements during PR #1306 review. Issue #1323 was created with ⚠️ emoji in title for visibility, comprehensive problem description covering incorrect hardcoded URLs (pointing to Python 3.11 instead of 3.12), missing network error handling, maintenance burden, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1123
File: runtimes/tensorflow/ubi9-python-3.11/Pipfile:56-56
Timestamp: 2025-06-26T11:39:13.498Z
Learning: When pip installs packages from PyPI, it typically downloads pre-built wheels that are simply extracted; setuptools is not involved in that process. setuptools is primarily used for building packages from source (setup.py, pyproject.toml), creating entry points, and package discovery at build time, not for installing pre-built wheels.
runtimes/minimal/ubi9-python-3.12/utils/requirements-elyra.txt (6)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:71-76
Timestamp: 2025-07-04T06:04:43.085Z
Learning: jiridanek requested GitHub issue creation for duplicate CSV loading and validation problem in jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb during PR #1306 review. Issue #1322 was created with comprehensive problem description covering code redundancy, runtime failure risks, network inefficiency, and test reliability concerns, along with detailed solution including duplicate line removal, data validation implementation, repository-wide audit, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
runtimes/pytorch/ubi9-python-3.12/kustomize/base/pod.yaml (5)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
runtimes/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml (11)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T07:11:44.903Z
Learning: Resource limits in StatefulSet manifests in opendatahub-io/notebooks are configured for testing purposes, not production deployments. This affects risk assessment when reviewing resource configurations like memory and CPU limits.
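For clarity on what "resource limits" refers to here, a small example-only block of the kind these test manifests carry; the numbers are placeholders rather than the repository's actual settings:

```yaml
# Placeholder values; the test manifests define their own requests and limits.
resources:
  requests:
    cpu: 500m
    memory: 1Gi
  limits:
    cpu: "1"
    memory: 2Gi
```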
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. A comprehensive issue was created covering race conditions, failure detection, process lifecycle coupling, and signal handling with detailed problem descriptions, multiple solution options, phased acceptance criteria, testing approach, and proper context linking, following the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh:4-11
Timestamp: 2025-07-03T12:08:47.691Z
Learning: jiridanek requested GitHub issue creation for shell script quality improvements identified during PR #1269 review, specifically for POSIX compliance and security issues in codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh. Issue #1275 was created with comprehensive problem descriptions, acceptance criteria, implementation guidance, and proper context linking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user:4-9
Timestamp: 2025-07-03T16:05:35.448Z
Learning: jiridanek requested GitHub issue creation for shell script error handling improvements in codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user during PR #1269 review. A comprehensive issue was created covering silent failures, unquoted variable expansions, missing template validation, and strict mode implementation with detailed problem descriptions, phased acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-nginx.sh:18-23
Timestamp: 2025-07-03T16:17:23.065Z
Learning: jiridanek requested GitHub issue creation for shell script variable quoting security concern in codeserver/ubi9-python-3.12/run-nginx.sh during PR #1269 review. The issue covers unquoted variables NB_PREFIX, NOTEBOOK_ARGS, and BASE_URL that pose security risks including command injection, word-splitting vulnerabilities, and globbing issues. A comprehensive issue was created with detailed problem description, security concerns, solution with code examples, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh:4-11
Timestamp: 2025-07-03T16:04:22.695Z
Learning: jiridanek requested GitHub issue creation for shell script quality improvements in codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh during PR #1269 review. Issue #1307 was created with comprehensive problem description covering variable scoping issues, POSIX compliance concerns, multiple solution options, acceptance criteria, implementation guidance with code examples, testing approaches, and proper context linking, continuing the established pattern of systematic code quality improvements.
runtimes/pytorch/ubi9-python-3.12/kustomize/overlays/accelerator/cuda/pod-patch.yaml (11)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when that involves potential risks such as empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T07:11:44.903Z
Learning: Resource limits in StatefulSet manifests in opendatahub-io/notebooks are configured for testing purposes, not production deployments. This affects risk assessment when reviewing resource configurations like memory and CPU limits.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/api/kernels/access.cgi:6-6
Timestamp: 2025-07-03T16:17:05.475Z
Learning: jiridanek requested GitHub issue creation for CGI script health-check URL configurability and timeout improvement in codeserver/ubi9-python-3.12/nginx/api/kernels/access.cgi during PR #1269 review. The request follows the established pattern of systematic code quality improvements with comprehensive issue creation covering problem description, solution details, acceptance criteria, implementation guidance, and proper context linking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:135-136
Timestamp: 2025-07-04T05:52:49.464Z
Learning: jiridanek requested GitHub issue creation for improving fragile sed-based Jupyter kernel display_name modification in jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu during PR #1306 review. Issue #1321 was created with comprehensive problem description covering JSON corruption risks, greedy regex patterns, maintenance burden, and proposed Python-based JSON parsing solution with detailed acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh:4-11
Timestamp: 2025-07-03T12:08:47.691Z
Learning: jiridanek requested GitHub issue creation for shell script quality improvements identified during PR #1269 review, specifically for POSIX compliance and security issues in codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh. Issue #1275 was created with comprehensive problem descriptions, acceptance criteria, implementation guidance, and proper context linking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. A comprehensive issue was created covering race conditions, failure detection, process lifecycle coupling, and signal handling with detailed problem descriptions, multiple solution options, phased acceptance criteria, testing approach, and proper context linking, following the established pattern of systematic code quality improvements.
runtimes/pytorch/ubi9-python-3.12/utils/bootstrapper.py (9)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1247
File: .github/workflows/build-notebooks-TEMPLATE.yaml:50-53
Timestamp: 2025-07-01T14:36:52.852Z
Learning: In the opendatahub-io/notebooks repository, the test runner's Python version (configured in GitHub Actions UV setup) intentionally doesn't need to match the Python version of the container images being tested. jiridanek's team uses Python 3.12 for running tests while images may use different Python versions (like 3.11), and this approach works fine since the test code is separate from the application code running inside the containers.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow with runtime ROCm configuration is the recommended alternative approach.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. The tensorflow-rocm upstream project appears abandoned with the last release in 2019. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow 2.18+ with runtime ROCm configuration is the recommended and industry-standard approach, as modern TensorFlow automatically detects and utilizes ROCm when properly installed.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:71-88
Timestamp: 2025-07-04T06:05:30.580Z
Learning: jiridanek requested GitHub issue creation for TrustyAI test notebook URL configurability and network error handling improvements during PR #1306 review. Issue #1323 was created with ⚠️ emoji in title for visibility, comprehensive problem description covering incorrect hardcoded URLs (pointing to Python 3.11 instead of 3.12), missing network error handling, maintenance burden, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-26T16:21:12.994Z
Learning: When Playwright Docker image version in `.github/workflows/build-notebooks-TEMPLATE.yaml` doesn't match the `@playwright/test` version in `tests/browser/package.json`, it results in a runtime error: "Executable doesn't exist at /ms-playwright/webkit-2182/pw_run.sh" with a helpful message box showing current vs required versions (e.g., "current: mcr.microsoft.com/playwright:v1.52.0-noble" vs "required: mcr.microsoft.com/playwright:v1.53.1-noble").
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
runtimes/pytorch/ubi9-python-3.12/utils/requirements-elyra.txt (6)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:71-76
Timestamp: 2025-07-04T06:04:43.085Z
Learning: jiridanek requested GitHub issue creation for duplicate CSV loading and validation problem in jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb during PR #1306 review. Issue #1322 was created with comprehensive problem description covering code redundancy, runtime failure risks, network inefficiency, and test reliability concerns, along with detailed solution including duplicate line removal, data validation implementation, repository-wide audit, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
runtimes/rocm-pytorch/ubi9-python-3.12/kustomize/base/pod.yaml (11)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T07:11:44.903Z
Learning: Resource limits in StatefulSet manifests in opendatahub-io/notebooks are configured for testing purposes, not production deployments. This affects risk assessment when reviewing resource configurations like memory and CPU limits.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1154
File: manifests/base/jupyter-tensorflow-notebook-imagestream.yaml:45-45
Timestamp: 2025-06-13T08:34:01.300Z
Learning: When updating dependency versions in `manifests/base/*-imagestream.yaml`, the project convention is to modify only the newest tag (e.g., "2025.1") and intentionally leave earlier tags (e.g., "2024.2") unchanged.
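Concretely, the convention looks like the hedged sketch below: only the newest tag is touched when versions change, while the earlier tag stays frozen. The image references and repository paths are placeholders, not the project's real values:

```yaml
# Sketch of the imagestream tag convention: edit "2025.1", leave "2024.2" untouched.
kind: ImageStream
apiVersion: image.openshift.io/v1
metadata:
  name: jupyter-tensorflow-notebook
spec:
  tags:
    - name: "2025.1"  # newest tag: dependency version bumps land here
      from:
        kind: DockerImage
        name: quay.io/example/jupyter-tensorflow:2025.1  # placeholder reference
    - name: "2024.2"  # earlier tag: intentionally left unchanged
      from:
        kind: DockerImage
        name: quay.io/example/jupyter-tensorflow:2024.2  # placeholder reference
```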
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. A comprehensive issue was created covering race conditions, failure detection, process lifecycle coupling, and signal handling with detailed problem descriptions, multiple solution options, phased acceptance criteria, testing approach, and proper context linking, following the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh:4-11
Timestamp: 2025-07-03T12:08:47.691Z
Learning: jiridanek requested GitHub issue creation for shell script quality improvements identified during PR #1269 review, specifically for POSIX compliance and security issues in codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh. Issue #1275 was created with comprehensive problem descriptions, acceptance criteria, implementation guidance, and proper context linking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-nginx.sh:18-23
Timestamp: 2025-07-03T16:17:23.065Z
Learning: jiridanek requested GitHub issue creation for shell script variable quoting security concern in codeserver/ubi9-python-3.12/run-nginx.sh during PR #1269 review. The issue covers unquoted variables NB_PREFIX, NOTEBOOK_ARGS, and BASE_URL that pose security risks including command injection, word-splitting vulnerabilities, and globbing issues. A comprehensive issue was created with detailed problem description, security concerns, solution with code examples, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user:4-9
Timestamp: 2025-07-03T16:05:35.448Z
Learning: jiridanek requested GitHub issue creation for shell script error handling improvements in codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user during PR #1269 review. A comprehensive issue was created covering silent failures, unquoted variable expansions, missing template validation, and strict mode implementation with detailed problem descriptions, phased acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py (2)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
runtimes/rocm-pytorch/ubi9-python-3.12/utils/requirements-elyra.txt (6)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
runtimes/rocm-tensorflow/ubi9-python-3.12/kustomize/base/pod.yaml (6)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T07:11:44.903Z
Learning: Resource limits in StatefulSet manifests in opendatahub-io/notebooks are configured for testing purposes, not production deployments. This affects risk assessment when reviewing resource configurations like memory and CPU limits.
runtimes/rocm-tensorflow/ubi9-python-3.12/utils/bootstrapper.py (7)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow with runtime ROCm configuration is the recommended alternative approach.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. The tensorflow-rocm upstream project appears abandoned with the last release in 2019. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow 2.18+ with runtime ROCm configuration is the recommended and industry-standard approach, as modern TensorFlow automatically detects and utilizes ROCm when properly installed.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1247
File: .github/workflows/build-notebooks-TEMPLATE.yaml:50-53
Timestamp: 2025-07-01T14:36:52.852Z
Learning: In the opendatahub-io/notebooks repository, the test runner's Python version (configured in GitHub Actions UV setup) intentionally doesn't need to match the Python version of the container images being tested. jiridanek's team uses Python 3.12 for running tests while images may use different Python versions (like 3.11), and this approach works fine since the test code is separate from the application code running inside the containers.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:71-88
Timestamp: 2025-07-04T06:05:30.580Z
Learning: jiridanek requested GitHub issue creation for TrustyAI test notebook URL configurability and network error handling improvements during PR #1306 review. Issue #1323 was created with ⚠️ emoji in title for visibility, comprehensive problem description covering incorrect hardcoded URLs (pointing to Python 3.11 instead of 3.12), missing network error handling, maintenance burden, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
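The tensorflow_rocm notes in this list boil down to: on Python 3.12, ship stock TensorFlow and let it pick up ROCm at runtime. A minimal smoke test for that setup might look like the sketch below; the device names and counts depend on the host, and nothing here is taken from the image's actual test suite.
```python
# Hedged sanity check for the "plain TensorFlow + runtime ROCm" approach described above.
# On a properly configured ROCm host, the AMD GPU shows up as an ordinary "GPU" device.
import tensorflow as tf

gpus = tf.config.list_physical_devices("GPU")
if gpus:
    print(f"GPU devices visible to TensorFlow (ROCm or CUDA): {gpus}")
else:
    print("No GPU devices detected; TensorFlow will fall back to CPU.")
```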
runtimes/rocm-tensorflow/ubi9-python-3.12/utils/requirements-elyra.txt (8)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. The tensorflow-rocm upstream project appears abandoned with the last release in 2019. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow 2.18+ with runtime ROCm configuration is the recommended and industry-standard approach, as modern TensorFlow automatically detects and utilizes ROCm when properly installed.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda (15)
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: In the opendatahub-io/notebooks repository, TensorFlow packages with `extras = ["and-cuda"]` can cause build conflicts on macOS due to platform-specific CUDA packages. When the Dockerfile installs CUDA system-wide, removing the extras and letting TensorFlow find CUDA at runtime resolves these conflicts.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: atheo89
PR: opendatahub-io/notebooks#1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:32-32
Timestamp: 2025-07-07T11:08:48.524Z
Learning: atheo89 requested GitHub issue creation for multi-architecture Dockerfile improvements during PR #1258 review, specifically for enhancing structural consistency across Docker stages, replacing $(uname -m) with ${TARGETARCH} for cross-architecture builds, and adding OCI-compliant metadata labels. Issue #1332 was created with comprehensive problem description, phased implementation approach, detailed acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:40-42
Timestamp: 2025-07-04T17:08:02.399Z
Learning: In the opendatahub-io/notebooks repository, when using multi-architecture Dockerfiles with BuildKit, the implicit build argument TARGETARCH is automatically available in the global scope for FROM instructions without explicit declaration. However, if TARGETARCH is used within a build stage, it must be declared explicitly within that stage. The current placement pattern (declaring ARG TARGETARCH after FROM instructions that use it) is correct for modern Docker/Podman/Buildah environments and does not require compatibility with older Docker versions.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:4-4
Timestamp: 2025-07-04T05:49:10.314Z
Learning: jiridanek directs base image pinning security concerns to existing comprehensive issue #1242 "Improve Docker FROM image versioning by avoiding :latest tags" rather than addressing them in individual PRs, continuing the established pattern of systematic security and quality tracking in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: jupyter/minimal/ubi9-python-3.11/Dockerfile.cuda:29-38
Timestamp: 2025-07-04T17:07:52.656Z
Learning: In the opendatahub-io/notebooks repository, modern Docker with BuildKit automatically provides build arguments like TARGETARCH in the global scope for FROM instructions, but these arguments must be explicitly declared with ARG statements inside build stages where they will be used. The ARG declaration should be placed within the stage that uses it, not moved to the global scope, as this is the correct pattern for modern Docker/Podman/Buildah environments.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:135-136
Timestamp: 2025-07-04T05:52:49.464Z
Learning: jiridanek requested GitHub issue creation for improving fragile sed-based Jupyter kernel display_name modification in jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu during PR #1306 review. Issue #1321 was created with comprehensive problem description covering JSON corruption risks, greedy regex patterns, maintenance burden, and proposed Python-based JSON parsing solution with detailed acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh:4-11
Timestamp: 2025-07-03T16:04:22.695Z
Learning: jiridanek requested GitHub issue creation for shell script quality improvements in codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh during PR #1269 review. Issue #1307 was created with comprehensive problem description covering variable scoping issues, POSIX compliance concerns, multiple solution options, acceptance criteria, implementation guidance with code examples, testing approaches, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm:34-37
Timestamp: 2025-07-02T18:19:23.024Z
Learning: In the opendatahub-io/notebooks repository, issue #1241 "Security: Add checksum verification for downloaded binaries in Python 3.12 images" covers the security concern about verifying downloaded binaries including the oc client. Red Hat does not provide signed RPM packages for the OpenShift CLI (oc) in UBI9 official repositories, making manual binary download with checksum verification the recommended approach.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1154
File: manifests/base/jupyter-pytorch-notebook-imagestream.yaml:0-0
Timestamp: 2025-06-16T11:06:33.139Z
Learning: In the opendatahub-io/notebooks repository, N-1 versions of images in manifest files (like imagestream.yaml files) should not be updated regularly. The versions of packages like codeflare-sdk in N-1 images are frozen to what was released when the image was moved from N to N-1 version. N-1 images are only updated for security vulnerabilities of packages, not for regular version bumps. This is why the version of packages in N-1 images may be quite old compared to the latest N version.
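One of the learnings above references a proposed Python-based replacement for the fragile sed edit of the Jupyter kernel display_name. A rough sketch of that idea follows; the kernelspec path and display name are assumptions chosen for illustration, not values taken from the Dockerfile.
```python
# Hedged sketch of editing kernel.json via the json module instead of sed.
# The path and display name below are assumptions for illustration only.
import json
from pathlib import Path

kernel_json = Path("/opt/app-root/share/jupyter/kernels/python3/kernel.json")

spec = json.loads(kernel_json.read_text())
spec["display_name"] = "Python 3.12"  # structured edit keeps the JSON valid
kernel_json.write_text(json.dumps(spec, indent=1) + "\n")
```
Parsing and re-serialising the file avoids the greedy-regex and JSON-corruption risks called out in issue #1321.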
runtimes/tensorflow/ubi9-python-3.12/kustomize/base/pod.yaml (7)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1154
File: manifests/base/jupyter-tensorflow-notebook-imagestream.yaml:45-45
Timestamp: 2025-06-13T08:34:01.300Z
Learning: When updating dependency versions in `manifests/base/*-imagestream.yaml`, the project convention is to modify only the newest tag (e.g., "2025.1") and intentionally leave earlier tags (e.g., "2024.2") unchanged.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T07:11:44.903Z
Learning: Resource limits in StatefulSet manifests in opendatahub-io/notebooks are configured for testing purposes, not production deployments. This affects risk assessment when reviewing resource configurations like memory and CPU limits.
runtimes/tensorflow/ubi9-python-3.12/utils/requirements-elyra.txt (5)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
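The containerized locking workflow described in the learning above can be wrapped in a small helper. The image tag, bind-mount layout, and use of podman below are assumptions for illustration, not something read out of the repository's Makefile.
```python
# Hedged sketch of running `pipenv lock` inside a UBI9 Python 3.12 container,
# as described in the learning above. Image tag, mount options, and podman are assumptions.
import subprocess

def lock_in_container(pipfile_dir: str) -> None:
    cmd = [
        "podman", "run", "--rm",
        "--platform=linux/amd64",            # lock for the image platform, not the host OS
        "-v", f"{pipfile_dir}:/work:z",
        "-w", "/work",
        "registry.access.redhat.com/ubi9/python-312:latest",
        "bash", "-c", "pip install pipenv && pipenv lock",
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    lock_in_container("runtimes/datascience/ubi9-python-3.12")
```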
🪛 Gitleaks (8.26.0)
runtimes/pytorch/ubi9-python-3.12/Dockerfile.cuda
45-45: Detected a Generic API Key, potentially exposing access to various services and sensitive operations.
(generic-api-key)
runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda
45-45: Detected a Generic API Key, potentially exposing access to various services and sensitive operations.
(generic-api-key)
🪛 Ruff (0.11.9)
runtimes/tensorflow/ubi9-python-3.12/utils/bootstrapper.py
29-29: typing.Dict is deprecated, use dict instead (UP035)
31-31: typing.Type is deprecated, use type instead (UP035)
62-62: Use type instead of Type for type annotation; replace with type (UP006)
76-76: import should be at the top-level of a file (PLC0415)
77-77: import should be at the top-level of a file (PLC0415)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
301-301: Use X | None for type annotations; convert to X | None (UP045)
338-338: Use X | None for type annotations; convert to X | None (UP045)
338-338: Use dict instead of Dict for type annotation; replace with dict (UP006)
378-378: import should be at the top-level of a file (PLC0415)
405-405: import should be at the top-level of a file (PLC0415)
406-406: import should be at the top-level of a file (PLC0415)
412-412: Unpacked variable resources is never used; prefix it with an underscore or any other dummy variable pattern (RUF059)
432-432: import should be at the top-level of a file (PLC0415)
433-433: import should be at the top-level of a file (PLC0415)
448-448: Loop control variable name not used within loop body; rename unused name to _name (B007)
448-448: When using only the values of a dict use the values() method; replace .items() with .values() (PERF102)
474-474: Local variable python_script_name is assigned to but never used; remove the assignment (F841)
493-493: Single quotes found but double quotes preferred (Q000)
521-521: Local variable r_script_name is assigned to but never used; remove the assignment (F841)
538-538: Single quotes found but double quotes preferred (Q000)
561-561: Class OpUtil inherits from object; remove object inheritance (UP004)
608-608: Consider iterable unpacking instead of concatenation; replace with iterable unpacking (RUF005)
613-613: subprocess.run without explicit check argument; add explicit check=False (PLW1510)
640-640: Call startswith once with a tuple; merge into a single startswith call (PIE810)
657-657: import should be at the top-level of a file (PLC0415)
659-659: Using the global statement to update pipeline_name is discouraged (PLW0603)
659-659: Using the global statement to update operation_name is discouraged (PLW0603)
725-725: Use X | None for type annotations; convert to X | None (UP045)
738-738: Using global for pipeline_name but no assignment is done (PLW0602)
738-738: Using global for operation_name but no assignment is done (PLW0602)
769-769: No newline at end of file; add trailing newline (W292)
runtimes/minimal/ubi9-python-3.12/utils/bootstrapper.py
29-29: typing.Dict is deprecated, use dict instead (UP035)
31-31: typing.Type is deprecated, use type instead (UP035)
62-62: Use type instead of Type for type annotation; replace with type (UP006)
76-76: import should be at the top-level of a file (PLC0415)
77-77: import should be at the top-level of a file (PLC0415)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
301-301: Use X | None for type annotations; convert to X | None (UP045)
338-338: Use X | None for type annotations; convert to X | None (UP045)
338-338: Use dict instead of Dict for type annotation; replace with dict (UP006)
378-378: import should be at the top-level of a file (PLC0415)
405-405: import should be at the top-level of a file (PLC0415)
406-406: import should be at the top-level of a file (PLC0415)
412-412: Unpacked variable resources is never used; prefix it with an underscore or any other dummy variable pattern (RUF059)
432-432: import should be at the top-level of a file (PLC0415)
433-433: import should be at the top-level of a file (PLC0415)
448-448: Loop control variable name not used within loop body; rename unused name to _name (B007)
448-448: When using only the values of a dict use the values() method; replace .items() with .values() (PERF102)
474-474: Local variable python_script_name is assigned to but never used; remove the assignment (F841)
493-493: Single quotes found but double quotes preferred (Q000)
521-521: Local variable r_script_name is assigned to but never used; remove the assignment (F841)
538-538: Single quotes found but double quotes preferred (Q000)
561-561: Class OpUtil inherits from object; remove object inheritance (UP004)
608-608: Consider iterable unpacking instead of concatenation; replace with iterable unpacking (RUF005)
613-613: subprocess.run without explicit check argument; add explicit check=False (PLW1510)
640-640: Call startswith once with a tuple; merge into a single startswith call (PIE810)
657-657: import should be at the top-level of a file (PLC0415)
659-659: Using the global statement to update pipeline_name is discouraged (PLW0603)
659-659: Using the global statement to update operation_name is discouraged (PLW0603)
725-725: Use X | None for type annotations; convert to X | None (UP045)
738-738: Using global for pipeline_name but no assignment is done (PLW0602)
738-738: Using global for operation_name but no assignment is done (PLW0602)
769-769: No newline at end of file; add trailing newline (W292)
runtimes/datascience/ubi9-python-3.12/utils/bootstrapper.py
29-29: typing.Dict is deprecated, use dict instead (UP035)
31-31: typing.Type is deprecated, use type instead (UP035)
62-62: Use type instead of Type for type annotation; replace with type (UP006)
76-76: import should be at the top-level of a file (PLC0415)
77-77: import should be at the top-level of a file (PLC0415)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
301-301: Use X | None for type annotations; convert to X | None (UP045)
338-338: Use X | None for type annotations; convert to X | None (UP045)
338-338: Use dict instead of Dict for type annotation; replace with dict (UP006)
378-378: import should be at the top-level of a file (PLC0415)
405-405: import should be at the top-level of a file (PLC0415)
406-406: import should be at the top-level of a file (PLC0415)
412-412: Unpacked variable resources is never used; prefix it with an underscore or any other dummy variable pattern (RUF059)
432-432: import should be at the top-level of a file (PLC0415)
433-433: import should be at the top-level of a file (PLC0415)
448-448: Loop control variable name not used within loop body; rename unused name to _name (B007)
448-448: When using only the values of a dict use the values() method; replace .items() with .values() (PERF102)
474-474: Local variable python_script_name is assigned to but never used; remove the assignment (F841)
493-493: Single quotes found but double quotes preferred (Q000)
521-521: Local variable r_script_name is assigned to but never used; remove the assignment (F841)
538-538: Single quotes found but double quotes preferred (Q000)
561-561: Class OpUtil inherits from object; remove object inheritance (UP004)
608-608: Consider iterable unpacking instead of concatenation; replace with iterable unpacking (RUF005)
613-613: subprocess.run without explicit check argument; add explicit check=False (PLW1510)
640-640: Call startswith once with a tuple; merge into a single startswith call (PIE810)
657-657: import should be at the top-level of a file (PLC0415)
659-659: Using the global statement to update pipeline_name is discouraged (PLW0603)
659-659: Using the global statement to update operation_name is discouraged (PLW0603)
725-725: Use X | None for type annotations; convert to X | None (UP045)
738-738: Using global for pipeline_name but no assignment is done (PLW0602)
738-738: Using global for operation_name but no assignment is done (PLW0602)
769-769: No newline at end of file; add trailing newline (W292)
runtimes/pytorch/ubi9-python-3.12/utils/bootstrapper.py
29-29: typing.Dict is deprecated, use dict instead (UP035)
31-31: typing.Type is deprecated, use type instead (UP035)
62-62: Use type instead of Type for type annotation; replace with type (UP006)
76-76: import should be at the top-level of a file (PLC0415)
77-77: import should be at the top-level of a file (PLC0415)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
301-301: Use X | None for type annotations; convert to X | None (UP045)
338-338: Use X | None for type annotations; convert to X | None (UP045)
338-338: Use dict instead of Dict for type annotation; replace with dict (UP006)
378-378: import should be at the top-level of a file (PLC0415)
405-405: import should be at the top-level of a file (PLC0415)
406-406: import should be at the top-level of a file (PLC0415)
412-412: Unpacked variable resources is never used; prefix it with an underscore or any other dummy variable pattern (RUF059)
432-432: import should be at the top-level of a file (PLC0415)
433-433: import should be at the top-level of a file (PLC0415)
448-448: Loop control variable name not used within loop body; rename unused name to _name (B007)
448-448: When using only the values of a dict use the values() method; replace .items() with .values() (PERF102)
474-474: Local variable python_script_name is assigned to but never used; remove the assignment (F841)
493-493: Single quotes found but double quotes preferred (Q000)
521-521: Local variable r_script_name is assigned to but never used; remove the assignment (F841)
538-538: Single quotes found but double quotes preferred (Q000)
561-561: Class OpUtil inherits from object; remove object inheritance (UP004)
608-608: Consider iterable unpacking instead of concatenation; replace with iterable unpacking (RUF005)
613-613: subprocess.run without explicit check argument; add explicit check=False (PLW1510)
640-640: Call startswith once with a tuple; merge into a single startswith call (PIE810)
657-657: import should be at the top-level of a file (PLC0415)
659-659: Using the global statement to update pipeline_name is discouraged (PLW0603)
659-659: Using the global statement to update operation_name is discouraged (PLW0603)
725-725: Use X | None for type annotations; convert to X | None (UP045)
738-738: Using global for pipeline_name but no assignment is done (PLW0602)
738-738: Using global for operation_name but no assignment is done (PLW0602)
769-769: No newline at end of file; add trailing newline (W292)
runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py
29-29: typing.Dict is deprecated, use dict instead (UP035)
31-31: typing.Type is deprecated, use type instead (UP035)
62-62: Use type instead of Type for type annotation; replace with type (UP006)
76-76: import should be at the top-level of a file (PLC0415)
77-77: import should be at the top-level of a file (PLC0415)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
301-301: Use X | None for type annotations; convert to X | None (UP045)
338-338: Use X | None for type annotations; convert to X | None (UP045)
338-338: Use dict instead of Dict for type annotation; replace with dict (UP006)
378-378: import should be at the top-level of a file (PLC0415)
405-405: import should be at the top-level of a file (PLC0415)
406-406: import should be at the top-level of a file (PLC0415)
412-412: Unpacked variable resources is never used; prefix it with an underscore or any other dummy variable pattern (RUF059)
432-432: import should be at the top-level of a file (PLC0415)
433-433: import should be at the top-level of a file (PLC0415)
448-448: Loop control variable name not used within loop body; rename unused name to _name (B007)
448-448: When using only the values of a dict use the values() method; replace .items() with .values() (PERF102)
474-474: Local variable python_script_name is assigned to but never used; remove the assignment (F841)
493-493: Single quotes found but double quotes preferred (Q000)
521-521: Local variable r_script_name is assigned to but never used; remove the assignment (F841)
538-538: Single quotes found but double quotes preferred (Q000)
561-561: Class OpUtil inherits from object; remove object inheritance (UP004)
608-608: Consider iterable unpacking instead of concatenation; replace with iterable unpacking (RUF005)
613-613: subprocess.run without explicit check argument; add explicit check=False (PLW1510)
640-640: Call startswith once with a tuple; merge into a single startswith call (PIE810)
657-657: import should be at the top-level of a file (PLC0415)
659-659: Using the global statement to update pipeline_name is discouraged (PLW0603)
659-659: Using the global statement to update operation_name is discouraged (PLW0603)
725-725: Use X | None for type annotations; convert to X | None (UP045)
738-738: Using global for pipeline_name but no assignment is done (PLW0602)
738-738: Using global for operation_name but no assignment is done (PLW0602)
769-769: No newline at end of file; add trailing newline (W292)
runtimes/rocm-tensorflow/ubi9-python-3.12/utils/bootstrapper.py
29-29: typing.Dict is deprecated, use dict instead (UP035)
31-31: typing.Type is deprecated, use type instead (UP035)
62-62: Use type instead of Type for type annotation; replace with type (UP006)
76-76: import should be at the top-level of a file (PLC0415)
77-77: import should be at the top-level of a file (PLC0415)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
301-301: Use X | None for type annotations; convert to X | None (UP045)
338-338: Use X | None for type annotations; convert to X | None (UP045)
338-338: Use dict instead of Dict for type annotation; replace with dict (UP006)
378-378: import should be at the top-level of a file (PLC0415)
405-405: import should be at the top-level of a file (PLC0415)
406-406: import should be at the top-level of a file (PLC0415)
412-412: Unpacked variable resources is never used; prefix it with an underscore or any other dummy variable pattern (RUF059)
432-432: import should be at the top-level of a file (PLC0415)
433-433: import should be at the top-level of a file (PLC0415)
448-448: Loop control variable name not used within loop body; rename unused name to _name (B007)
448-448: When using only the values of a dict use the values() method; replace .items() with .values() (PERF102)
474-474: Local variable python_script_name is assigned to but never used; remove the assignment (F841)
493-493: Single quotes found but double quotes preferred (Q000)
521-521: Local variable r_script_name is assigned to but never used; remove the assignment (F841)
538-538: Single quotes found but double quotes preferred (Q000)
561-561: Class OpUtil inherits from object; remove object inheritance (UP004)
608-608: Consider iterable unpacking instead of concatenation; replace with iterable unpacking (RUF005)
613-613: subprocess.run without explicit check argument; add explicit check=False (PLW1510)
640-640: Call startswith once with a tuple; merge into a single startswith call (PIE810)
657-657: import should be at the top-level of a file (PLC0415)
659-659: Using the global statement to update pipeline_name is discouraged (PLW0603)
659-659: Using the global statement to update operation_name is discouraged (PLW0603)
725-725: Use X | None for type annotations; convert to X | None (UP045)
738-738: Using global for pipeline_name but no assignment is done (PLW0602)
738-738: Using global for operation_name but no assignment is done (PLW0602)
769-769: No newline at end of file; add trailing newline (W292)
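Most of the Ruff findings above are mechanical modernizations that apply identically to every copy of bootstrapper.py. The before/after idioms look roughly like this; the names are illustrative, not lifted from the bootstrapper itself.
```python
# Illustrative fixes for the recurring Ruff rules above (UP006/UP035/UP045, PLW1510, PIE810, PERF102).
from __future__ import annotations

import subprocess

# UP006/UP035/UP045: builtin generics and `X | None` instead of typing.Dict/typing.Optional
def merge_env(overrides: dict[str, str] | None = None) -> dict[str, str]:
    return dict(overrides or {})

# PLW1510: pass an explicit check= argument to subprocess.run
def extract(archive: str) -> None:
    subprocess.run(["tar", "-zxvf", archive], check=True)

# PIE810: one startswith() call with a tuple instead of two chained checks
def is_remote(path: str) -> bool:
    return path.startswith(("http://", "https://"))

# B007/PERF102: iterate dict values directly when the keys are unused
def total_size(files: dict[str, int]) -> int:
    return sum(size for size in files.values())
```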
🪛 Checkov (3.2.334)
runtimes/datascience/ubi9-python-3.12/kustomize/base/pod.yaml
[MEDIUM] 2-22: Containers should not run with allowPrivilegeEscalation (CKV_K8S_20)
[MEDIUM] 2-22: Minimize the admission of root containers (CKV_K8S_23)
runtimes/minimal/ubi9-python-3.12/kustomize/base/pod.yaml
[MEDIUM] 2-22: Containers should not run with allowPrivilegeEscalation (CKV_K8S_20)
[MEDIUM] 2-22: Minimize the admission of root containers (CKV_K8S_23)
runtimes/pytorch/ubi9-python-3.12/kustomize/base/pod.yaml
[MEDIUM] 2-22: Containers should not run with allowPrivilegeEscalation (CKV_K8S_20)
[MEDIUM] 2-22: Minimize the admission of root containers (CKV_K8S_23)
runtimes/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml
[MEDIUM] 2-20: Containers should not run with allowPrivilegeEscalation (CKV_K8S_20)
[MEDIUM] 2-20: Minimize the admission of root containers (CKV_K8S_23)
runtimes/pytorch/ubi9-python-3.12/kustomize/overlays/accelerator/cuda/pod-patch.yaml
[MEDIUM] 2-13: Containers should not run with allowPrivilegeEscalation (CKV_K8S_20)
[MEDIUM] 2-13: Minimize the admission of root containers (CKV_K8S_23)
runtimes/rocm-pytorch/ubi9-python-3.12/kustomize/base/pod.yaml
[MEDIUM] 2-22: Containers should not run with allowPrivilegeEscalation (CKV_K8S_20)
[MEDIUM] 2-22: Minimize the admission of root containers (CKV_K8S_23)
runtimes/rocm-tensorflow/ubi9-python-3.12/kustomize/base/pod.yaml
[MEDIUM] 2-22: Containers should not run with allowPrivilegeEscalation (CKV_K8S_20)
[MEDIUM] 2-22: Minimize the admission of root containers (CKV_K8S_23)
runtimes/tensorflow/ubi9-python-3.12/kustomize/base/pod.yaml
[MEDIUM] 2-22: Containers should not run with allowPrivilegeEscalation (CKV_K8S_20)
[MEDIUM] 2-22: Minimize the admission of root containers (CKV_K8S_23)
🪛 LanguageTool
runtimes/datascience/ubi9-python-3.12/utils/requirements-elyra.txt
[grammar] ~6-~6: Consider using either the past participle “needed” or the present participle “needing” here.
Context: ...kfp.py#L1099, # in case the details are need please follow the comprehensive list of...
(BEEN_PART_AGREEMENT)
runtimes/minimal/ubi9-python-3.12/utils/requirements-elyra.txt
[grammar] ~6-~6: Consider using either the past participle “needed” or the present participle “needing” here.
Context: ...kfp.py#L1099, # in case the details are need please follow the comprehensive list of...
(BEEN_PART_AGREEMENT)
runtimes/pytorch/ubi9-python-3.12/utils/requirements-elyra.txt
[grammar] ~6-~6: Consider using either the past participle “needed” or the present participle “needing” here.
Context: ...kfp.py#L1099, # in case the details are need please follow the comprehensive list of...
(BEEN_PART_AGREEMENT)
runtimes/rocm-pytorch/ubi9-python-3.12/utils/requirements-elyra.txt
[grammar] ~6-~6: Consider using either the past participle “needed” or the present participle “needing” here.
Context: ...kfp.py#L1099, # in case the details are need please follow the comprehensive list of...
(BEEN_PART_AGREEMENT)
runtimes/rocm-tensorflow/ubi9-python-3.12/utils/requirements-elyra.txt
[grammar] ~6-~6: Consider using either the past participle “needed” or the present participle “needing” here.
Context: ...kfp.py#L1099, # in case the details are need please follow the comprehensive list of...
(BEEN_PART_AGREEMENT)
runtimes/tensorflow/ubi9-python-3.12/utils/requirements-elyra.txt
[grammar] ~6-~6: Consider using either the past participle “needed” or the present participle “needing” here.
Context: ...kfp.py#L1099, # in case the details are need please follow the comprehensive list of...
(BEEN_PART_AGREEMENT)
🪛 GitHub Check: Trivy
runtimes/minimal/ubi9-python-3.12/requirements.txt
[failure] 289-289: Jupyter Core on Windows Has Uncontrolled Search Path Element Local Privilege Escalation Vulnerability
Package: jupyter-core
Installed Version: 5.7.2
Vulnerability CVE-2025-30167
Severity: HIGH
Fixed Version: 5.8.1
Link: CVE-2025-30167
🔇 Additional comments (3)
runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda (1)
1-177
: LGTM: Well-structured CUDA TensorFlow runtime Dockerfile
The Dockerfile follows established patterns for CUDA-enabled containers with proper multi-stage builds, NVIDIA upstream consistency, and appropriate environment configuration for TensorFlow with CUDA support.
runtimes/pytorch/ubi9-python-3.12/utils/bootstrapper.py (1)
1-769
: LGTM: Comprehensive KFP bootstrapper implementation
The bootstrapper provides robust support for executing notebooks, Python scripts, and R scripts within Kubeflow Pipelines environments, with proper cloud object storage integration, metadata handling, and error management.
runtimes/tensorflow/ubi9-python-3.12/utils/bootstrapper.py (1)
1-769
: LGTM: Consistent bootstrapper implementation
The bootstrapper implementation is consistent with other runtime environments and provides the necessary functionality for KFP pipeline execution with proper cloud object storage and metadata integration.
"pure-eval": { | ||
"hashes": [ | ||
"sha256:1db8e35b67b3d218d818ae653e27f06c3aa420901fa7b081ca98cbedc874e0d0", | ||
"sha256:5f4e983f40564c576c7c8635ae88db5956bb2229d7e9237d03b3c0b0190eaf42" | ||
], | ||
"version": "==0.2.3" | ||
}, | ||
"pycparser": { | ||
"hashes": [ |
Check failure
Code scanning / Trivy
Jupyter Core on Windows Has Uncontrolled Search Path Element Local Privilege Escalation Vulnerability (High, library)
Installed Version: 5.7.2
Vulnerability CVE-2025-30167
Severity: HIGH
Fixed Version: 5.8.1
Link: CVE-2025-30167
Actionable comments posted: 6
♻️ Duplicate comments (17)
runtimes/rocm-pytorch/ubi9-python-3.12/kustomize/base/pod.yaml (1)
8-22
: Missing securityContext violates OpenShift restricted SCC
Same observation already raised earlier; reiterating for visibility.
Add runAsNonRoot, allowPrivilegeEscalation: false, and seccompProfile to avoid pod admission failure.
runtimes/pytorch/ubi9-python-3.12/kustomize/overlays/accelerator/cuda/pod-patch.yaml (1)
7-13
: Still missing toleration, requests & securityContext – see earlier review
The overlay only sets limits.nvidia.com/gpu and omits:
• toleration for nvidia.com/gpu=true:NoSchedule
• matching requests entry
• container-level securityContext (runAsNonRoot, no privilege-escalation)
The same feedback was already provided on the previous commit; please reuse that patch.
runtimes/datascience/ubi9-python-3.12/kustomize/base/pod.yaml (1)
9-22
: Add a restrictive securityContext to keep the pod schedulable
OpenShift's restricted SCC (and PSA "restricted") will refuse pods that run as root or allow privilege escalation.
Add the following under the runtime container:
  imagePullPolicy: Always
+ securityContext:
+   allowPrivilegeEscalation: false
+   runAsNonRoot: true
+   seccompProfile:
+     type: RuntimeDefault
Without it, runtime tests will time-out on clusters enforcing SCC/PSA.
runtimes/rocm-tensorflow/ubi9-python-3.12/kustomize/base/pod.yaml (1)
9-22
: Missing securityContext – duplicate of prior feedback
Same issue as flagged for other runtimes: container runs as root and allows privilege escalation, causing SCC/PSA rejection.
runtimes/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml (1)
7-20
: Harden container & bound tmpfs size
Same SCC constraints apply here.
Also, the emptyDir volume has no sizeLimit, so a runaway process can OOM the node.
  - name: runtime
+   securityContext:
+     runAsNonRoot: true
+     allowPrivilegeEscalation: false
+     seccompProfile:
+       type: RuntimeDefault
+     capabilities:
+       drop: ["ALL"]
    resources:
      limits:
        memory: 6Gi
+       cpu: "4"
      requests:
        memory: 6Gi
+       cpu: "4"
@@
    emptyDir:
      medium: Memory
+     sizeLimit: 2Gi
runtimes/tensorflow/ubi9-python-3.12/kustomize/base/pod.yaml (1)
9-22
: Missing securityContext will break on OpenShift — see earlier review.
runtimes/minimal/ubi9-python-3.12/requirements.txt (2)
289-291
: jupyter-core==5.7.2 vulnerable (CVE-2025-30167) – bump to 5.8.1+
783-787
: six==1.17.0 does not exist on PyPI – build will 404
runtimes/tensorflow/ubi9-python-3.12/utils/bootstrapper.py (1)
619-621
: Add Python 3.12 to supported versions.
The bootstrapper is being added for a Python 3.12 runtime but the version check doesn't include Python 3.12. This will cause the bootstrapper to fail with an error message.
- if sys.version_info.minor in [8, 9, 10, 11]:
+ if sys.version_info.minor in [8, 9, 10, 11, 12]:
runtimes/minimal/ubi9-python-3.12/utils/bootstrapper.py (2)
135-135
: Add error handling for tar extraction.
The tar extraction command uses subprocess.call() without error checking, which could silently fail and cause issues downstream.
- subprocess.call(["tar", "-zxvf", archive_file])
+ result = subprocess.run(["tar", "-zxvf", archive_file], capture_output=True, text=True)
+ if result.returncode != 0:
+     raise RuntimeError(f"Failed to extract archive {archive_file}: {result.stderr}")
618-626
: Add Python 3.12 support to Elyra requirements determination.
The current implementation only supports Python 3.8-3.11, but this is a Python 3.12 runtime environment. This will cause the bootstrapper to fail at runtime.
 @classmethod
 def determine_elyra_requirements(cls) -> Any:
     if sys.version_info.major == 3:
-        if sys.version_info.minor in [8, 9, 10, 11]:
+        if sys.version_info.minor in [8, 9, 10, 11, 12]:
             return "requirements-elyra.txt"
     logger.error(
         f"This version of Python '{sys.version_info.major}.{sys.version_info.minor}' "
         f"is not supported for Elyra generic components"
     )
     return None
runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py (2)
1-769
: Code duplication issue already identified.
This bootstrapper duplication across runtime environments has already been flagged and a comprehensive GitHub issue has been created to address the consolidation of identical implementations.
618-626
: Python 3.12 support issue already identified.
The missing Python 3.12 support in the determine_elyra_requirements method has already been flagged in previous review comments.
runtimes/rocm-tensorflow/ubi9-python-3.12/utils/bootstrapper.py (3)
1-769
: Known issue: Bootstrapper code duplication
Based on the retrieved learnings, this is a duplicate of the bootstrapper code that exists in multiple runtime environments. This duplication issue is already being tracked for consolidation to improve maintainability and consistency across runtime environments.
135-135
: Use subprocess.run with explicit error handling
Replace subprocess.call with subprocess.run for better error handling.
- subprocess.call(["tar", "-zxvf", archive_file])
+ subprocess.run(["tar", "-zxvf", archive_file], check=True)
617-626
: Critical: Add Python 3.12 support
The determine_elyra_requirements method doesn't support Python 3.12, which will cause the bootstrapper to fail in this Python 3.12 runtime environment.
 def determine_elyra_requirements(cls) -> Any:
     if sys.version_info.major == 3:
-        if sys.version_info.minor in [8, 9, 10, 11]:
+        if sys.version_info.minor in [8, 9, 10, 11, 12]:
             return "requirements-elyra.txt"
     logger.error(
         f"This version of Python '{sys.version_info.major}.{sys.version_info.minor}' "
         f"is not supported for Elyra generic components"
     )
     return None
runtimes/pytorch/ubi9-python-3.12/utils/bootstrapper.py (1)
619-626
: Add Python 3.12 support to version check
The version check doesn't include Python 3.12, which is the target version for this runtime image.
 @classmethod
 def determine_elyra_requirements(cls) -> Any:
     if sys.version_info.major == 3:
-        if sys.version_info.minor in [8, 9, 10, 11]:
+        if sys.version_info.minor in [8, 9, 10, 11, 12]:
             return "requirements-elyra.txt"
     logger.error(
         f"This version of Python '{sys.version_info.major}.{sys.version_info.minor}' "
         f"is not supported for Elyra generic components"
     )
     return None
🧹 Nitpick comments (7)
runtimes/tensorflow/ubi9-python-3.12/utils/requirements-elyra.txt (1)
4-7
: Fix minor grammar in explanatory comment
“…in case the details are need…” should be “…in case the details are needed…”.
This small polish keeps the placeholder file professional.

-# in case the details are need please follow the comprehensive list of python dependencies present here:
+# in case the details are needed please follow the comprehensive list of python dependencies present here:

runtimes/rocm-tensorflow/ubi9-python-3.12/utils/requirements-elyra.txt (1)
4-7
: Fix minor grammar in explanatory comment
Same wording issue as in the TensorFlow runtime placeholder.

-# in case the details are need please follow the comprehensive list of python dependencies present here:
+# in case the details are needed please follow the comprehensive list of python dependencies present here:

runtimes/pytorch/ubi9-python-3.12/utils/requirements-elyra.txt (1)
4-7
: Fix minor grammar in explanatory comment

-# in case the details are need please follow the comprehensive list of python dependencies present here:
+# in case the details are needed please follow the comprehensive list of python dependencies present here:

runtimes/rocm-pytorch/ubi9-python-3.12/utils/requirements-elyra.txt (1)
4-7
: Fix minor grammar in explanatory comment

-# in case the details are need please follow the comprehensive list of python dependencies present here:
+# in case the details are needed please follow the comprehensive list of python dependencies present here:

runtimes/datascience/ubi9-python-3.12/utils/requirements-elyra.txt (1)
6-6
: Typo: use “needed” instead of “need”

-# in case the details are need please follow the comprehensive list of python dependencies present here:
+# in case the details are needed please follow the comprehensive list of python dependencies present here:

runtimes/minimal/ubi9-python-3.12/utils/requirements-elyra.txt (1)
4-7
: Fix minor grammar + clarify comment

The phrase “in case the details are need” should read “in case the details are needed”. While trivial, polishing comments avoids confusion for future maintainers.

-# in case the details are need please follow the comprehensive list of python dependencies present here:
+# In case the details are needed, please follow the comprehensive list of python dependencies present here:

runtimes/pytorch/ubi9-python-3.12/kustomize/base/pod.yaml (1)
12-13
: Avoid log spam – change placeholder command

The placeholder command while true; do date; … prints a line every second and can quickly fill cluster logs. Throttle it (for example, sleep between iterations) or replace it with a no-op health probe when the real entrypoint is unknown.
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (6)
- runtimes/datascience/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
- runtimes/minimal/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
- runtimes/pytorch/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
- runtimes/rocm-pytorch/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
- runtimes/rocm-tensorflow/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
- runtimes/tensorflow/ubi9-python-3.12/Pipfile.lock is excluded by !**/*.lock
📒 Files selected for processing (49)
- Makefile (2 hunks)
- runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu (1 hunks)
- runtimes/datascience/ubi9-python-3.12/Pipfile (1 hunks)
- runtimes/datascience/ubi9-python-3.12/kustomize/base/kustomization.yaml (1 hunks)
- runtimes/datascience/ubi9-python-3.12/kustomize/base/pod.yaml (1 hunks)
- runtimes/datascience/ubi9-python-3.12/utils/bootstrapper.py (1 hunks)
- runtimes/datascience/ubi9-python-3.12/utils/pip.conf (1 hunks)
- runtimes/datascience/ubi9-python-3.12/utils/requirements-elyra.txt (1 hunks)
- runtimes/minimal/ubi9-python-3.12/Dockerfile.cpu (1 hunks)
- runtimes/minimal/ubi9-python-3.12/Pipfile (1 hunks)
- runtimes/minimal/ubi9-python-3.12/kustomize/base/kustomization.yaml (1 hunks)
- runtimes/minimal/ubi9-python-3.12/kustomize/base/pod.yaml (1 hunks)
- runtimes/minimal/ubi9-python-3.12/requirements.txt (1 hunks)
- runtimes/minimal/ubi9-python-3.12/utils/bootstrapper.py (1 hunks)
- runtimes/minimal/ubi9-python-3.12/utils/pip.conf (1 hunks)
- runtimes/minimal/ubi9-python-3.12/utils/requirements-elyra.txt (1 hunks)
- runtimes/pytorch/ubi9-python-3.12/Dockerfile.cuda (1 hunks)
- runtimes/pytorch/ubi9-python-3.12/Pipfile (1 hunks)
- runtimes/pytorch/ubi9-python-3.12/kustomize/base/kustomization.yaml (1 hunks)
- runtimes/pytorch/ubi9-python-3.12/kustomize/base/pod.yaml (1 hunks)
- runtimes/pytorch/ubi9-python-3.12/kustomize/components/accelerator/kustomization.yaml (1 hunks)
- runtimes/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml (1 hunks)
- runtimes/pytorch/ubi9-python-3.12/kustomize/overlays/accelerator/cuda/kustomization.yaml (1 hunks)
- runtimes/pytorch/ubi9-python-3.12/kustomize/overlays/accelerator/cuda/pod-patch.yaml (1 hunks)
- runtimes/pytorch/ubi9-python-3.12/utils/bootstrapper.py (1 hunks)
- runtimes/pytorch/ubi9-python-3.12/utils/pip.conf (1 hunks)
- runtimes/pytorch/ubi9-python-3.12/utils/requirements-elyra.txt (1 hunks)
- runtimes/rocm-pytorch/ubi9-python-3.12/Dockerfile.rocm (1 hunks)
- runtimes/rocm-pytorch/ubi9-python-3.12/Pipfile (1 hunks)
- runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh (1 hunks)
- runtimes/rocm-pytorch/ubi9-python-3.12/kustomize/base/kustomization.yaml (1 hunks)
- runtimes/rocm-pytorch/ubi9-python-3.12/kustomize/base/pod.yaml (1 hunks)
- runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py (1 hunks)
- runtimes/rocm-pytorch/ubi9-python-3.12/utils/pip.conf (1 hunks)
- runtimes/rocm-pytorch/ubi9-python-3.12/utils/requirements-elyra.txt (1 hunks)
- runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm (1 hunks)
- runtimes/rocm-tensorflow/ubi9-python-3.12/Pipfile (1 hunks)
- runtimes/rocm-tensorflow/ubi9-python-3.12/kustomize/base/kustomization.yaml (1 hunks)
- runtimes/rocm-tensorflow/ubi9-python-3.12/kustomize/base/pod.yaml (1 hunks)
- runtimes/rocm-tensorflow/ubi9-python-3.12/utils/bootstrapper.py (1 hunks)
- runtimes/rocm-tensorflow/ubi9-python-3.12/utils/pip.conf (1 hunks)
- runtimes/rocm-tensorflow/ubi9-python-3.12/utils/requirements-elyra.txt (1 hunks)
- runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda (1 hunks)
- runtimes/tensorflow/ubi9-python-3.12/Pipfile (1 hunks)
- runtimes/tensorflow/ubi9-python-3.12/kustomize/base/kustomization.yaml (1 hunks)
- runtimes/tensorflow/ubi9-python-3.12/kustomize/base/pod.yaml (1 hunks)
- runtimes/tensorflow/ubi9-python-3.12/utils/bootstrapper.py (1 hunks)
- runtimes/tensorflow/ubi9-python-3.12/utils/pip.conf (1 hunks)
- runtimes/tensorflow/ubi9-python-3.12/utils/requirements-elyra.txt (1 hunks)
✅ Files skipped from review due to trivial changes (4)
- runtimes/rocm-pytorch/ubi9-python-3.12/Pipfile
- runtimes/rocm-pytorch/ubi9-python-3.12/de-vendor-torch.sh
- runtimes/rocm-tensorflow/ubi9-python-3.12/Pipfile
- runtimes/rocm-tensorflow/ubi9-python-3.12/Dockerfile.rocm
🚧 Files skipped from review as they are similar to previous changes (23)
- runtimes/pytorch/ubi9-python-3.12/kustomize/components/accelerator/kustomization.yaml
- runtimes/tensorflow/ubi9-python-3.12/utils/pip.conf
- runtimes/rocm-pytorch/ubi9-python-3.12/utils/pip.conf
- runtimes/pytorch/ubi9-python-3.12/utils/pip.conf
- runtimes/rocm-tensorflow/ubi9-python-3.12/utils/pip.conf
- runtimes/minimal/ubi9-python-3.12/utils/pip.conf
- runtimes/pytorch/ubi9-python-3.12/kustomize/overlays/accelerator/cuda/kustomization.yaml
- runtimes/datascience/ubi9-python-3.12/kustomize/base/kustomization.yaml
- runtimes/datascience/ubi9-python-3.12/utils/pip.conf
- runtimes/pytorch/ubi9-python-3.12/kustomize/base/kustomization.yaml
- runtimes/rocm-pytorch/ubi9-python-3.12/kustomize/base/kustomization.yaml
- runtimes/minimal/ubi9-python-3.12/Pipfile
- runtimes/minimal/ubi9-python-3.12/kustomize/base/kustomization.yaml
- Makefile
- runtimes/tensorflow/ubi9-python-3.12/kustomize/base/kustomization.yaml
- runtimes/rocm-tensorflow/ubi9-python-3.12/kustomize/base/kustomization.yaml
- runtimes/datascience/ubi9-python-3.12/Pipfile
- runtimes/pytorch/ubi9-python-3.12/Pipfile
- runtimes/pytorch/ubi9-python-3.12/Dockerfile.cuda
- runtimes/datascience/ubi9-python-3.12/Dockerfile.cpu
- runtimes/tensorflow/ubi9-python-3.12/Pipfile
- runtimes/tensorflow/ubi9-python-3.12/Dockerfile.cuda
- runtimes/rocm-pytorch/ubi9-python-3.12/Dockerfile.rocm
🧰 Additional context used
🧠 Learnings (23)
📓 Common learnings
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py:1-769
Timestamp: 2025-07-07T14:18:37.212Z
Learning: jiridanek requested GitHub issue creation for bootstrapper code duplication problem in runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py during PR #1333 review. A comprehensive issue was created with detailed problem description covering maintenance overhead and consistency risks from duplicate implementations across multiple runtime environments, four solution options (symlinks, import-based, template-based, direct shared import) with pros/cons analysis, clear acceptance criteria for consolidation and maintainability, step-by-step implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. A comprehensive issue was created covering race conditions, failure detection, process lifecycle coupling, and signal handling with detailed problem descriptions, multiple solution options, phased acceptance criteria, testing approach, and proper context linking, following the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. Issue #1298 was successfully created with comprehensive problem description covering race conditions, failure detection, orphaned processes, and signal handling, along with multiple solution options, phased acceptance criteria, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: atheo89
PR: opendatahub-io/notebooks#1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:32-32
Timestamp: 2025-07-07T11:08:48.524Z
Learning: atheo89 requested GitHub issue creation for multi-architecture Dockerfile improvements during PR #1258 review, specifically for enhancing structural consistency across Docker stages, replacing $(uname -m) with ${TARGETARCH} for cross-architecture builds, and adding OCI-compliant metadata labels. Issue #1332 was created with comprehensive problem description, phased implementation approach, detailed acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/api/kernels/access.cgi:6-6
Timestamp: 2025-07-03T16:17:05.475Z
Learning: jiridanek requested GitHub issue creation for CGI script health-check URL configurability and timeout improvement in codeserver/ubi9-python-3.12/nginx/api/kernels/access.cgi during PR #1269 review. The request follows the established pattern of systematic code quality improvements with comprehensive issue creation covering problem description, solution details, acceptance criteria, implementation guidance, and proper context linking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:1-2
Timestamp: 2025-07-03T16:08:47.251Z
Learning: jiridanek requested GitHub issue creation for shell strict mode improvement in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. Issue #1310 was created with comprehensive problem description covering silent failures, production risks, implementation guidance with code examples, acceptance criteria, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements for the codeserver image entrypoint script.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user:4-9
Timestamp: 2025-07-03T16:05:35.448Z
Learning: jiridanek requested GitHub issue creation for shell script error handling improvements in codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user during PR #1269 review. A comprehensive issue was created covering silent failures, unquoted variable expansions, missing template validation, and strict mode implementation with detailed problem descriptions, phased acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/minimal/ubi9-python-3.12/Dockerfile.cpu:21-24
Timestamp: 2025-07-01T06:48:13.154Z
Learning: jiridanek creates comprehensive follow-up issues from review comments that expand scope appropriately, include clear acceptance criteria, proper backlinks, and structured implementation guidance. Issue #1241 demonstrates this by turning a specific oc client checksum concern into a thorough security enhancement plan covering all downloaded binaries across the Python 3.12 implementation.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1247
File: .github/workflows/build-notebooks-TEMPLATE.yaml:50-53
Timestamp: 2025-07-01T14:36:52.852Z
Learning: In the opendatahub-io/notebooks repository, the test runner's Python version (configured in GitHub Actions UV setup) intentionally doesn't need to match the Python version of the container images being tested. jiridanek's team uses Python 3.12 for running tests while images may use different Python versions (like 3.11), and this approach works fine since the test code is separate from the application code running inside the containers.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow with runtime ROCm configuration is the recommended alternative approach.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. The tensorflow-rocm upstream project appears abandoned with the last release in 2019. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow 2.18+ with runtime ROCm configuration is the recommended and industry-standard approach, as modern TensorFlow automatically detects and utilizes ROCm when properly installed.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: TrustyAI's jupyter-bokeh was pinned to 3.0.5 due to compatibility requirements with TrustyAI's visualization components, but the actual deployed version in requirements.txt shows 3.0.7, indicating incremental testing. The upgrade to 4.0.5 in this PR represents the completion of a gradual migration strategy from the 3.x series after confirming compatibility with Bokeh 3.7.3.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: In the opendatahub-io/notebooks repository, TensorFlow packages with `extras = ["and-cuda"]` can cause build conflicts on macOS due to platform-specific CUDA packages. When the Dockerfile installs CUDA system-wide, removing the extras and letting TensorFlow find CUDA at runtime resolves these conflicts.
runtimes/minimal/ubi9-python-3.12/Dockerfile.cpu (15)
Learnt from: atheo89
PR: opendatahub-io/notebooks#1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:32-32
Timestamp: 2025-07-07T11:08:48.524Z
Learning: atheo89 requested GitHub issue creation for multi-architecture Dockerfile improvements during PR #1258 review, specifically for enhancing structural consistency across Docker stages, replacing $(uname -m) with ${TARGETARCH} for cross-architecture builds, and adding OCI-compliant metadata labels. Issue #1332 was created with comprehensive problem description, phased implementation approach, detailed acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py:1-769
Timestamp: 2025-07-07T14:18:37.212Z
Learning: jiridanek requested GitHub issue creation for bootstrapper code duplication problem in runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py during PR #1333 review. A comprehensive issue was created with detailed problem description covering maintenance overhead and consistency risks from duplicate implementations across multiple runtime environments, four solution options (symlinks, import-based, template-based, direct shared import) with pros/cons analysis, clear acceptance criteria for consolidation and maintainability, step-by-step implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:135-136
Timestamp: 2025-07-04T05:52:49.464Z
Learning: jiridanek requested GitHub issue creation for improving fragile sed-based Jupyter kernel display_name modification in jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu during PR #1306 review. Issue #1321 was created with comprehensive problem description covering JSON corruption risks, greedy regex patterns, maintenance burden, and proposed Python-based JSON parsing solution with detailed acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1247
File: .github/workflows/build-notebooks-TEMPLATE.yaml:50-53
Timestamp: 2025-07-01T14:36:52.852Z
Learning: In the opendatahub-io/notebooks repository, the test runner's Python version (configured in GitHub Actions UV setup) intentionally doesn't need to match the Python version of the container images being tested. jiridanek's team uses Python 3.12 for running tests while images may use different Python versions (like 3.11), and this approach works fine since the test code is separate from the application code running inside the containers.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. A comprehensive issue was created covering race conditions, failure detection, process lifecycle coupling, and signal handling with detailed problem descriptions, multiple solution options, phased acceptance criteria, testing approach, and proper context linking, following the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:4-4
Timestamp: 2025-07-04T05:49:10.314Z
Learning: jiridanek directs base image pinning security concerns to existing comprehensive issue #1242 "Improve Docker FROM image versioning by avoiding :latest tags" rather than addressing them in individual PRs, continuing the established pattern of systematic security and quality tracking in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. Issue #1298 was successfully created with comprehensive problem description covering race conditions, failure detection, orphaned processes, and signal handling, along with multiple solution options, phased acceptance criteria, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Dockerfile.rocm:34-37
Timestamp: 2025-07-02T18:19:23.024Z
Learning: In the opendatahub-io/notebooks repository, issue #1241 "Security: Add checksum verification for downloaded binaries in Python 3.12 images" covers the security concern about verifying downloaded binaries including the oc client. Red Hat does not provide signed RPM packages for the OpenShift CLI (oc) in UBI9 official repositories, making manual binary download with checksum verification the recommended approach.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:35:34.805Z
Learning: In the opendatahub-io/notebooks repository, mounting emptyDir volumes over /opt/app-root/src is intentional behavior that matches production deployment patterns where odh-dashboard mounts empty PVCs at this location (the $HOME directory). This mounting is expected to hide base image content.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: OpenShift CI infrastructure issues in opendatahub-io/notebooks can manifest as "ContainerFailed one or more containers exited" errors in release steps, or as "Entrypoint received interrupt: terminated" messages when pods are killed during CI runs. These are typically infrastructure-level issues rather than code problems.
runtimes/tensorflow/ubi9-python-3.12/utils/bootstrapper.py (10)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py:1-769
Timestamp: 2025-07-07T14:18:37.212Z
Learning: jiridanek requested GitHub issue creation for bootstrapper code duplication problem in runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py during PR #1333 review. A comprehensive issue was created with detailed problem description covering maintenance overhead and consistency risks from duplicate implementations across multiple runtime environments, four solution options (symlinks, import-based, template-based, direct shared import) with pros/cons analysis, clear acceptance criteria for consolidation and maintainability, step-by-step implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1247
File: .github/workflows/build-notebooks-TEMPLATE.yaml:50-53
Timestamp: 2025-07-01T14:36:52.852Z
Learning: In the opendatahub-io/notebooks repository, the test runner's Python version (configured in GitHub Actions UV setup) intentionally doesn't need to match the Python version of the container images being tested. jiridanek's team uses Python 3.12 for running tests while images may use different Python versions (like 3.11), and this approach works fine since the test code is separate from the application code running inside the containers.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow with runtime ROCm configuration is the recommended alternative approach.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:71-88
Timestamp: 2025-07-04T06:05:30.580Z
Learning: jiridanek requested GitHub issue creation for TrustyAI test notebook URL configurability and network error handling improvements during PR #1306 review. Issue #1323 was created with ⚠️ emoji in title for visibility, comprehensive problem description covering incorrect hardcoded URLs (pointing to Python 3.11 instead of 3.12), missing network error handling, maintenance burden, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. The tensorflow-rocm upstream project appears abandoned with the last release in 2019. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow 2.18+ with runtime ROCm configuration is the recommended and industry-standard approach, as modern TensorFlow automatically detects and utilizes ROCm when properly installed.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:1-2
Timestamp: 2025-07-03T16:08:47.251Z
Learning: jiridanek requested GitHub issue creation for shell strict mode improvement in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. Issue #1310 was created with comprehensive problem description covering silent failures, production risks, implementation guidance with code examples, acceptance criteria, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements for the codeserver image entrypoint script.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/utils/process.sh:1-2
Timestamp: 2025-07-03T16:00:46.191Z
Learning: jiridanek requested GitHub issue creation for shell script strict-mode improvement in codeserver/ubi9-python-3.12/utils/process.sh during PR #1269 review. Issue #1303 was created with comprehensive problem description covering silent failures and production risks, phased acceptance criteria for basic strict-mode implementation and enhanced error handling, implementation guidance with code examples and flag explanations, benefits section, and proper context linking, continuing the established pattern of systematic code quality improvements.
runtimes/pytorch/ubi9-python-3.12/utils/bootstrapper.py (10)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py:1-769
Timestamp: 2025-07-07T14:18:37.212Z
Learning: jiridanek requested GitHub issue creation for bootstrapper code duplication problem in runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py during PR #1333 review. A comprehensive issue was created with detailed problem description covering maintenance overhead and consistency risks from duplicate implementations across multiple runtime environments, four solution options (symlinks, import-based, template-based, direct shared import) with pros/cons analysis, clear acceptance criteria for consolidation and maintainability, step-by-step implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1247
File: .github/workflows/build-notebooks-TEMPLATE.yaml:50-53
Timestamp: 2025-07-01T14:36:52.852Z
Learning: In the opendatahub-io/notebooks repository, the test runner's Python version (configured in GitHub Actions UV setup) intentionally doesn't need to match the Python version of the container images being tested. jiridanek's team uses Python 3.12 for running tests while images may use different Python versions (like 3.11), and this approach works fine since the test code is separate from the application code running inside the containers.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow with runtime ROCm configuration is the recommended alternative approach.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. The tensorflow-rocm upstream project appears abandoned with the last release in 2019. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow 2.18+ with runtime ROCm configuration is the recommended and industry-standard approach, as modern TensorFlow automatically detects and utilizes ROCm when properly installed.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:71-88
Timestamp: 2025-07-04T06:05:30.580Z
Learning: jiridanek requested GitHub issue creation for TrustyAI test notebook URL configurability and network error handling improvements during PR #1306 review. Issue #1323 was created with ⚠️ emoji in title for visibility, comprehensive problem description covering incorrect hardcoded URLs (pointing to Python 3.11 instead of 3.12), missing network error handling, maintenance burden, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-26T16:21:12.994Z
Learning: When Playwright Docker image version in `.github/workflows/build-notebooks-TEMPLATE.yaml` doesn't match the `@playwright/test` version in `tests/browser/package.json`, it results in a runtime error: "Executable doesn't exist at /ms-playwright/webkit-2182/pw_run.sh" with a helpful message box showing current vs required versions (e.g., "current: mcr.microsoft.com/playwright:v1.52.0-noble" vs "required: mcr.microsoft.com/playwright:v1.53.1-noble").
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py (28)
<retrieved_learning>
Learnt from: jiridanek
PR: #1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py:1-769
Timestamp: 2025-07-07T14:18:37.212Z
Learning: jiridanek requested GitHub issue creation for bootstrapper code duplication problem in runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py during PR #1333 review. A comprehensive issue was created with detailed problem description covering maintenance overhead and consistency risks from duplicate implementations across multiple runtime environments, four solution options (symlinks, import-based, template-based, direct shared import) with pros/cons analysis, clear acceptance criteria for consolidation and maintainability, step-by-step implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
</retrieved_learning>
<retrieved_learning>
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
</retrieved_learning>
<retrieved_learning>
Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
</retrieved_learning>
<retrieved_learning>
Learnt from: jiridanek
PR: #1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. A comprehensive issue was created covering race conditions, failure detection, process lifecycle coupling, and signal handling with detailed problem descriptions, multiple solution options, phased acceptance criteria, testing approach, and proper context linking, following the established pattern of systematic code quality improvements.
</retrieved_learning>
<retrieved_learning>
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
</retrieved_learning>
<retrieved_learning>
Learnt from: jiridanek
PR: #1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:71-76
Timestamp: 2025-07-04T06:04:43.085Z
Learning: jiridanek requested GitHub issue creation for duplicate CSV loading and validation problem in jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb during PR #1306 review. Issue #1322 was created with comprehensive problem description covering code redundancy, runtime failure risks, network inefficiency, and test reliability concerns, along with detailed solution including duplicate line removal, data validation implementation, repository-wide audit, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
</retrieved_learning>
<retrieved_learning>
Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow with runtime ROCm configuration is the recommended alternative approach.
</retrieved_learning>
<retrieved_learning>
Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.
</retrieved_learning>
<retrieved_learning>
Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. The tensorflow-rocm upstream project appears abandoned with the last release in 2019. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow 2.18+ with runtime ROCm configuration is the recommended and industry-standard approach, as modern TensorFlow automatically detects and utilizes ROCm when properly installed.
</retrieved_learning>
<retrieved_learning>
Learnt from: jiridanek
PR: #1247
File: .github/workflows/build-notebooks-TEMPLATE.yaml:50-53
Timestamp: 2025-07-01T14:36:52.852Z
Learning: In the opendatahub-io/notebooks repository, the test runner's Python version (configured in GitHub Actions UV setup) intentionally doesn't need to match the Python version of the container images being tested. jiridanek's team uses Python 3.12 for running tests while images may use different Python versions (like 3.11), and this approach works fine since the test code is separate from the application code running inside the containers.
</retrieved_learning>
<retrieved_learning>
Learnt from: jiridanek
PR: #1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:71-88
Timestamp: 2025-07-04T06:05:30.580Z
Learning: jiridanek requested GitHub issue creation for TrustyAI test notebook URL configurability and network error handling improvements during PR #1306 review. Issue #1323 was created with ⚠️ emoji in title for visibility, comprehensive problem description covering incorrect hardcoded URLs (pointing to Python 3.11 instead of 3.12), missing network error handling, maintenance burden, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
</retrieved_learning>
<retrieved_learning>
Learnt from: jiridanek
PR: #1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: TrustyAI's jupyter-bokeh was pinned to 3.0.5 due to compatibility requirements with TrustyAI's visualization components, but the actual deployed version in requirements.txt shows 3.0.7, indicating incremental testing. The upgrade to 4.0.5 in this PR represents the completion of a gradual migration strategy from the 3.x series after confirming compatibility with Bokeh 3.7.3.
</retrieved_learning>
<retrieved_learning>
Learnt from: jiridanek
PR: #1269
File: codeserver/ubi9-python-3.12/requirements.txt:435-444
Timestamp: 2025-07-03T13:59:55.040Z
Learning: jiridanek requested GitHub issue creation for numpy/scipy compatibility investigation in PR #1269, specifically for cases where theoretical version conflicts don't manifest as actual build failures. Issue #1297 was created with comprehensive investigation framework, acceptance criteria, runtime testing approach, and proper context linking, demonstrating the pattern of creating investigation-type issues for apparent but non-blocking technical concerns that require deeper understanding.
</retrieved_learning>
<retrieved_learning>
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-03T13:33:55.209Z
Learning: jiridanek requested GitHub issue creation for smart-open library S3/GCS backend compatibility concern in PR #1269. Issue #1296 was successfully created with comprehensive problem description, acceptance criteria, implementation guidance, testing approach, and proper context linking, following the established pattern of systematic issue tracking for technical improvements.
</retrieved_learning>
<retrieved_learning>
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-03T12:44:00.167Z
Learning: jiridanek requested GitHub issue creation for smart-open library S3/GCS backend compatibility concern in PR #1269. Issue #1289 was successfully created with comprehensive problem description, acceptance criteria, implementation guidance, testing approach, and proper context linking, following the established pattern of systematic issue tracking for technical improvements.
</retrieved_learning>
<retrieved_learning>
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-03T12:44:00.167Z
Learning: jiridanek requested GitHub issue creation for smart-open library compatibility concern in PR #1269, specifically about potential S3/GCS backend functionality loss due to smart-open dropping implicit extras since version 6.x. Issue creation follows established pattern of comprehensive problem descriptions, acceptance criteria, implementation guidance, and proper context linking.
</retrieved_learning>
<retrieved_learning>
Learnt from: jiridanek
PR: #1269
File: codeserver/ubi9-python-3.12/nginx/api/kernels/access.cgi:6-6
Timestamp: 2025-07-03T12:31:19.195Z
Learning: jiridanek requested GitHub issue creation for CGI script error handling improvements in codeserver/ubi9-python-3.12/nginx/api/kernels/access.cgi during PR #1269 review. Issue #1286 was created with comprehensive problem description, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.
</retrieved_learning>
<retrieved_learning>
Learnt from: jiridanek
PR: #1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:102-102
Timestamp: 2025-07-03T16:26:18.718Z
Learning: jiridanek requested GitHub issue creation for improving hardcoded threshold constants in TrustyAI test notebook during PR #1306 review. Issue #1317 was created with comprehensive problem description covering both bias detection threshold (-0.15670061634672994) and fairness threshold (0.0036255104824703954), multiple solution options with code examples, clear acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
</retrieved_learning>
<retrieved_learning>
Learnt from: jiridanek
PR: #1269
File: codeserver/ubi9-python-3.12/nginx/api/kernels/access.cgi:6-6
Timestamp: 2025-07-03T16:17:05.475Z
Learning: jiridanek requested GitHub issue creation for CGI script health-check URL configurability and timeout improvement in codeserver/ubi9-python-3.12/nginx/api/kernels/access.cgi during PR #1269 review. The request follows the established pattern of systematic code quality improvements with comprehensive issue creation covering problem description, solution details, acceptance criteria, implementation guidance, and proper context linking.
</retrieved_learning>
<retrieved_learning>
Learnt from: jiridanek
PR: #1269
File: codeserver/ubi9-python-3.12/nginx/api/kernels/access.cgi:1-5
Timestamp: 2025-07-03T14:08:35.749Z
Learning: jiridanek requested GitHub issue creation for CGI script strict mode improvements in codeserver/ubi9-python-3.12/nginx/api/kernels/access.cgi during PR #1269 review. Issue #1300 was created with comprehensive problem description covering silent failures, production risks, phased acceptance criteria for basic strict mode and enhanced error handling, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
</retrieved_learning>
<retrieved_learning>
Learnt from: jiridanek
PR: #1269
File: codeserver/ubi9-python-3.12/nginx/api/kernels/access.cgi:6-6
Timestamp: 2025-07-03T16:17:05.475Z
Learning: jiridanek requested GitHub issue creation for CGI script health-check URL configurability and timeout improvement in codeserver/ubi9-python-3.12/nginx/api/kernels/access.cgi during PR #1269 review. Issue #1312 was successfully created with comprehensive problem description covering hard-coded URL limitations, timeout protection, error handling, acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
</retrieved_learning>
<retrieved_learning>
Learnt from: jiridanek
PR: #1269
File: codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh:1-3
Timestamp: 2025-07-03T12:07:19.365Z
Learning: jiridanek consistently requests GitHub issue creation for technical improvements identified during code reviews in opendatahub-io/notebooks, ensuring systematic tracking of code quality enhancements like shell script portability issues with comprehensive descriptions, solution options, and acceptance criteria.
</retrieved_learning>
<retrieved_learning>
Learnt from: jiridanek
PR: #1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. Issue #1298 was successfully created with comprehensive problem description covering race conditions, failure detection, orphaned processes, and signal handling, along with multiple solution options, phased acceptance criteria, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements.
</retrieved_learning>
<retrieved_learning>
Learnt from: jiridanek
PR: #1269
File: codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user:4-9
Timestamp: 2025-07-03T16:05:35.448Z
Learning: jiridanek requested GitHub issue creation for shell script error handling improvements in codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user during PR #1269 review. A comprehensive issue was created covering silent failures, unquoted variable expansions, missing template validation, and strict mode implementation with detailed problem descriptions, phased acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
</retrieved_learning>
<retrieved_learning>
Learnt from: jiridanek
PR: #1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:20-40
Timestamp: 2025-07-03T12:27:09.739Z
Learning: jiridanek requested GitHub issue creation for shell script error handling improvements in codeserver/ubi9-python-3.12/run-code-server.sh create_dir_and_file function during PR #1269 review. Issue #1284 was created with comprehensive problem description, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.
</retrieved_learning>
<retrieved_learning>
Learnt from: jiridanek
PR: #1269
File: codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh:4-11
Timestamp: 2025-07-03T16:04:22.695Z
Learning: jiridanek requested GitHub issue creation for shell script quality improvements in codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh during PR #1269 review. Issue #1307 was created with comprehensive problem description covering variable scoping issues, POSIX compliance concerns, multiple solution options, acceptance criteria, implementation guidance with code examples, testing approaches, and proper context linking, continuing the established pattern of systematic code quality improvements.
</retrieved_learning>
<retrieved_learning>
Learnt from: jiridanek
PR: #1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:135-136
Timestamp: 2025-07-04T05:52:49.464Z
Learning: jiridanek requested GitHub issue creation for improving fragile sed-based Jupyter kernel display_name modification in jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu during PR #1306 review. Issue #1321 was created with comprehensive problem description covering JSON corruption risks, greedy regex patterns, maintenance burden, and proposed Python-based JSON parsing solution with detailed acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.
</retrieved_learning>
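As a concrete illustration of the Python-based approach proposed in that learning, here is a minimal sketch; the helper name and kernelspec path are illustrative, not taken from PR #1306:

```python
import json
from pathlib import Path

def set_kernel_display_name(kernel_json: str, display_name: str) -> None:
    """Update display_name by parsing kernel.json instead of sed-editing it,
    so the file can never be left behind as corrupted JSON."""
    path = Path(kernel_json)
    spec = json.loads(path.read_text())
    spec["display_name"] = display_name  # touch only the intended key
    path.write_text(json.dumps(spec, indent=1) + "\n")

# e.g. set_kernel_display_name(
#     "/opt/app-root/share/jupyter/kernels/python3/kernel.json", "Python 3.12")
```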
<retrieved_learning>
Learnt from: atheo89
PR: opendatahub-io/notebooks#1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:32-32
Timestamp: 2025-07-07T11:08:48.524Z
Learning: atheo89 requested GitHub issue creation for multi-architecture Dockerfile improvements during PR #1258 review, specifically for enhancing structural consistency across Docker stages, replacing
</retrieved_learning>
runtimes/datascience/ubi9-python-3.12/utils/bootstrapper.py (2)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py:1-769
Timestamp: 2025-07-07T14:18:37.212Z
Learning: jiridanek requested GitHub issue creation for bootstrapper code duplication problem in runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py during PR #1333 review. A comprehensive issue was created with detailed problem description covering maintenance overhead and consistency risks from duplicate implementations across multiple runtime environments, four solution options (symlinks, import-based, template-based, direct shared import) with pros/cons analysis, clear acceptance criteria for consolidation and maintainability, step-by-step implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
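To make the consolidation options in the bootstrapper-duplication learning above more tangible, a sketch of the import-based variant follows; the shared module path and its `main()` entry point are assumptions, not code from this PR:

```python
# runtimes/<runtime>/ubi9-python-3.12/utils/bootstrapper.py -- per-runtime shim
"""Thin wrapper that defers to one shared bootstrapper implementation,
so a fix only ever has to land in a single place."""
from runtimes.common.bootstrapper import main  # hypothetical shared module

if __name__ == "__main__":
    main()
```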
runtimes/datascience/ubi9-python-3.12/kustomize/base/pod.yaml (13)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. A comprehensive issue was created covering race conditions, failure detection, process lifecycle coupling, and signal handling with detailed problem descriptions, multiple solution options, phased acceptance criteria, testing approach, and proper context linking, following the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh:4-11
Timestamp: 2025-07-03T12:08:47.691Z
Learning: jiridanek requests GitHub issue creation for shell script quality improvements identified during PR #1269 review, specifically for POSIX compliance and security issues in codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh. Issue #1275 was created with comprehensive problem descriptions, acceptance criteria, implementation guidance, and proper context linking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-nginx.sh:18-23
Timestamp: 2025-07-03T16:17:23.065Z
Learning: jiridanek requested GitHub issue creation for shell script variable quoting security concern in codeserver/ubi9-python-3.12/run-nginx.sh during PR #1269 review. The issue covers unquoted variables NB_PREFIX, NOTEBOOK_ARGS, and BASE_URL that pose security risks including command injection, word-splitting vulnerabilities, and globbing issues. A comprehensive issue was created with detailed problem description, security concerns, solution with code examples, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user:4-9
Timestamp: 2025-07-03T16:05:35.448Z
Learning: jiridanek requested GitHub issue creation for shell script error handling improvements in codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user during PR #1269 review. A comprehensive issue was created covering silent failures, unquoted variable expansions, missing template validation, and strict mode implementation with detailed problem descriptions, phased acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. Issue #1298 was successfully created with comprehensive problem description covering race conditions, failure detection, orphaned processes, and signal handling, along with multiple solution options, phased acceptance criteria, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh:4-11
Timestamp: 2025-07-03T16:04:22.695Z
Learning: jiridanek requested GitHub issue creation for shell script quality improvements in codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh during PR #1269 review. Issue #1307 was created with comprehensive problem description covering variable scoping issues, POSIX compliance concerns, multiple solution options, acceptance criteria, implementation guidance with code examples, testing approaches, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/minimal/ubi9-python-3.12/Dockerfile.rocm:43-55
Timestamp: 2025-07-01T06:48:21.070Z
Learning: When security concerns are raised during PR reviews in opendatahub-io/notebooks, comprehensive follow-up issues are created (often by CodeRabbit) to track all related security enhancements with clear acceptance criteria and implementation guidance. This ensures security improvements are systematically addressed in dedicated efforts rather than blocking current deliverables.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T07:11:44.903Z
Learning: Resource limits in StatefulSet manifests in opendatahub-io/notebooks are configured for testing purposes, not production deployments. This affects risk assessment when reviewing resource configurations like memory and CPU limits.
runtimes/datascience/ubi9-python-3.12/utils/requirements-elyra.txt (7)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:71-76
Timestamp: 2025-07-04T06:04:43.085Z
Learning: jiridanek requested GitHub issue creation for duplicate CSV loading and validation problem in jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb during PR #1306 review. Issue #1322 was created with comprehensive problem description covering code redundancy, runtime failure risks, network inefficiency, and test reliability concerns, along with detailed solution including duplicate line removal, data validation implementation, repository-wide audit, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py:1-769
Timestamp: 2025-07-07T14:18:37.212Z
Learning: jiridanek requested GitHub issue creation for bootstrapper code duplication problem in runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py during PR #1333 review. A comprehensive issue was created with detailed problem description covering maintenance overhead and consistency risks from duplicate implementations across multiple runtime environments, four solution options (symlinks, import-based, template-based, direct shared import) with pros/cons analysis, clear acceptance criteria for consolidation and maintainability, step-by-step implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
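The containerized locking workflow described directly above can be wrapped in a small helper. This is only a sketch of the idea (the image tag, mount point, and use of podman are assumptions); it relies on the container's own Python 3.12 interpreter to drive resolution:

```python
import subprocess

UBI9_PY312 = "registry.access.redhat.com/ubi9/python-312:latest"  # assumed image tag

def lock_pipfile_in_container(pipfile_dir: str) -> None:
    """Run `pipenv lock` inside a UBI9 Python 3.12 container so the lock file is
    resolved against linux/amd64 wheels rather than the host platform."""
    subprocess.run(
        [
            "podman", "run", "--rm",
            "--platform", "linux/amd64",        # force the target platform
            "-v", f"{pipfile_dir}:/opt/app-root/src:Z",
            "-w", "/opt/app-root/src",
            UBI9_PY312,
            "bash", "-c", "pip install --quiet pipenv && pipenv lock",
        ],
        check=True,
    )
```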
runtimes/minimal/ubi9-python-3.12/kustomize/base/pod.yaml (6)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T07:11:44.903Z
Learning: Resource limits in StatefulSet manifests in opendatahub-io/notebooks are configured for testing purposes, not production deployments. This affects risk assessment when reviewing resource configurations like memory and CPU limits.
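Several of the learnings above concern missing securityContext fields tripping the restricted PodSecurity profile on OpenShift. A minimal sketch of the kind of patch that satisfies it is shown below; it assumes PyYAML, and the field values are the standard restricted-profile settings rather than a copy of this PR's manifests:

```python
import yaml  # PyYAML

# Container-level settings the restricted PodSecurity profile expects.
RESTRICTED_SECURITY_CONTEXT = {
    "allowPrivilegeEscalation": False,
    "runAsNonRoot": True,
    "capabilities": {"drop": ["ALL"]},
    "seccompProfile": {"type": "RuntimeDefault"},
}

def add_security_context(pod_yaml: str) -> str:
    """Return the pod manifest with securityContext filled in for every container."""
    pod = yaml.safe_load(pod_yaml)
    for container in pod["spec"]["containers"]:
        container.setdefault("securityContext", {}).update(RESTRICTED_SECURITY_CONTEXT)
    return yaml.safe_dump(pod, sort_keys=False)
```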
runtimes/minimal/ubi9-python-3.12/requirements.txt (21)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py:1-769
Timestamp: 2025-07-07T14:18:37.212Z
Learning: jiridanek requested GitHub issue creation for bootstrapper code duplication problem in runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py during PR #1333 review. A comprehensive issue was created with detailed problem description covering maintenance overhead and consistency risks from duplicate implementations across multiple runtime environments, four solution options (symlinks, import-based, template-based, direct shared import) with pros/cons analysis, clear acceptance criteria for consolidation and maintainability, step-by-step implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: TrustyAI's jupyter-bokeh was pinned to 3.0.5 due to compatibility requirements with TrustyAI's visualization components, but the actual deployed version in requirements.txt shows 3.0.7, indicating incremental testing. The upgrade to 4.0.5 in this PR represents the completion of a gradual migration strategy from the 3.x series after confirming compatibility with Bokeh 3.7.3.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: The jupyter-bokeh pinning to 3.0.5 in TrustyAI notebook image was not due to TrustyAI code compatibility issues, but because the trustyai package itself explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency, causing pip dependency resolution conflicts when trying to upgrade to jupyter-bokeh 4.x.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: The jupyter-bokeh package was previously pinned to version 3.0.5 in the TrustyAI notebook image due to compatibility requirements with TrustyAI components, as indicated by the comment "Should be pinned down to this version in order to be compatible with trustyai" that was removed in this update.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/requirements.txt:435-444
Timestamp: 2025-07-03T13:59:55.040Z
Learning: jiridanek requested GitHub issue creation for numpy/scipy compatibility investigation in PR #1269, specifically for cases where theoretical version conflicts don't manifest as actual build failures. Issue #1297 was created with comprehensive investigation framework, acceptance criteria, runtime testing approach, and proper context linking, demonstrating the pattern of creating investigation-type issues for apparent but non-blocking technical concerns that require deeper understanding.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI 0.6.1 (latest version as of June 2025) has a hard dependency constraint on jupyter-bokeh~=3.0.5, preventing upgrades to jupyter-bokeh 4.x in notebook images that include TrustyAI. This requires either waiting for TrustyAI to update their dependency or excluding TrustyAI from jupyter-bokeh upgrades.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:4-4
Timestamp: 2025-07-04T05:49:10.314Z
Learning: jiridanek directs base image pinning security concerns to existing comprehensive issue #1242 "Improve Docker FROM image versioning by avoiding :latest tags" rather than addressing them in individual PRs, continuing the established pattern of systematic security and quality tracking in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-26T16:21:12.994Z
Learning: When Playwright Docker image version in `.github/workflows/build-notebooks-TEMPLATE.yaml` doesn't match the `@playwright/test` version in `tests/browser/package.json`, it results in a runtime error: "Executable doesn't exist at /ms-playwright/webkit-2182/pw_run.sh" with a helpful message box showing current vs required versions (e.g., "current: mcr.microsoft.com/playwright:v1.52.0-noble" vs "required: mcr.microsoft.com/playwright:v1.53.1-noble").
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/datascience/ubi9-python-3.11/Pipfile:34-36
Timestamp: 2025-06-28T14:13:27.890Z
Learning: In the opendatahub-io/notebooks repository, the dependency pinning strategy follows a deliberate pattern: core `jupyterlab` package uses exact pinning (==) across all notebook images to ensure UI consistency, while JupyterLab extensions and all server components (jupyter-server, jupyter-server-proxy, jupyter-server-terminals) use compatible release (~=) pinning to allow automatic security updates and bug fixes while maintaining API compatibility.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1258
File: codeserver/ubi9-python-3.11/Dockerfile.cpu:55-56
Timestamp: 2025-07-03T08:22:25.348Z
Learning: jiridanek directs security concerns raised during PR reviews to existing comprehensive security issues rather than addressing them in individual PRs. Issue #1241 "Security: Add checksum verification for downloaded binaries in Python 3.12 images" serves as the central tracking issue for all binary download security concerns across the opendatahub-io/notebooks repository.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-nginx.sh:18-23
Timestamp: 2025-07-03T16:17:23.065Z
Learning: jiridanek requested GitHub issue creation for shell script variable quoting security concern in codeserver/ubi9-python-3.12/run-nginx.sh during PR #1269 review. The issue covers unquoted variables NB_PREFIX, NOTEBOOK_ARGS, and BASE_URL that pose security risks including command injection, word-splitting vulnerabilities, and globbing issues. A comprehensive issue was created with detailed problem description, security concerns, solution with code examples, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/minimal/ubi9-python-3.12/Dockerfile.rocm:43-55
Timestamp: 2025-07-01T06:48:21.070Z
Learning: When security concerns are raised during PR reviews in opendatahub-io/notebooks, comprehensive follow-up issues are created (often by CodeRabbit) to track all related security enhancements with clear acceptance criteria and implementation guidance. This ensures security improvements are systematically addressed in dedicated efforts rather than blocking current deliverables.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:71-88
Timestamp: 2025-07-04T06:05:30.580Z
Learning: jiridanek requested GitHub issue creation for TrustyAI test notebook URL configurability and network error handling improvements during PR #1306 review. Issue #1323 was created with ⚠️ emoji in title for visibility, comprehensive problem description covering incorrect hardcoded URLs (pointing to Python 3.11 instead of 3.12), missing network error handling, maintenance burden, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:135-136
Timestamp: 2025-07-04T05:52:49.464Z
Learning: jiridanek requested GitHub issue creation for improving fragile sed-based Jupyter kernel display_name modification in jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu during PR #1306 review. Issue #1321 was created with comprehensive problem description covering JSON corruption risks, greedy regex patterns, maintenance burden, and proposed Python-based JSON parsing solution with detailed acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:71-76
Timestamp: 2025-07-04T06:04:43.085Z
Learning: jiridanek requested GitHub issue creation for duplicate CSV loading and validation problem in jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb during PR #1306 review. Issue #1322 was created with comprehensive problem description covering code redundancy, runtime failure risks, network inefficiency, and test reliability concerns, along with detailed solution including duplicate line removal, data validation implementation, repository-wide audit, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1123
File: runtimes/tensorflow/ubi9-python-3.11/Pipfile:56-56
Timestamp: 2025-06-26T11:39:13.498Z
Learning: pip installing packages from PyPI typically downloads pre-built wheels that just get extracted - setuptools is not involved in this process. setuptools is primarily used for building packages from source (setup.py, pyproject.toml), creating entry points, and package discovery during build time, not for installing pre-built wheels.
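The pinning and conflict learnings in this group come down to PEP 440 specifier semantics, which can be checked directly with the `packaging` library; the jupyterlab version numbers below are illustrative:

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# Exact pin (the jupyterlab strategy): nothing but that one version satisfies it.
print(Version("4.2.6") in SpecifierSet("==4.2.5"))   # False -> no automatic updates

# Compatible-release pin (server components/extensions): patch updates allowed.
compatible = SpecifierSet("~=4.2.5")                 # i.e. >=4.2.5, ==4.2.*
print(Version("4.2.9") in compatible)                # True
print(Version("4.3.0") in compatible)                # False

# The TrustyAI constraint that blocks a jupyter-bokeh 4.x upgrade:
print(Version("4.0.5") in SpecifierSet("~=3.0.5"))   # False -> unresolvable conflict
```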
runtimes/minimal/ubi9-python-3.12/utils/bootstrapper.py (17)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py:1-769
Timestamp: 2025-07-07T14:18:37.212Z
Learning: jiridanek requested GitHub issue creation for bootstrapper code duplication problem in runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py during PR #1333 review. A comprehensive issue was created with detailed problem description covering maintenance overhead and consistency risks from duplicate implementations across multiple runtime environments, four solution options (symlinks, import-based, template-based, direct shared import) with pros/cons analysis, clear acceptance criteria for consolidation and maintainability, step-by-step implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:20-40
Timestamp: 2025-07-03T12:27:09.739Z
Learning: jiridanek requested GitHub issue creation for shell script error handling improvements in codeserver/ubi9-python-3.12/run-code-server.sh create_dir_and_file function during PR #1269 review. Issue #1284 was created with comprehensive problem description, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user:4-9
Timestamp: 2025-07-03T16:05:35.448Z
Learning: jiridanek requested GitHub issue creation for shell script error handling improvements in codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user during PR #1269 review. A comprehensive issue was created covering silent failures, unquoted variable expansions, missing template validation, and strict mode implementation with detailed problem descriptions, phased acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:17-17
Timestamp: 2025-07-03T12:26:24.084Z
Learning: jiridanek requests GitHub issue creation for shell script quality improvements identified during PR #1269 review, specifically for unquoted command substitution in codeserver/ubi9-python-3.12/run-code-server.sh. Issue #1283 was created with comprehensive problem descriptions, acceptance criteria, implementation guidance, and proper context linking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. A comprehensive issue was created covering race conditions, failure detection, process lifecycle coupling, and signal handling with detailed problem descriptions, multiple solution options, phased acceptance criteria, testing approach, and proper context linking, following the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/utils/process.sh:1-2
Timestamp: 2025-07-03T16:00:46.191Z
Learning: jiridanek requested GitHub issue creation for shell script strict-mode improvement in codeserver/ubi9-python-3.12/utils/process.sh during PR #1269 review. Issue #1303 was created with comprehensive problem description covering silent failures and production risks, phased acceptance criteria for basic strict-mode implementation and enhanced error handling, implementation guidance with code examples and flag explanations, benefits section, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. Issue #1298 was successfully created with comprehensive problem description covering race conditions, failure detection, orphaned processes, and signal handling, along with multiple solution options, phased acceptance criteria, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:1-2
Timestamp: 2025-07-03T16:08:47.251Z
Learning: jiridanek requested GitHub issue creation for shell strict mode improvement in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. Issue #1310 was created with comprehensive problem description covering silent failures, production risks, implementation guidance with code examples, acceptance criteria, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements for the codeserver image entrypoint script.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-nginx.sh:5-6
Timestamp: 2025-07-03T16:04:30.516Z
Learning: jiridanek requested GitHub issue creation for shell script strict mode improvement in codeserver/ubi9-python-3.12/run-nginx.sh during PR #1269 review. Issue #1308 was created with comprehensive problem description covering limitations of `set -e`, detailed solution with `set -euo pipefail`, benefits explanation for each flag, acceptance criteria, risk assessment, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements for shell scripts.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1247
File: .github/workflows/build-notebooks-TEMPLATE.yaml:50-53
Timestamp: 2025-07-01T14:36:52.852Z
Learning: In the opendatahub-io/notebooks repository, the test runner's Python version (configured in GitHub Actions UV setup) intentionally doesn't need to match the Python version of the container images being tested. jiridanek's team uses Python 3.12 for running tests while images may use different Python versions (like 3.11), and this approach works fine since the test code is separate from the application code running inside the containers.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:71-88
Timestamp: 2025-07-04T06:05:30.580Z
Learning: jiridanek requested GitHub issue creation for TrustyAI test notebook URL configurability and network error handling improvements during PR #1306 review. Issue #1323 was created with ⚠️ emoji in title for visibility, comprehensive problem description covering incorrect hardcoded URLs (pointing to Python 3.11 instead of 3.12), missing network error handling, maintenance burden, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow with runtime ROCm configuration is the recommended alternative approach.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:15:41.168Z
Learning: TrustyAI's jupyter-bokeh was pinned to 3.0.5 due to compatibility requirements with TrustyAI's visualization components, but the actual deployed version in requirements.txt shows 3.0.7, indicating incremental testing. The upgrade to 4.0.5 in this PR represents the completion of a gradual migration strategy from the 3.x series after confirming compatibility with Bokeh 3.7.3.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. The tensorflow-rocm upstream project appears abandoned with the last release in 2019. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow 2.18+ with runtime ROCm configuration is the recommended and industry-standard approach, as modern TensorFlow automatically detects and utilizes ROCm when properly installed.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/requirements.txt:435-444
Timestamp: 2025-07-03T13:59:55.040Z
Learning: jiridanek requested GitHub issue creation for numpy/scipy compatibility investigation in PR #1269, specifically for cases where theoretical version conflicts don't manifest as actual build failures. Issue #1297 was created with comprehensive investigation framework, acceptance criteria, runtime testing approach, and proper context linking, demonstrating the pattern of creating investigation-type issues for apparent but non-blocking technical concerns that require deeper understanding.
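Regarding the tensorflow_rocm learnings above, the recommended alternative (regular TensorFlow with runtime ROCm support) can be sanity-checked with a tiny snippet; this assumes TensorFlow is installed in the image and a ROCm-capable device is present:

```python
import tensorflow as tf

# Modern TensorFlow picks up ROCm devices automatically when the ROCm runtime
# is installed, so a plain GPU listing is enough to confirm the setup.
gpus = tf.config.list_physical_devices("GPU")
print(f"TensorFlow {tf.__version__}, visible GPUs: {gpus}")
```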
runtimes/minimal/ubi9-python-3.12/utils/requirements-elyra.txt (7)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py:1-769
Timestamp: 2025-07-07T14:18:37.212Z
Learning: jiridanek requested GitHub issue creation for bootstrapper code duplication problem in runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py during PR #1333 review. A comprehensive issue was created with detailed problem description covering maintenance overhead and consistency risks from duplicate implementations across multiple runtime environments, four solution options (symlinks, import-based, template-based, direct shared import) with pros/cons analysis, clear acceptance criteria for consolidation and maintainability, step-by-step implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:71-76
Timestamp: 2025-07-04T06:04:43.085Z
Learning: jiridanek requested GitHub issue creation for duplicate CSV loading and validation problem in jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb during PR #1306 review. Issue #1322 was created with comprehensive problem description covering code redundancy, runtime failure risks, network inefficiency, and test reliability concerns, along with detailed solution including duplicate line removal, data validation implementation, repository-wide audit, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
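For the expected_versions.json pattern referenced earlier in this group, here is a hedged sketch of the kind of error handling issue #1243 asks for; the function signature is an assumption, not the repository's actual test code:

```python
import json
from pathlib import Path

def get_expected_version(package: str, path: str = "expected_versions.json") -> str:
    """Look up a pinned version from the file generated during test execution,
    failing with an actionable message instead of a bare KeyError later on."""
    versions_file = Path(path)
    if not versions_file.exists():
        raise FileNotFoundError(
            f"{path} not found - it is generated by the test harness, not checked in"
        )
    versions = json.loads(versions_file.read_text())
    try:
        return versions[package]
    except KeyError as err:
        raise KeyError(f"{package} has no entry in {path}") from err
```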
runtimes/pytorch/ubi9-python-3.12/kustomize/base/pod.yaml (6)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T07:11:44.903Z
Learning: Resource limits in StatefulSet manifests in opendatahub-io/notebooks are configured for testing purposes, not production deployments. This affects risk assessment when reviewing resource configurations like memory and CPU limits.
runtimes/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml (11)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T07:11:44.903Z
Learning: Resource limits in StatefulSet manifests in opendatahub-io/notebooks are configured for testing purposes, not production deployments. This affects risk assessment when reviewing resource configurations like memory and CPU limits.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. A comprehensive issue was created covering race conditions, failure detection, process lifecycle coupling, and signal handling with detailed problem descriptions, multiple solution options, phased acceptance criteria, testing approach, and proper context linking, following the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh:4-11
Timestamp: 2025-07-03T12:08:47.691Z
Learning: jiridanek requests GitHub issue creation for shell script quality improvements identified during PR #1269 review, specifically for POSIX compliance and security issues in codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh. Issue #1275 was created with comprehensive problem descriptions, acceptance criteria, implementation guidance, and proper context linking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user:4-9
Timestamp: 2025-07-03T16:05:35.448Z
Learning: jiridanek requested GitHub issue creation for shell script error handling improvements in codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user during PR #1269 review. A comprehensive issue was created covering silent failures, unquoted variable expansions, missing template validation, and strict mode implementation with detailed problem descriptions, phased acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-nginx.sh:18-23
Timestamp: 2025-07-03T16:17:23.065Z
Learning: jiridanek requested GitHub issue creation for shell script variable quoting security concern in codeserver/ubi9-python-3.12/run-nginx.sh during PR #1269 review. The issue covers unquoted variables NB_PREFIX, NOTEBOOK_ARGS, and BASE_URL that pose security risks including command injection, word-splitting vulnerabilities, and globbing issues. A comprehensive issue was created with detailed problem description, security concerns, solution with code examples, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh:4-11
Timestamp: 2025-07-03T16:04:22.695Z
Learning: jiridanek requested GitHub issue creation for shell script quality improvements in codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh during PR #1269 review. Issue #1307 was created with comprehensive problem description covering variable scoping issues, POSIX compliance concerns, multiple solution options, acceptance criteria, implementation guidance with code examples, testing approaches, and proper context linking, continuing the established pattern of systematic code quality improvements.
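As a reference point for the emptyDir sizeLimit discussion at the top of this group, the shape of such a test-only volume patch looks roughly like this (values are illustrative, not taken from the PR):

```python
# Equivalent of the YAML patch, expressed as a Python dict for clarity.
volume_patch = {
    "name": "shm",
    "emptyDir": {
        "medium": "Memory",
        "sizeLimit": "2Gi",  # caps scratch usage so a runaway test cannot exhaust the node
    },
}
```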
runtimes/pytorch/ubi9-python-3.12/kustomize/overlays/accelerator/cuda/pod-patch.yaml (11)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T07:11:44.903Z
Learning: Resource limits in StatefulSet manifests in opendatahub-io/notebooks are configured for testing purposes, not production deployments. This affects risk assessment when reviewing resource configurations like memory and CPU limits.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/api/kernels/access.cgi:6-6
Timestamp: 2025-07-03T16:17:05.475Z
Learning: jiridanek requested GitHub issue creation for CGI script health-check URL configurability and timeout improvement in codeserver/ubi9-python-3.12/nginx/api/kernels/access.cgi during PR #1269 review. The request follows the established pattern of systematic code quality improvements with comprehensive issue creation covering problem description, solution details, acceptance criteria, implementation guidance, and proper context linking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py:1-769
Timestamp: 2025-07-07T14:18:37.212Z
Learning: jiridanek requested GitHub issue creation for bootstrapper code duplication problem in runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py during PR #1333 review. A comprehensive issue was created with detailed problem description covering maintenance overhead and consistency risks from duplicate implementations across multiple runtime environments, four solution options (symlinks, import-based, template-based, direct shared import) with pros/cons analysis, clear acceptance criteria for consolidation and maintainability, step-by-step implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
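To make the "thin shim over a shared implementation" option concrete, here is a minimal sketch. The shared module location (`runtimes/shared/bootstrapper.py`) and the loader wiring are assumptions for illustration only, not the layout chosen in the tracking issue.

```python
# Sketch of a per-runtime bootstrapper.py reduced to a shim that defers to one
# shared implementation. Paths and module names are assumed, not decided.
import importlib.util
import sys
from pathlib import Path

# Assumed shared location: runtimes/shared/bootstrapper.py relative to this file.
SHARED = Path(__file__).resolve().parents[3] / "shared" / "bootstrapper.py"

spec = importlib.util.spec_from_file_location("shared_bootstrapper", SHARED)
module = importlib.util.module_from_spec(spec)
sys.modules["shared_bootstrapper"] = module
spec.loader.exec_module(module)  # loads FileOpBase, OpUtil, main, ... from the shared copy

if __name__ == "__main__":
    module.main()
```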
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:135-136
Timestamp: 2025-07-04T05:52:49.464Z
Learning: jiridanek requested GitHub issue creation for improving fragile sed-based Jupyter kernel display_name modification in jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu during PR #1306 review. Issue #1321 was created with comprehensive problem description covering JSON corruption risks, greedy regex patterns, maintenance burden, and proposed Python-based JSON parsing solution with detailed acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.
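For reference, a Python-based replacement for the sed edit could look like the following. The kernelspec path and the new display name are placeholders, not the values used in the TrustyAI Dockerfile.

```python
import json
from pathlib import Path

# Placeholder path; the real kernel.json location depends on the image layout.
kernel_json = Path("/opt/app-root/share/jupyter/kernels/python3/kernel.json")

# Parse, modify, and rewrite the kernelspec instead of regex-editing raw JSON.
spec = json.loads(kernel_json.read_text())
spec["display_name"] = "Python 3.12 (TrustyAI)"  # illustrative value
kernel_json.write_text(json.dumps(spec, indent=1) + "\n")
```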
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh:4-11
Timestamp: 2025-07-03T12:08:47.691Z
Learning: jiridanek requests GitHub issue creation for shell script quality improvements identified during PR #1269 review, specifically for POSIX compliance and security issues in codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh. Issue #1275 was created with comprehensive problem descriptions, acceptance criteria, implementation guidance, and proper context linking.
runtimes/pytorch/ubi9-python-3.12/utils/requirements-elyra.txt (7)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
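As an illustration of the error handling that issue #1243 asks for, a hardened `get_expected_version()` might look roughly like this; the file name matches the generated artifact, while the function signature and messages are a sketch.

```python
import json
from pathlib import Path


def get_expected_version(package: str, path: str = "expected_versions.json") -> str:
    """Sketch: read the test-generated expected_versions.json defensively."""
    versions_file = Path(path)
    if not versions_file.exists():
        raise FileNotFoundError(
            f"{path} not found; it is generated during test execution, not stored in the repo"
        )
    data = json.loads(versions_file.read_text())
    try:
        return data[package]
    except KeyError as exc:
        raise KeyError(f"no expected version recorded for {package!r}") from exc
```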
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py:1-769
Timestamp: 2025-07-07T14:18:37.212Z
Learning: jiridanek requested GitHub issue creation for bootstrapper code duplication problem in runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py during PR #1333 review. A comprehensive issue was created with detailed problem description covering maintenance overhead and consistency risks from duplicate implementations across multiple runtime environments, four solution options (symlinks, import-based, template-based, direct shared import) with pros/cons analysis, clear acceptance criteria for consolidation and maintainability, step-by-step implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:71-76
Timestamp: 2025-07-04T06:04:43.085Z
Learning: jiridanek requested GitHub issue creation for duplicate CSV loading and validation problem in jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb during PR #1306 review. Issue #1322 was created with comprehensive problem description covering code redundancy, runtime failure risks, network inefficiency, and test reliability concerns, along with detailed solution including duplicate line removal, data validation implementation, repository-wide audit, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
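A minimal sketch of that containerized locking flow is shown below; the image tag, mount options, and pipenv arguments are assumptions, not the exact command the team runs.

```python
import subprocess

# Run `pipenv lock` inside a UBI9 Python 3.12 container pinned to linux/amd64,
# so the lock file is generated against the target platform rather than the host.
cmd = [
    "podman", "run", "--rm",
    "--platform=linux/amd64",
    "-v", ".:/workspace:Z",
    "-w", "/workspace",
    "registry.access.redhat.com/ubi9/python-312:latest",  # assumed base image tag
    "pipenv", "lock",
]
subprocess.run(cmd, check=True)
```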
runtimes/rocm-pytorch/ubi9-python-3.12/kustomize/base/pod.yaml (11)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T07:11:44.903Z
Learning: Resource limits in StatefulSet manifests in opendatahub-io/notebooks are configured for testing purposes, not production deployments. This affects risk assessment when reviewing resource configurations like memory and CPU limits.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1154
File: manifests/base/jupyter-tensorflow-notebook-imagestream.yaml:45-45
Timestamp: 2025-06-13T08:34:01.300Z
Learning: When updating dependency versions in `manifests/base/*-imagestream.yaml`, the project convention is to modify only the newest tag (e.g., "2025.1") and intentionally leave earlier tags (e.g., "2024.2") unchanged.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. A comprehensive issue was created covering race conditions, failure detection, process lifecycle coupling, and signal handling with detailed problem descriptions, multiple solution options, phased acceptance criteria, testing approach, and proper context linking, following the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh:4-11
Timestamp: 2025-07-03T12:08:47.691Z
Learning: jiridanek requests GitHub issue creation for shell script quality improvements identified during PR #1269 review, specifically for POSIX compliance and security issues in codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh. Issue #1275 was created with comprehensive problem descriptions, acceptance criteria, implementation guidance, and proper context linking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-nginx.sh:18-23
Timestamp: 2025-07-03T16:17:23.065Z
Learning: jiridanek requested GitHub issue creation for shell script variable quoting security concern in codeserver/ubi9-python-3.12/run-nginx.sh during PR #1269 review. The issue covers unquoted variables NB_PREFIX, NOTEBOOK_ARGS, and BASE_URL that pose security risks including command injection, word-splitting vulnerabilities, and globbing issues. A comprehensive issue was created with detailed problem description, security concerns, solution with code examples, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user:4-9
Timestamp: 2025-07-03T16:05:35.448Z
Learning: jiridanek requested GitHub issue creation for shell script error handling improvements in codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user during PR #1269 review. A comprehensive issue was created covering silent failures, unquoted variable expansions, missing template validation, and strict mode implementation with detailed problem descriptions, phased acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
runtimes/rocm-pytorch/ubi9-python-3.12/utils/requirements-elyra.txt (7)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py:1-769
Timestamp: 2025-07-07T14:18:37.212Z
Learning: jiridanek requested GitHub issue creation for bootstrapper code duplication problem in runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py during PR #1333 review. A comprehensive issue was created with detailed problem description covering maintenance overhead and consistency risks from duplicate implementations across multiple runtime environments, four solution options (symlinks, import-based, template-based, direct shared import) with pros/cons analysis, clear acceptance criteria for consolidation and maintainability, step-by-step implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
runtimes/rocm-tensorflow/ubi9-python-3.12/kustomize/base/pod.yaml (12)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T07:11:44.903Z
Learning: Resource limits in StatefulSet manifests in opendatahub-io/notebooks are configured for testing purposes, not production deployments. This affects risk assessment when reviewing resource configurations like memory and CPU limits.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-nginx.sh:18-23
Timestamp: 2025-07-03T16:17:23.065Z
Learning: jiridanek requested GitHub issue creation for shell script variable quoting security concern in codeserver/ubi9-python-3.12/run-nginx.sh during PR #1269 review. The issue covers unquoted variables NB_PREFIX, NOTEBOOK_ARGS, and BASE_URL that pose security risks including command injection, word-splitting vulnerabilities, and globbing issues. A comprehensive issue was created with detailed problem description, security concerns, solution with code examples, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/minimal/ubi9-python-3.12/Dockerfile.rocm:43-55
Timestamp: 2025-07-01T06:48:21.070Z
Learning: When security concerns are raised during PR reviews in opendatahub-io/notebooks, comprehensive follow-up issues are created (often by CodeRabbit) to track all related security enhancements with clear acceptance criteria and implementation guidance. This ensures security improvements are systematically addressed in dedicated efforts rather than blocking current deliverables.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. A comprehensive issue was created covering race conditions, failure detection, process lifecycle coupling, and signal handling with detailed problem descriptions, multiple solution options, phased acceptance criteria, testing approach, and proper context linking, following the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh:4-11
Timestamp: 2025-07-03T12:08:47.691Z
Learning: jiridanek requests GitHub issue creation for shell script quality improvements identified during PR #1269 review, specifically for POSIX compliance and security issues in codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh. Issue #1275 was created with comprehensive problem descriptions, acceptance criteria, implementation guidance, and proper context linking.
Learnt from: grdryn
PR: opendatahub-io/notebooks#1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/Dockerfile.cpu:4-4
Timestamp: 2025-07-04T05:49:10.314Z
Learning: jiridanek directs base image pinning security concerns to existing comprehensive issue #1242 "Improve Docker FROM image versioning by avoiding :latest tags" rather than addressing them in individual PRs, continuing the established pattern of systematic security and quality tracking in opendatahub-io/notebooks.
runtimes/rocm-tensorflow/ubi9-python-3.12/utils/bootstrapper.py (8)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py:1-769
Timestamp: 2025-07-07T14:18:37.212Z
Learning: jiridanek requested GitHub issue creation for bootstrapper code duplication problem in runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py during PR #1333 review. A comprehensive issue was created with detailed problem description covering maintenance overhead and consistency risks from duplicate implementations across multiple runtime environments, four solution options (symlinks, import-based, template-based, direct shared import) with pros/cons analysis, clear acceptance criteria for consolidation and maintainability, step-by-step implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow with runtime ROCm configuration is the recommended alternative approach.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. The tensorflow-rocm upstream project appears abandoned with the last release in 2019. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow 2.18+ with runtime ROCm configuration is the recommended and industry-standard approach, as modern TensorFlow automatically detects and utilizes ROCm when properly installed.
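A quick runtime check of that recommended approach could look like this; it assumes a ROCm-enabled TensorFlow build is present in the image and only verifies device visibility.

```python
import tensorflow as tf

# On a correctly configured ROCm host, the AMD GPUs appear as "GPU" devices,
# the same way CUDA devices do, so no tensorflow_rocm-specific API is needed.
gpus = tf.config.list_physical_devices("GPU")
print(f"TensorFlow {tf.__version__}, visible GPUs: {gpus}")
if not gpus:
    print("No GPU detected; TensorFlow will fall back to CPU execution.")
```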
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1247
File: .github/workflows/build-notebooks-TEMPLATE.yaml:50-53
Timestamp: 2025-07-01T14:36:52.852Z
Learning: In the opendatahub-io/notebooks repository, the test runner's Python version (configured in GitHub Actions UV setup) intentionally doesn't need to match the Python version of the container images being tested. jiridanek's team uses Python 3.12 for running tests while images may use different Python versions (like 3.11), and this approach works fine since the test code is separate from the application code running inside the containers.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:71-88
Timestamp: 2025-07-04T06:05:30.580Z
Learning: jiridanek requested GitHub issue creation for TrustyAI test notebook URL configurability and network error handling improvements during PR #1306 review. Issue #1323 was created with ⚠️ emoji in title for visibility, comprehensive problem description covering incorrect hardcoded URLs (pointing to Python 3.11 instead of 3.12), missing network error handling, maintenance burden, multiple solution options with code examples, phased acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
runtimes/rocm-tensorflow/ubi9-python-3.12/utils/requirements-elyra.txt (9)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py:1-769
Timestamp: 2025-07-07T14:18:37.212Z
Learning: jiridanek requested GitHub issue creation for bootstrapper code duplication problem in runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py during PR #1333 review. A comprehensive issue was created with detailed problem description covering maintenance overhead and consistency risks from duplicate implementations across multiple runtime environments, four solution options (symlinks, import-based, template-based, direct shared import) with pros/cons analysis, clear acceptance criteria for consolidation and maintainability, step-by-step implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: tensorflow_rocm package versions 2.12.1.570 through 2.14.0.600 do not provide Python 3.12 wheels (cp312) on PyPI, causing Pipfile lock failures when attempting to create Python 3.12-based ROCm TensorFlow notebook images in opendatahub-io/notebooks.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:05:33.329Z
Learning: tensorflow_rocm package has no Python 3.12 or 3.13 wheel support as of July 2025, with the latest version 2.14.0.600 only supporting Python 3.9, 3.10, and 3.11. The tensorflow-rocm upstream project appears abandoned with the last release in 2019. For Python 3.12+ ROCm TensorFlow environments, regular TensorFlow 2.18+ with runtime ROCm configuration is the recommended and industry-standard approach, as modern TensorFlow automatically detects and utilizes ROCm when properly installed.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
runtimes/tensorflow/ubi9-python-3.12/kustomize/base/pod.yaml (12)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml:11-22
Timestamp: 2025-06-30T14:36:53.890Z
Learning: The pod-patch.yaml file in jupyter/pytorch/ubi9-python-3.12/kustomize/components/accelerator/ is used only for running tests, not production deployments. This affects the risk assessment for resource management configurations like sizeLimit on emptyDir volumes.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1230
File: jupyter/pytorch/ubi9-python-3.12/kustomize/base/statefulset.yaml:54-60
Timestamp: 2025-06-30T14:43:08.138Z
Learning: Issue #1212 in opendatahub-io/notebooks demonstrates that missing securityContext configuration (allowPrivilegeEscalation, runAsNonRoot, seccompProfile) causes runtime pods to fail reaching ready state and timeout after 300s on OpenShift due to PodSecurity policy violations.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: Runtime deployment tests in opendatahub-io/notebooks may show PodSecurity warnings about allowPrivilegeEscalation, capabilities, runAsNonRoot, and seccompProfile settings. These warnings occur on OpenShift but not on GitHub Actions because GitHub Actions uses upstream Kubernetes without SecurityContextConstraints (SCC).
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1154
File: manifests/base/jupyter-tensorflow-notebook-imagestream.yaml:45-45
Timestamp: 2025-06-13T08:34:01.300Z
Learning: When updating dependency versions in `manifests/base/*-imagestream.yaml`, the project convention is to modify only the newest tag (e.g., "2025.1") and intentionally leave earlier tags (e.g., "2024.2") unchanged.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T07:11:44.903Z
Learning: Resource limits in StatefulSet manifests in opendatahub-io/notebooks are configured for testing purposes, not production deployments. This affects risk assessment when reviewing resource configurations like memory and CPU limits.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-nginx.sh:18-23
Timestamp: 2025-07-03T16:17:23.065Z
Learning: jiridanek requested GitHub issue creation for shell script variable quoting security concern in codeserver/ubi9-python-3.12/run-nginx.sh during PR #1269 review. The issue covers unquoted variables NB_PREFIX, NOTEBOOK_ARGS, and BASE_URL that pose security risks including command injection, word-splitting vulnerabilities, and globbing issues. A comprehensive issue was created with detailed problem description, security concerns, solution with code examples, acceptance criteria, implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh:4-11
Timestamp: 2025-07-03T12:08:47.691Z
Learning: jiridanek requests GitHub issue creation for shell script quality improvements identified during PR #1269 review, specifically for POSIX compliance and security issues in codeserver/ubi9-python-3.12/nginx/root/usr/share/container-scripts/nginx/common.sh. Issue #1275 was created with comprehensive problem descriptions, acceptance criteria, implementation guidance, and proper context linking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/run-code-server.sh:7-10
Timestamp: 2025-07-03T14:01:22.819Z
Learning: jiridanek requested GitHub issue creation for container startup robustness and lifecycle management improvements in codeserver/ubi9-python-3.12/run-code-server.sh during PR #1269 review. A comprehensive issue was created covering race conditions, failure detection, process lifecycle coupling, and signal handling with detailed problem descriptions, multiple solution options, phased acceptance criteria, testing approach, and proper context linking, following the established pattern of systematic code quality improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: OpenShift CI infrastructure issues in opendatahub-io/notebooks can manifest as "ContainerFailed one or more containers exited" errors in release steps, or as "Entrypoint received interrupt: terminated" messages when pods are killed during CI runs. These are typically infrastructure-level issues rather than code problems.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user:4-9
Timestamp: 2025-07-03T16:05:35.448Z
Learning: jiridanek requested GitHub issue creation for shell script error handling improvements in codeserver/ubi9-python-3.12/nginx/root/opt/app-root/etc/generate_container_user during PR #1269 review. A comprehensive issue was created covering silent failures, unquoted variable expansions, missing template validation, and strict mode implementation with detailed problem descriptions, phased acceptance criteria, implementation guidance with code examples, and proper context linking, continuing the established pattern of systematic code quality improvements.
runtimes/tensorflow/ubi9-python-3.12/utils/requirements-elyra.txt (6)
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue. The pattern is actively managed through issues like #1243 "Improve error handling in get_expected_version() functions across test notebooks" and #1254 "Fix undefined variable error in ROCm PyTorch Python 3.12 test notebook".
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1333
File: runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py:1-769
Timestamp: 2025-07-07T14:18:37.212Z
Learning: jiridanek requested GitHub issue creation for bootstrapper code duplication problem in runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py during PR #1333 review. A comprehensive issue was created with detailed problem description covering maintenance overhead and consistency risks from duplicate implementations across multiple runtime environments, four solution options (symlinks, import-based, template-based, direct shared import) with pros/cons analysis, clear acceptance criteria for consolidation and maintainability, step-by-step implementation guidance, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/test/test_notebook.ipynb:44-50
Timestamp: 2025-07-03T16:18:59.234Z
Learning: In the opendatahub-io/notebooks repository, the expected_versions.json file referenced in test notebooks (like test_notebook.ipynb) is generated during test execution, not included as a static file in the PR. This is an established testing pattern that should not be flagged as a missing file issue.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1218
File: jupyter/trustyai/ubi9-python-3.11/Pipfile:49-49
Timestamp: 2025-06-28T14:21:09.429Z
Learning: TrustyAI explicitly declares jupyter-bokeh~=3.0.5 as a hard dependency in both their requirements.txt and pyproject.toml files, with no open issues tracking jupyter-bokeh 4.x compatibility. This creates an unresolvable pip dependency conflict when trying to upgrade jupyter-bokeh to 4.x in notebook images that include TrustyAI.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: jiridanek's team uses containerized dependency locking for cross-platform compatibility in opendatahub-io/notebooks. They run `pipenv lock` inside UBI9 containers with specific platform arguments (`--platform=linux/amd64 --python-version 3.12`) to avoid host OS dependency conflicts when generating Pipfile.lock files.
🧬 Code Graph Analysis (1)
runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py (6)
runtimes/minimal/ubi9-python-3.12/utils/bootstrapper.py (26)
FileOpBase (54-355), get_instance (62-72), NotebookFileOp (358-465), PythonFileOp (468-512), RFileOp (515-558), convert_param_str_to_dict (338-348), execute (110-112), execute (361-395), execute (471-512), execute (518-558), process_dependencies (114-137), OpUtil (561-741), log_operation_info (725-741), get_file_from_object_storage (287-299), process_outputs (139-155), process_output_file (324-336), process_metrics_and_metadata (157-277), get_object_storage_filename (279-285), put_file_to_object_storage (301-318), has_wildcard (320-322), set_parameters_in_env (350-355), package_install (565-615), determine_elyra_requirements (618-626), package_list_to_dict (629-653), parse_arguments (656-722), main (744-765)
runtimes/pytorch/ubi9-python-3.12/utils/bootstrapper.py (26)
FileOpBase (54-355), get_instance (62-72), NotebookFileOp (358-465), PythonFileOp (468-512), RFileOp (515-558), convert_param_str_to_dict (338-348), execute (110-112), execute (361-395), execute (471-512), execute (518-558), process_dependencies (114-137), OpUtil (561-741), log_operation_info (725-741), get_file_from_object_storage (287-299), process_outputs (139-155), process_output_file (324-336), process_metrics_and_metadata (157-277), get_object_storage_filename (279-285), put_file_to_object_storage (301-318), has_wildcard (320-322), set_parameters_in_env (350-355), package_install (565-615), determine_elyra_requirements (618-626), package_list_to_dict (629-653), parse_arguments (656-722), main (744-765)
runtimes/datascience/ubi9-python-3.12/utils/bootstrapper.py (26)
FileOpBase (54-355), get_instance (62-72), NotebookFileOp (358-465), PythonFileOp (468-512), RFileOp (515-558), convert_param_str_to_dict (338-348), execute (110-112), execute (361-395), execute (471-512), execute (518-558), process_dependencies (114-137), OpUtil (561-741), log_operation_info (725-741), get_file_from_object_storage (287-299), process_outputs (139-155), process_output_file (324-336), process_metrics_and_metadata (157-277), get_object_storage_filename (279-285), put_file_to_object_storage (301-318), has_wildcard (320-322), set_parameters_in_env (350-355), package_install (565-615), determine_elyra_requirements (618-626), package_list_to_dict (629-653), parse_arguments (656-722), main (744-765)
runtimes/rocm-tensorflow/ubi9-python-3.12/utils/bootstrapper.py (26)
FileOpBase (54-355), get_instance (62-72), NotebookFileOp (358-465), PythonFileOp (468-512), RFileOp (515-558), convert_param_str_to_dict (338-348), execute (110-112), execute (361-395), execute (471-512), execute (518-558), process_dependencies (114-137), OpUtil (561-741), log_operation_info (725-741), get_file_from_object_storage (287-299), process_outputs (139-155), process_output_file (324-336), process_metrics_and_metadata (157-277), get_object_storage_filename (279-285), put_file_to_object_storage (301-318), has_wildcard (320-322), set_parameters_in_env (350-355), package_install (565-615), determine_elyra_requirements (618-626), package_list_to_dict (629-653), parse_arguments (656-722), main (744-765)
runtimes/tensorflow/ubi9-python-3.12/utils/bootstrapper.py (22)
FileOpBase (54-355), get_instance (62-72), NotebookFileOp (358-465), PythonFileOp (468-512), RFileOp (515-558), convert_param_str_to_dict (338-348), process_dependencies (114-137), OpUtil (561-741), log_operation_info (725-741), get_file_from_object_storage (287-299), process_outputs (139-155), process_output_file (324-336), process_metrics_and_metadata (157-277), get_object_storage_filename (279-285), put_file_to_object_storage (301-318), has_wildcard (320-322), set_parameters_in_env (350-355), package_install (565-615), determine_elyra_requirements (618-626), package_list_to_dict (629-653), parse_arguments (656-722), main (744-765)
tests/containers/socket_proxy.py (1)
run (156-168)
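The symbol graph above repeats the same class layout in every bootstrapper copy. The dispatch it implies is roughly the following; this is an illustrative sketch based only on the names listed, not the vendored Elyra code.

```python
# Illustrative shape of FileOpBase.get_instance as suggested by the listed symbols.
class FileOpBase:
    """Base class for notebook/script file operations (sketch)."""

    @classmethod
    def get_instance(cls, filepath: str, **kwargs) -> "FileOpBase":
        # Assumed dispatch on file extension; the real selection logic may differ.
        if filepath.endswith(".ipynb"):
            return NotebookFileOp(filepath, **kwargs)
        if filepath.endswith(".py"):
            return PythonFileOp(filepath, **kwargs)
        if filepath.endswith((".r", ".R")):
            return RFileOp(filepath, **kwargs)
        raise ValueError(f"Unsupported file type: {filepath}")

    def __init__(self, filepath: str, **kwargs) -> None:
        self.filepath = filepath


class NotebookFileOp(FileOpBase): ...
class PythonFileOp(FileOpBase): ...
class RFileOp(FileOpBase): ...
```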
🪛 Ruff (0.11.9)
runtimes/tensorflow/ubi9-python-3.12/utils/bootstrapper.py
29-29: `typing.Dict` is deprecated, use `dict` instead (UP035)
31-31: `typing.Type` is deprecated, use `type` instead (UP035)
62-62: Use `type` instead of `Type` for type annotation; replace with `type` (UP006)
76-76: `import` should be at the top-level of a file (PLC0415)
77-77: `import` should be at the top-level of a file (PLC0415)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
301-301: Use `X | None` for type annotations; convert to `X | None` (UP045)
338-338: Use `X | None` for type annotations; convert to `X | None` (UP045)
338-338: Use `dict` instead of `Dict` for type annotation; replace with `dict` (UP006)
378-378: `import` should be at the top-level of a file (PLC0415)
405-405: `import` should be at the top-level of a file (PLC0415)
406-406: `import` should be at the top-level of a file (PLC0415)
412-412: Unpacked variable `resources` is never used; prefix it with an underscore or any other dummy variable pattern (RUF059)
432-432: `import` should be at the top-level of a file (PLC0415)
433-433: `import` should be at the top-level of a file (PLC0415)
448-448: Loop control variable `name` not used within loop body; rename unused `name` to `_name` (B007)
448-448: When using only the values of a dict use the `values()` method; replace `.items()` with `.values()` (PERF102)
474-474: Local variable `python_script_name` is assigned to but never used; remove assignment to unused variable `python_script_name` (F841)
493-493: Single quotes found but double quotes preferred; replace single quotes with double quotes (Q000)
521-521: Local variable `r_script_name` is assigned to but never used; remove assignment to unused variable `r_script_name` (F841)
538-538: Single quotes found but double quotes preferred; replace single quotes with double quotes (Q000)
561-561: Class `OpUtil` inherits from `object`; remove `object` inheritance (UP004)
608-608: Consider iterable unpacking instead of concatenation; replace with iterable unpacking (RUF005)
613-613: `subprocess.run` without explicit `check` argument; add explicit `check=False` (PLW1510)
640-640: Call `startswith` once with a `tuple`; merge into a single `startswith` call (PIE810)
657-657: `import` should be at the top-level of a file (PLC0415)
659-659: Using the global statement to update `pipeline_name` is discouraged (PLW0603)
659-659: Using the global statement to update `operation_name` is discouraged (PLW0603)
725-725: Use `X | None` for type annotations; convert to `X | None` (UP045)
738-738: Using global for `pipeline_name` but no assignment is done (PLW0602)
738-738: Using global for `operation_name` but no assignment is done (PLW0602)
769-769: No newline at end of file; add trailing newline (W292)
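Most of these findings repeat across the five identical bootstrapper copies. The snippet below illustrates a few of the suggested modernizations (UP035/UP006, UP045, PLW1510, PIE810) on a made-up helper; it is not a patch to the real file.

```python
import subprocess

# Before (flagged): `from typing import Dict, Optional, Type`
# After: builtin generics and the `X | None` union syntax, both fine on Python 3.12.


def install_packages(packages: dict[str, str], index_url: str | None = None) -> None:
    """Hypothetical helper showing the style Ruff suggests."""
    cmd = ["python", "-m", "pip", "install"]
    if index_url is not None:
        cmd += ["--index-url", index_url]
    cmd += [f"{name}=={version}" for name, version in packages.items()]
    # PLW1510: pass `check` explicitly instead of relying on the default.
    subprocess.run(cmd, check=True)


def is_local_requirement(line: str) -> bool:
    # PIE810: one `startswith` call with a tuple instead of two chained checks.
    return line.startswith(("./", "file://"))
```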
runtimes/pytorch/ubi9-python-3.12/utils/bootstrapper.py
29-29: `typing.Dict` is deprecated, use `dict` instead (UP035)
31-31: `typing.Type` is deprecated, use `type` instead (UP035)
62-62: Use `type` instead of `Type` for type annotation; replace with `type` (UP006)
76-76: `import` should be at the top-level of a file (PLC0415)
77-77: `import` should be at the top-level of a file (PLC0415)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
301-301: Use `X | None` for type annotations; convert to `X | None` (UP045)
338-338: Use `X | None` for type annotations; convert to `X | None` (UP045)
338-338: Use `dict` instead of `Dict` for type annotation; replace with `dict` (UP006)
378-378: `import` should be at the top-level of a file (PLC0415)
405-405: `import` should be at the top-level of a file (PLC0415)
406-406: `import` should be at the top-level of a file (PLC0415)
412-412: Unpacked variable `resources` is never used; prefix it with an underscore or any other dummy variable pattern (RUF059)
432-432: `import` should be at the top-level of a file (PLC0415)
433-433: `import` should be at the top-level of a file (PLC0415)
448-448: Loop control variable `name` not used within loop body; rename unused `name` to `_name` (B007)
448-448: When using only the values of a dict use the `values()` method; replace `.items()` with `.values()` (PERF102)
474-474: Local variable `python_script_name` is assigned to but never used; remove assignment to unused variable `python_script_name` (F841)
493-493: Single quotes found but double quotes preferred; replace single quotes with double quotes (Q000)
521-521: Local variable `r_script_name` is assigned to but never used; remove assignment to unused variable `r_script_name` (F841)
538-538: Single quotes found but double quotes preferred; replace single quotes with double quotes (Q000)
561-561: Class `OpUtil` inherits from `object`; remove `object` inheritance (UP004)
608-608: Consider iterable unpacking instead of concatenation; replace with iterable unpacking (RUF005)
613-613: `subprocess.run` without explicit `check` argument; add explicit `check=False` (PLW1510)
640-640: Call `startswith` once with a `tuple`; merge into a single `startswith` call (PIE810)
657-657: `import` should be at the top-level of a file (PLC0415)
659-659: Using the global statement to update `pipeline_name` is discouraged (PLW0603)
659-659: Using the global statement to update `operation_name` is discouraged (PLW0603)
725-725: Use `X | None` for type annotations; convert to `X | None` (UP045)
738-738: Using global for `pipeline_name` but no assignment is done (PLW0602)
738-738: Using global for `operation_name` but no assignment is done (PLW0602)
769-769: No newline at end of file; add trailing newline (W292)
runtimes/rocm-pytorch/ubi9-python-3.12/utils/bootstrapper.py
29-29: `typing.Dict` is deprecated, use `dict` instead (UP035)
31-31: `typing.Type` is deprecated, use `type` instead (UP035)
62-62: Use `type` instead of `Type` for type annotation; replace with `type` (UP006)
76-76: `import` should be at the top-level of a file (PLC0415)
77-77: `import` should be at the top-level of a file (PLC0415)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
301-301: Use `X | None` for type annotations; convert to `X | None` (UP045)
338-338: Use `X | None` for type annotations; convert to `X | None` (UP045)
338-338: Use `dict` instead of `Dict` for type annotation; replace with `dict` (UP006)
378-378: `import` should be at the top-level of a file (PLC0415)
405-405: `import` should be at the top-level of a file (PLC0415)
406-406: `import` should be at the top-level of a file (PLC0415)
412-412: Unpacked variable `resources` is never used; prefix it with an underscore or any other dummy variable pattern (RUF059)
432-432: `import` should be at the top-level of a file (PLC0415)
433-433: `import` should be at the top-level of a file (PLC0415)
448-448: Loop control variable `name` not used within loop body; rename unused `name` to `_name` (B007)
448-448: When using only the values of a dict use the `values()` method; replace `.items()` with `.values()` (PERF102)
474-474: Local variable `python_script_name` is assigned to but never used; remove assignment to unused variable `python_script_name` (F841)
493-493: Single quotes found but double quotes preferred; replace single quotes with double quotes (Q000)
521-521: Local variable `r_script_name` is assigned to but never used; remove assignment to unused variable `r_script_name` (F841)
538-538: Single quotes found but double quotes preferred; replace single quotes with double quotes (Q000)
561-561: Class `OpUtil` inherits from `object`; remove `object` inheritance (UP004)
608-608: Consider iterable unpacking instead of concatenation; replace with iterable unpacking (RUF005)
613-613: `subprocess.run` without explicit `check` argument; add explicit `check=False` (PLW1510)
640-640: Call `startswith` once with a `tuple`; merge into a single `startswith` call (PIE810)
657-657: `import` should be at the top-level of a file (PLC0415)
659-659: Using the global statement to update `pipeline_name` is discouraged (PLW0603)
659-659: Using the global statement to update `operation_name` is discouraged (PLW0603)
725-725: Use `X | None` for type annotations; convert to `X | None` (UP045)
738-738: Using global for `pipeline_name` but no assignment is done (PLW0602)
738-738: Using global for `operation_name` but no assignment is done (PLW0602)
769-769: No newline at end of file; add trailing newline (W292)
runtimes/datascience/ubi9-python-3.12/utils/bootstrapper.py
29-29: `typing.Dict` is deprecated, use `dict` instead (UP035)
31-31: `typing.Type` is deprecated, use `type` instead (UP035)
62-62: Use `type` instead of `Type` for type annotation; replace with `type` (UP006)
76-76: `import` should be at the top-level of a file (PLC0415)
77-77: `import` should be at the top-level of a file (PLC0415)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
301-301: Use `X | None` for type annotations; convert to `X | None` (UP045)
338-338: Use `X | None` for type annotations; convert to `X | None` (UP045)
338-338: Use `dict` instead of `Dict` for type annotation; replace with `dict` (UP006)
378-378: `import` should be at the top-level of a file (PLC0415)
405-405: `import` should be at the top-level of a file (PLC0415)
406-406: `import` should be at the top-level of a file (PLC0415)
412-412: Unpacked variable `resources` is never used; prefix it with an underscore or any other dummy variable pattern (RUF059)
432-432: `import` should be at the top-level of a file (PLC0415)
433-433: `import` should be at the top-level of a file (PLC0415)
448-448: Loop control variable `name` not used within loop body; rename unused `name` to `_name` (B007)
448-448: When using only the values of a dict use the `values()` method; replace `.items()` with `.values()` (PERF102)
474-474: Local variable `python_script_name` is assigned to but never used; remove assignment to unused variable `python_script_name` (F841)
493-493: Single quotes found but double quotes preferred; replace single quotes with double quotes (Q000)
521-521: Local variable `r_script_name` is assigned to but never used; remove assignment to unused variable `r_script_name` (F841)
538-538: Single quotes found but double quotes preferred; replace single quotes with double quotes (Q000)
561-561: Class `OpUtil` inherits from `object`; remove `object` inheritance (UP004)
608-608: Consider iterable unpacking instead of concatenation; replace with iterable unpacking (RUF005)
613-613: `subprocess.run` without explicit `check` argument; add explicit `check=False` (PLW1510)
640-640: Call `startswith` once with a `tuple`; merge into a single `startswith` call (PIE810)
657-657: `import` should be at the top-level of a file (PLC0415)
659-659: Using the global statement to update `pipeline_name` is discouraged (PLW0603)
659-659: Using the global statement to update `operation_name` is discouraged (PLW0603)
725-725: Use `X | None` for type annotations; convert to `X | None` (UP045)
738-738: Using global for `pipeline_name` but no assignment is done (PLW0602)
738-738: Using global for `operation_name` but no assignment is done (PLW0602)
769-769: No newline at end of file; add trailing newline (W292)
runtimes/minimal/ubi9-python-3.12/utils/bootstrapper.py
29-29: typing.Dict is deprecated, use dict instead (UP035)
31-31: typing.Type is deprecated, use type instead (UP035)
62-62: Use type instead of Type for type annotation; replace with type (UP006)
76-76: import should be at the top-level of a file (PLC0415)
77-77: import should be at the top-level of a file (PLC0415)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
301-301: Use X | None for type annotations; convert to X | None (UP045)
338-338: Use X | None for type annotations; convert to X | None (UP045)
338-338: Use dict instead of Dict for type annotation; replace with dict (UP006)
378-378: import should be at the top-level of a file (PLC0415)
405-405: import should be at the top-level of a file (PLC0415)
406-406: import should be at the top-level of a file (PLC0415)
412-412: Unpacked variable resources is never used; prefix it with an underscore or another dummy variable pattern (RUF059)
432-432: import should be at the top-level of a file (PLC0415)
433-433: import should be at the top-level of a file (PLC0415)
448-448: Loop control variable name not used within loop body; rename unused name to _name (B007)
448-448: When using only the values of a dict, use the values() method; replace .items() with .values() (PERF102)
474-474: Local variable python_script_name is assigned to but never used; remove the assignment (F841)
493-493: Single quotes found but double quotes preferred; replace single quotes with double quotes (Q000)
521-521: Local variable r_script_name is assigned to but never used; remove the assignment (F841)
538-538: Single quotes found but double quotes preferred; replace single quotes with double quotes (Q000)
561-561: Class OpUtil inherits from object; remove object inheritance (UP004)
608-608: Consider iterable unpacking instead of concatenation; replace with iterable unpacking (RUF005)
613-613: subprocess.run without explicit check argument; add explicit check=False (PLW1510)
640-640: Call startswith once with a tuple; merge into a single startswith call (PIE810)
657-657: import should be at the top-level of a file (PLC0415)
659-659: Using the global statement to update pipeline_name is discouraged (PLW0603)
659-659: Using the global statement to update operation_name is discouraged (PLW0603)
725-725: Use X | None for type annotations; convert to X | None (UP045)
738-738: Using global for pipeline_name but no assignment is done (PLW0602)
738-738: Using global for operation_name but no assignment is done (PLW0602)
769-769: No newline at end of file; add trailing newline (W292)
runtimes/rocm-tensorflow/ubi9-python-3.12/utils/bootstrapper.py
29-29: typing.Dict is deprecated, use dict instead (UP035)
31-31: typing.Type is deprecated, use type instead (UP035)
62-62: Use type instead of Type for type annotation; replace with type (UP006)
76-76: import should be at the top-level of a file (PLC0415)
77-77: import should be at the top-level of a file (PLC0415)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
228-228: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
232-232: Use explicit conversion flag; replace with conversion flag (RUF010)
301-301: Use X | None for type annotations; convert to X | None (UP045)
338-338: Use X | None for type annotations; convert to X | None (UP045)
338-338: Use dict instead of Dict for type annotation; replace with dict (UP006)
378-378: import should be at the top-level of a file (PLC0415)
405-405: import should be at the top-level of a file (PLC0415)
406-406: import should be at the top-level of a file (PLC0415)
412-412: Unpacked variable resources is never used; prefix it with an underscore or another dummy variable pattern (RUF059)
432-432: import should be at the top-level of a file (PLC0415)
433-433: import should be at the top-level of a file (PLC0415)
448-448: Loop control variable name not used within loop body; rename unused name to _name (B007)
448-448: When using only the values of a dict, use the values() method; replace .items() with .values() (PERF102)
474-474: Local variable python_script_name is assigned to but never used; remove the assignment (F841)
493-493: Single quotes found but double quotes preferred; replace single quotes with double quotes (Q000)
521-521: Local variable r_script_name is assigned to but never used; remove the assignment (F841)
538-538: Single quotes found but double quotes preferred; replace single quotes with double quotes (Q000)
561-561: Class OpUtil inherits from object; remove object inheritance (UP004)
608-608: Consider iterable unpacking instead of concatenation; replace with iterable unpacking (RUF005)
613-613: subprocess.run without explicit check argument; add explicit check=False (PLW1510)
640-640: Call startswith once with a tuple; merge into a single startswith call (PIE810)
657-657: import should be at the top-level of a file (PLC0415)
659-659: Using the global statement to update pipeline_name is discouraged (PLW0603)
659-659: Using the global statement to update operation_name is discouraged (PLW0603)
725-725: Use X | None for type annotations; convert to X | None (UP045)
738-738: Using global for pipeline_name but no assignment is done (PLW0602)
738-738: Using global for operation_name but no assignment is done (PLW0602)
769-769: No newline at end of file; add trailing newline (W292)
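The remaining rules are behavioural rather than cosmetic. Below is a minimal, self-contained sketch of what PLW1510, RUF005, PIE810 and PERF102/B007 ask for; the pip command and sample data are invented for illustration and are not taken from the bootstrapper:

```python
# Hedged sketch of the behaviour-oriented Ruff rules repeated above.
import subprocess
import sys

packages = ["papermill", "ipykernel"]
extra_args = ["--no-cache-dir"]

# RUF005: build the argv with iterable unpacking instead of list concatenation.
# (--dry-run avoids actually installing anything; it needs a reasonably recent pip.)
cmd = [sys.executable, "-m", "pip", "install", "--dry-run", *extra_args, *packages]

# PLW1510: state the check behaviour explicitly instead of relying on the default.
result = subprocess.run(cmd, capture_output=True, text=True, check=False)

# PIE810: one startswith() call with a tuple instead of two chained calls.
interesting = [line for line in result.stdout.splitlines()
               if not line.startswith(("Requirement already satisfied", "Using cached"))]

# PERF102: iterate .values() when the keys are unused (also avoids the unused B007 variable).
envs = {"PYTHONUNBUFFERED": "1", "PIP_NO_INPUT": "1"}
for value in envs.values():
    assert isinstance(value, str)

print(result.returncode, len(interesting))
```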
🪛 Checkov (3.2.334)
runtimes/datascience/ubi9-python-3.12/kustomize/base/pod.yaml
[MEDIUM] 2-22: Containers should not run with allowPrivilegeEscalation (CKV_K8S_20)
[MEDIUM] 2-22: Minimize the admission of root containers (CKV_K8S_23)
runtimes/minimal/ubi9-python-3.12/kustomize/base/pod.yaml
[MEDIUM] 2-22: Containers should not run with allowPrivilegeEscalation (CKV_K8S_20)
[MEDIUM] 2-22: Minimize the admission of root containers (CKV_K8S_23)
runtimes/pytorch/ubi9-python-3.12/kustomize/base/pod.yaml
[MEDIUM] 2-22: Containers should not run with allowPrivilegeEscalation (CKV_K8S_20)
[MEDIUM] 2-22: Minimize the admission of root containers (CKV_K8S_23)
runtimes/pytorch/ubi9-python-3.12/kustomize/components/accelerator/pod-patch.yaml
[MEDIUM] 2-20: Containers should not run with allowPrivilegeEscalation (CKV_K8S_20)
[MEDIUM] 2-20: Minimize the admission of root containers (CKV_K8S_23)
runtimes/pytorch/ubi9-python-3.12/kustomize/overlays/accelerator/cuda/pod-patch.yaml
[MEDIUM] 2-13: Containers should not run with allowPrivilegeEscalation (CKV_K8S_20)
[MEDIUM] 2-13: Minimize the admission of root containers (CKV_K8S_23)
runtimes/rocm-pytorch/ubi9-python-3.12/kustomize/base/pod.yaml
[MEDIUM] 2-22: Containers should not run with allowPrivilegeEscalation (CKV_K8S_20)
[MEDIUM] 2-22: Minimize the admission of root containers (CKV_K8S_23)
runtimes/rocm-tensorflow/ubi9-python-3.12/kustomize/base/pod.yaml
[MEDIUM] 2-22: Containers should not run with allowPrivilegeEscalation (CKV_K8S_20)
[MEDIUM] 2-22: Minimize the admission of root containers (CKV_K8S_23)
runtimes/tensorflow/ubi9-python-3.12/kustomize/base/pod.yaml
[MEDIUM] 2-22: Containers should not run with allowPrivilegeEscalation (CKV_K8S_20)
[MEDIUM] 2-22: Minimize the admission of root containers (CKV_K8S_23)
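Both Checkov rules point at the same two securityContext settings in the new pod manifests. Below is a hedged sketch of a standalone checker, assuming PyYAML is available and using the manifest paths from the findings above; the script is illustrative rather than part of the repository, and it ignores that CKV_K8S_23 also accepts runAsNonRoot set per container:

```python
# Hypothetical check for CKV_K8S_20 / CKV_K8S_23 against the new runtime pod manifests.
import sys
from pathlib import Path

import yaml  # PyYAML, assumed to be installed

MANIFESTS = [
    "runtimes/datascience/ubi9-python-3.12/kustomize/base/pod.yaml",
    "runtimes/minimal/ubi9-python-3.12/kustomize/base/pod.yaml",
    "runtimes/pytorch/ubi9-python-3.12/kustomize/base/pod.yaml",
    "runtimes/rocm-pytorch/ubi9-python-3.12/kustomize/base/pod.yaml",
    "runtimes/rocm-tensorflow/ubi9-python-3.12/kustomize/base/pod.yaml",
    "runtimes/tensorflow/ubi9-python-3.12/kustomize/base/pod.yaml",
]


def check_pod(path: Path) -> list[str]:
    pod = yaml.safe_load(path.read_text())
    spec = pod.get("spec", {})
    problems = []
    # CKV_K8S_23: expect pod-level runAsNonRoot (container-level settings not checked here).
    if spec.get("securityContext", {}).get("runAsNonRoot") is not True:
        problems.append(f"{path}: spec.securityContext.runAsNonRoot is not true")
    # CKV_K8S_20: every container should disable privilege escalation.
    for container in spec.get("containers", []):
        if container.get("securityContext", {}).get("allowPrivilegeEscalation") is not False:
            problems.append(f"{path}: container {container.get('name')!r} allows privilege escalation")
    return problems


if __name__ == "__main__":
    failures: list[str] = []
    for manifest in MANIFESTS:
        path = Path(manifest)
        if path.exists():
            failures.extend(check_pod(path))
    print("\n".join(failures) or "all pod manifests satisfy both checks")
    sys.exit(1 if failures else 0)
```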
🪛 LanguageTool
runtimes/datascience/ubi9-python-3.12/utils/requirements-elyra.txt
[grammar] ~6-~6: Consider using either the past participle “needed” or the present participle “needing” here.
Context: ...kfp.py#L1099, # in case the details are need please follow the comprehensive list of...
(BEEN_PART_AGREEMENT)
runtimes/minimal/ubi9-python-3.12/utils/requirements-elyra.txt
[grammar] ~6-~6: Consider using either the past participle “needed” or the present participle “needing” here.
Context: ...kfp.py#L1099, # in case the details are need please follow the comprehensive list of...
(BEEN_PART_AGREEMENT)
runtimes/pytorch/ubi9-python-3.12/utils/requirements-elyra.txt
[grammar] ~6-~6: Consider using either the past participle “needed” or the present participle “needing” here.
Context: ...kfp.py#L1099, # in case the details are need please follow the comprehensive list of...
(BEEN_PART_AGREEMENT)
runtimes/rocm-pytorch/ubi9-python-3.12/utils/requirements-elyra.txt
[grammar] ~6-~6: Consider using either the past participle “needed” or the present participle “needing” here.
Context: ...kfp.py#L1099, # in case the details are need please follow the comprehensive list of...
(BEEN_PART_AGREEMENT)
runtimes/rocm-tensorflow/ubi9-python-3.12/utils/requirements-elyra.txt
[grammar] ~6-~6: Consider using either the past participle “needed” or the present participle “needing” here.
Context: ...kfp.py#L1099, # in case the details are need please follow the comprehensive list of...
(BEEN_PART_AGREEMENT)
runtimes/tensorflow/ubi9-python-3.12/utils/requirements-elyra.txt
[grammar] ~6-~6: Consider using either the past participle “needed” or the present participle “needing” here.
Context: ...kfp.py#L1099, # in case the details are need please follow the comprehensive list of...
(BEEN_PART_AGREEMENT)
🪛 GitHub Check: Trivy
runtimes/minimal/ubi9-python-3.12/requirements.txt
[failure] 289-289: Jupyter Core on Windows Has Uncontrolled Search Path Element Local Privilege Escalation Vulnerability
Package: jupyter-core
Installed Version: 5.7.2
Vulnerability: CVE-2025-30167
Severity: HIGH
Fixed Version: 5.8.1
Link: CVE-2025-30167
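For the Trivy hit, the actual remediation is bumping the jupyter-core pin to 5.8.1 or later in the Pipfile and regenerating requirements.txt. The snippet below is only a hypothetical sanity check one could run inside the built image to confirm the installed version is past the fixed release:

```python
# Hedged helper: report whether the installed jupyter-core predates the CVE-2025-30167 fix.
from importlib.metadata import PackageNotFoundError, version

FIXED = (5, 8, 1)


def parse(ver: str) -> tuple[int, ...]:
    # naive parse, fine for plain x.y.z release strings
    return tuple(int(part) for part in ver.split(".")[:3])


try:
    installed = version("jupyter-core")
except PackageNotFoundError:
    print("jupyter-core is not installed in this environment")
else:
    status = "ok" if parse(installed) >= FIXED else "vulnerable to CVE-2025-30167"
    print(f"jupyter-core {installed}: {status}")
```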
🔇 Additional comments (4)
runtimes/minimal/ubi9-python-3.12/Dockerfile.cpu (3)
4-4: Base image pinning concern tracked in existing issue. The :latest tag usage violates Docker best practices for reproducible builds and security. This concern is already tracked in issue #1242 "Improve Docker FROM image versioning by avoiding :latest tags" for systematic resolution across the codebase.
28-31: Binary checksum verification tracked in existing issue. The oc client binary is downloaded without checksum verification, which poses a security risk. This concern is already covered by issue #1241 "Security: Add checksum verification for downloaded binaries in Python 3.12 images" for comprehensive resolution across all runtime images.
12-19: LGTM! Architecture-specific package handling is well implemented. The conditional installation of development tools for s390x architecture demonstrates good awareness of platform-specific requirements while keeping the image minimal for other architectures.
runtimes/tensorflow/ubi9-python-3.12/utils/bootstrapper.py (1)
1-769: Duplicate bootstrapper consolidation already tracked; no further changes needed. The bootstrapper code in runtimes/tensorflow/ubi9-python-3.12/utils/bootstrapper.py is part of the known duplication across all runtime directories and is already being consolidated under the existing issue created during PR #1333 review. No additional refactoring is required in this PR.
I'm seeing some of these already tracked in https://issues.redhat.com/browse/RHOAIENG-28824; otherwise looks good.
@coderabbitai full review
✅ Actions performed: Full review triggered.
@coderabbitai full
@jiridanek ✅ Actions performed: Full review triggered.
@jiridanek, I'll conduct a comprehensive review of this Python 3.12 runtime images PR.
Overall Assessment
This is a well-structured and comprehensive implementation that systematically adds Python 3.12 support across all runtime image variants. The approach maintains consistency with existing Python 3.11 patterns while properly updating all necessary components.
Key Strengths
✅ Comprehensive Coverage: Includes all runtime variants (minimal, datascience, pytorch, tensorflow, ROCm variants)
Areas for Improvement
1. Code Duplication (Known Pattern)
The bootstrapper files across different runtime environments are largely identical. This continues the established pattern from the Elyra project, but creates maintenance overhead. As noted in the learnings, this affects all runtime bootstrappers.
2. Dependency Management
3. Configuration Consistency
Technical Implementation
Dockerfiles
Kubernetes Manifests
Python Dependencies
Security Considerations
✅ User Context: Proper switching between root and non-root users
Recommendations
Conclusion
This is a solid implementation that maintains consistency with existing patterns while properly extending support to Python 3.12. The known dependency issue is already tracked and doesn't block the core functionality. The systematic approach and comprehensive coverage make this a valuable addition to the project.
Recommendation: Approve after confirming testing coverage for all runtime variants.
Fixes for https://issues.redhat.com/browse/RHOAIENG-28583?filter=-1
Description
Create Runtime Images for Python 3.12
How Has This Been Tested?
n/a
Merge criteria:
Summary by CodeRabbit
New Features
Chores