Merge from main #95

Merged
merged 76 commits on Jul 2, 2024
375c676
Added model download script for mixtral
anandhu-eng Jun 23, 2024
cd20e20
added script to get mixtral dataset
anandhu-eng Jun 24, 2024
4c090b6
Made dataset download for linux and windows working
anandhu-eng Jun 24, 2024
fc99336
Merge branch 'GATEOverflow:mlperf-inference' into getDatasetMistral
anandhu-eng Jun 24, 2024
92a005a
Delete script/get-dataset-mixtral/README.md
anandhu-eng Jun 24, 2024
ae3d7f7
Delete script/get-ml-model-mixtral/README.md
anandhu-eng Jun 24, 2024
967aa49
Fixed rclone issue
anandhu-eng Jun 24, 2024
ad995f5
Added dep for rclone
anandhu-eng Jun 24, 2024
02a5d26
Dependency call made conditional
anandhu-eng Jun 24, 2024
47143ba
Merge branch 'mlperf-inference' into getDatasetMistral
arjunsuresh Jun 24, 2024
0d93025
linked cm script for mixtral
anandhu-eng Jun 24, 2024
cceb22a
updated accuracy script for mixtral
anandhu-eng Jun 24, 2024
2d6e4c6
pipeline integrated - Tested on Linux only
anandhu-eng Jun 24, 2024
5e11956
deleted temp generated files
anandhu-eng Jun 24, 2024
676339e
Corrected model naming
anandhu-eng Jun 24, 2024
a4d7a32
Merge pull request #1 from anandhu-eng/modelRunTestMixtralMergeconfres
anandhu-eng Jun 24, 2024
ec27bac
Fixes for intel mlperf inference R50, performance run works now
arjunsuresh Jun 25, 2024
9a7dccb
Merge branch 'mlperf-inference' into modelRunTestMixtral
arjunsuresh Jun 25, 2024
55fc253
Merge pull request #77 from anandhu-eng/modelRunTestMixtral
arjunsuresh Jun 25, 2024
d7e1821
Merge pull request #76 from anandhu-eng/getDatasetMistral
arjunsuresh Jun 25, 2024
07c42f7
Merge branch 'mlcommons:mlperf-inference' into mlperf-inference
arjunsuresh Jun 25, 2024
a07fbee
Merge pull request #85 from mlcommons/dev
gfursin Jun 25, 2024
781968a
Use full dataset for imagenet mount
arjunsuresh Jun 25, 2024
e4e877c
Merge branch 'mlcommons:mlperf-inference' into mlperf-inference
arjunsuresh Jun 25, 2024
60a9ba9
Fix formatting of process-mlperf-accuracy
arjunsuresh Jun 25, 2024
12aac48
Fixed batch file taking invalid link rclone
anandhu-eng Jun 25, 2024
99f04ce
Merge branch 'GATEOverflow:mlperf-inference' into fixGithubActionError
anandhu-eng Jun 25, 2024
882f7d2
handled path to run.sh
anandhu-eng Jun 25, 2024
8dd56e4
fixed mismatched f-string
anandhu-eng Jun 25, 2024
f7d47be
remove docker - ubuntu gh
anandhu-eng Jun 25, 2024
7b63af2
remove temp generated files
anandhu-eng Jun 25, 2024
12a5227
Support mlperf inference 4.1 code base
arjunsuresh Jun 25, 2024
e71164e
Fix rclone local install on unix
arjunsuresh Jun 25, 2024
9d27d6e
Fix rclone local install on unix
arjunsuresh Jun 25, 2024
19f9299
Fix model path for llama2
arjunsuresh Jun 26, 2024
347fa9a
cleanups
arjunsuresh Jun 26, 2024
f533592
Merge pull request #87 from GATEOverflow/mlperf-inference
arjunsuresh Jun 26, 2024
eb08d57
Run ABTF POC test on pull request branch
arjunsuresh Jun 26, 2024
588cd04
Path appended rather than overwriting
anandhu-eng Jun 26, 2024
9198a4a
clean temp files
anandhu-eng Jun 26, 2024
0512601
Merge branch 'mlperf-inference' into fixGithubActionError
anandhu-eng Jun 26, 2024
69bd060
fixed git conflict
anandhu-eng Jun 26, 2024
e1ca12b
removed temporary generated files
anandhu-eng Jun 26, 2024
f8ee44b
added gcc tag and handled filename bat issue
anandhu-eng Jun 26, 2024
e6f0853
added windows gh action
anandhu-eng Jun 26, 2024
0c25b88
Merge branch 'mlcommons:mlperf-inference' into mlperf-inference
arjunsuresh Jun 26, 2024
1dd89b8
modified windows gh action
anandhu-eng Jun 26, 2024
7986561
reverted gcc tag
anandhu-eng Jun 26, 2024
e5702dc
Merge branch 'mlperf-inference' into fixGithubActionError
arjunsuresh Jun 26, 2024
098785e
Merge pull request #78 from anandhu-eng/fixGithubActionError
arjunsuresh Jun 26, 2024
28677ef
Fix typo in ipex install
arjunsuresh Jun 27, 2024
cf463e8
Added intel mlperf inference retinanet changes
arjunsuresh Jun 28, 2024
6a5e6bf
fixes to intel mlperf inference retinanet
arjunsuresh Jun 28, 2024
1fba307
Fix retinanet deps for intel mlperf inference
arjunsuresh Jun 28, 2024
af02a2c
Fixes for intel mlperf inference retinanet
arjunsuresh Jun 29, 2024
47ed8c5
Fixes for intel mlperf inference retinanet
arjunsuresh Jun 29, 2024
4ccbd86
Fixes to retinanet intel mlperf inference (build now works)
arjunsuresh Jun 29, 2024
f3f104a
Fixes to retinanet intel mlperf inference dataset path (run now works…
arjunsuresh Jun 30, 2024
fbe63eb
Fixes the run configs for retinanet intel mlperf inference
arjunsuresh Jun 30, 2024
cfef1ab
Use -n for git cherry-pick
arjunsuresh Jun 30, 2024
ecbf449
Fix - selecting multiple default variation under the same group
anandhu-eng Jun 30, 2024
ea6dd22
Merge branch 'GATEOverflow:mlperf-inference' into nvidia-stable-diffu…
anandhu-eng Jun 30, 2024
ecdb5af
Merge pull request #81 from anandhu-eng/nvidia-stable-diffusion
arjunsuresh Jun 30, 2024
d6fe2e9
Cleanups for intel conda env
arjunsuresh Jun 30, 2024
54b898f
Fix the mlcommons_loadgen build path
arjunsuresh Jun 30, 2024
253dd5b
Fix deps for intel mlperf-inference-retinanet
arjunsuresh Jun 30, 2024
3f173cb
Merge pull request #91 from GATEOverflow/mlperf-inference
arjunsuresh Jul 1, 2024
2211f4a
Fix a possibility of env corruption for get-dataset-openimages
arjunsuresh Jul 1, 2024
9ac1fab
Added Intel mlperf inference 3d-unet (WIP)
arjunsuresh Jul 1, 2024
0f344b7
Fixes for intel mlperf inference 3d-unet (run_configs added)
arjunsuresh Jul 1, 2024
300b83f
Merge branch 'mlcommons:mlperf-inference' into mlperf-inference
arjunsuresh Jul 2, 2024
3826598
Final fixes for intel mlperf inference 3d-unet
arjunsuresh Jul 2, 2024
e888ddc
Added better default values for test QPS for large mlperf inference m…
arjunsuresh Jul 2, 2024
58b1bd7
Merge pull request #93 from GATEOverflow/mlperf-inference
arjunsuresh Jul 2, 2024
61440ca
Merge branch 'main' into mlperf-inference
arjunsuresh Jul 2, 2024
b18ff89
Merge pull request #94 from anandhu-eng/mlperf-inference
arjunsuresh Jul 2, 2024
32 changes: 29 additions & 3 deletions .github/workflows/test-mlperf-inference-abtf-poc.yml
@@ -3,7 +3,7 @@
name: MLPerf inference ABTF POC Test

on:
-pull_request_target:
+pull_request:
branches: [ "main", "mlperf-inference" ]
paths:
- '.github/workflows/test-mlperf-inference-abtf-poc.yml'
@@ -37,7 +37,7 @@ jobs:
cm pull repo mlcommons@cm4abtf --branch=poc
- name: Test MLPerf Inference ABTF POC using ${{ matrix.backend }} on docker
run: |
-cm run script --tags=run-abtf,inference,_poc-demo --adr.compiler.tags=gcc --quiet --docker --docker_it=no -v --gh_token=${{ secrets.ABTF_ACCESS_TOKEN }}
+cm run script --tags=run-abtf,inference,_poc-demo --adr.compiler.tags=gcc --quiet -v

build2:
runs-on: ${{ matrix.os }}
@@ -65,5 +65,31 @@ jobs:
cm pull repo mlcommons@cm4abtf --branch=poc
- name: Test MLPerf Inference ABTF POC using ${{ matrix.backend }} on ${{ matrix.os }}
run: |
-cm run script --tags=run-abtf,inference,_poc-demo --adr.compiler.tags=gcc --quiet -v --gh_token=${{ secrets.ABTF_ACCESS_TOKEN }}
+cm run script --tags=run-abtf,inference,_poc-demo --adr.compiler.tags=gcc --quiet -v

build3:
runs-on: ${{ matrix.os }}
strategy:
fail-fast: false
matrix:
os: [windows-latest]
python-version: [ "3.8", "3.12" ]
backend: [ "pytorch" ]
implementation: [ "python" ]
exclude:
- python-version: "3.8"

steps:
- uses: actions/checkout@v3
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v3
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
run: |
python3 -m pip install cmind
cm pull repo --url=${{ github.event.pull_request.head.repo.html_url }} --checkout=${{ github.event.pull_request.head.ref }}
cm pull repo mlcommons@cm4abtf --branch=poc
- name: Test MLPerf Inference ABTF POC using ${{ matrix.backend }} on ${{ matrix.os }}
run: |
cm run script --tags=run-abtf,inference,_poc-demo --quiet --env.CM_MLPERF_LOADGEN_BUILD_FROM_SRC=off -v
8 changes: 6 additions & 2 deletions automation/script/module.py
@@ -3,7 +3,7 @@
# as portable and reusable automation recipes with simple tags, native scripts
# and a unified CLI, Python API and JSON/YAML meta descriptions.
#
-# This is a stable prototype of the CM script automation
+# This is a stable prototype of the CM script automation being developed by Grigori Fursin and Arjun Suresh
#
# TBD: when we have bandwidth and resources, we should refactor it
# and make it cleaner and simpler while keeping full backwards compatibility.
@@ -4443,7 +4443,11 @@ def update_env_with_values(env, fail_on_not_found=False):
if tmp_value not in env and fail_on_not_found:
return {'return':1, 'error':'variable {} is not in env'.format(tmp_value)}
if tmp_value in env:
-value = value.replace("<<<"+tmp_value+">>>", str(env[tmp_value]))
+if type(value) == str:
+    value = value.replace("<<<"+tmp_value+">>>", str(env[tmp_value]))
+elif type(value) == list:
+    for i,val in enumerate(value):
+        value[i] = value[i].replace("<<<"+tmp_value+">>>", str(env[tmp_value]))

env[key] = value

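The `module.py` change above extends `<<<var>>>` placeholder substitution so it also works when an env value is a list of strings, not just a plain string. A minimal standalone sketch of that logic (simplified from the CM implementation; the regex and function shape here are our assumptions, not the exact upstream code):

```python
import re

def update_env_with_values(env):
    # Substitute <<<var>>> placeholders in each env value, handling both
    # plain strings and lists of strings (the case added by this PR).
    for key, value in env.items():
        for tmp_value in re.findall(r'<<<(.*?)>>>', str(value)):
            if tmp_value not in env:
                continue
            replacement = str(env[tmp_value])
            if isinstance(value, str):
                value = value.replace("<<<" + tmp_value + ">>>", replacement)
            elif isinstance(value, list):
                value = [v.replace("<<<" + tmp_value + ">>>", replacement)
                         for v in value]
        env[key] = value
    return env

env = update_env_with_values({
    "CM_PATH": "/opt/cm",
    "CM_MSG": "repo at <<<CM_PATH>>>",
    "CM_ARGS": ["--path=<<<CM_PATH>>>", "--quiet"],
})
print(env["CM_MSG"])   # repo at /opt/cm
print(env["CM_ARGS"])  # ['--path=/opt/cm', '--quiet']
```

Without the list branch, a list-valued entry would fall through untouched, which is the bug the diff fixes.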
41 changes: 41 additions & 0 deletions cm-repro/cm-run-script-input.json
@@ -0,0 +1,41 @@
{
"action": "run",
"automation": "script",
"tags": "run-mlperf,inference,_find-performance,_full",
"model": "mixtral-8x7b-99",
"implementation": "reference",
"framework": "pytorch",
"category": "edge",
"scenario": "Offline",
"execution_mode": "test",
"device": "cuda",
"test_query_count": "100",
"adr": {
"cuda": {
"version": "12.4.1"
}
},
"quiet": true,
"repro": true,
"cmd": [
"--tags=run-mlperf,inference,_find-performance,_full",
"--model=mixtral-8x7b-99",
"--implementation=reference",
"--framework=pytorch",
"--category=edge",
"--scenario=Offline",
"--execution_mode=test",
"--device=cuda",
"--test_query_count=100",
"--adr.cuda.version=12.4.1",
"--quiet",
"--repro"
],
"out": "con",
"parsed_automation": [
[
"script",
"5b4e0237da074764"
]
]
}
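The `cmd` array in this repro file records each CLI flag separately. Joining it back yields the original invocation; a small sketch, assuming `cm <action> <automation>` as the command prefix (the variable name `repro_input` is ours, and the flag list is trimmed for brevity):

```python
import json

# A trimmed copy of cm-run-script-input.json from the diff above.
repro_input = json.loads("""
{
  "action": "run",
  "automation": "script",
  "cmd": [
    "--tags=run-mlperf,inference,_find-performance,_full",
    "--model=mixtral-8x7b-99",
    "--device=cuda",
    "--quiet",
    "--repro"
  ]
}
""")

# Rebuild the shell command: "cm <action> <automation> <flags...>"
command = " ".join(["cm", repro_input["action"], repro_input["automation"],
                    *repro_input["cmd"]])
print(command)
# cm run script --tags=run-mlperf,inference,_find-performance,_full --model=mixtral-8x7b-99 --device=cuda --quiet --repro
```

This is why `--repro` is useful: the JSON captures enough of the original call to replay it later.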