
build(deps): bump flash-attn from 2.5.9.post1 to 2.6.1 #45

Merged
merged 1 commit into main from dependabot/pip/flash-attn-2.6.1 on Jul 19, 2024

Conversation

dependabot[bot]
Contributor

@dependabot dependabot bot commented on behalf of github Jul 15, 2024

Bumps flash-attn from 2.5.9.post1 to 2.6.1.
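
A quick way to confirm the bump actually landed in an environment is to query the installed distribution. A minimal sketch using only the standard library (nothing here is specific to this repo; the distribution name is the one being pinned):

```python
# Check that the environment picked up the bumped pin.
from importlib.metadata import version

# "flash-attn" is the PyPI distribution name from this PR's pin;
# recent importlib.metadata versions normalize the -/_ spelling.
installed = version("flash-attn")
print(installed)  # expected to print "2.6.1" once this PR's pin is installed
```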

Commits
  • 7551202 Bump to v2.6.1
  • 844912d [CI] Switch from CUDA 12.2 to 12.3
  • 40e534a Implement cache_leftpad
  • 116b05f [CI] Compile with pytorch 2.4.0.dev20240514
  • da11d1b Bump v2.6.0
  • d0787ac Relax dropout_fraction test
  • dca6d89 Don't support softcap and dropout at the same time
  • 81e01ef More typo fixes
  • 72e27c6 Fix typo with softcapping
  • 3d41db3 Only test backward if there's no softcapping
  • Additional commits viewable in compare view
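
Several of these commits touch the softcap option introduced in the 2.6 line, which per the changelog cannot be combined with dropout. A minimal sketch of exercising it, assuming a CUDA device and the `softcap` keyword as shipped in flash-attn 2.6.x; tensor shapes are purely illustrative:

```python
# Hedged sketch: exercising the 2.6.x softcap option noted in the commit log.
# Assumes a CUDA GPU and flash-attn >= 2.6.0.
# Shapes are (batch, seqlen, nheads, headdim).
import torch
from flash_attn import flash_attn_func

q = torch.randn(2, 1024, 8, 64, device="cuda", dtype=torch.float16)
k = torch.randn(2, 1024, 8, 64, device="cuda", dtype=torch.float16)
v = torch.randn(2, 1024, 8, 64, device="cuda", dtype=torch.float16)

# Per "Don't support softcap and dropout at the same time",
# dropout_p must stay 0.0 when softcap is nonzero.
out = flash_attn_func(q, k, v, dropout_p=0.0, causal=True, softcap=30.0)
print(out.shape)  # torch.Size([2, 1024, 8, 64])
```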

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
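
Since these commands are ordinary PR comments, they can also be issued programmatically. A minimal sketch posting one through the GitHub REST API; the repo slug and token handling are placeholders, not values taken from this page:

```python
# Hedged sketch: issuing a Dependabot command by posting a PR comment
# via the GitHub REST API. REPO is a placeholder; this page does not
# show the actual owner/repo slug.
import os
import requests

GITHUB_API = "https://api.github.com"
REPO = "owner/repo"  # placeholder
PR_NUMBER = 45

resp = requests.post(
    f"{GITHUB_API}/repos/{REPO}/issues/{PR_NUMBER}/comments",
    headers={
        "Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}",
        "Accept": "application/vnd.github+json",
    },
    # Dependabot picks the command up from the comment body.
    json={"body": "@dependabot rebase"},
)
resp.raise_for_status()
```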

@dependabot dependabot bot added the dependencies and python labels Jul 15, 2024
@dependabot dependabot bot force-pushed the dependabot/pip/flash-attn-2.6.1 branch 2 times, most recently from c8d0d0d to b9a2bf2 on July 18, 2024 16:50
@dtrifiro dtrifiro force-pushed the dependabot/pip/flash-attn-2.6.1 branch from b9a2bf2 to 41f74f3 on July 19, 2024 10:54
@codecov-commenter

codecov-commenter commented Jul 19, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 63.03%. Comparing base (5618c0d) to head (7ba8a34).

Additional details and impacted files
@@           Coverage Diff           @@
##             main      #45   +/-   ##
=======================================
  Coverage   63.03%   63.03%           
=======================================
  Files          18       18           
  Lines        1285     1285           
  Branches      228      228           
=======================================
  Hits          810      810           
  Misses        398      398           
  Partials       77       77           


@dtrifiro dtrifiro enabled auto-merge July 19, 2024 16:15
Commit message:
Bumps [flash-attn](https://github.com/Dao-AILab/flash-attention) from 2.5.9.post1 to 2.6.1.
- [Release notes](https://github.com/Dao-AILab/flash-attention/releases)
- [Commits](Dao-AILab/flash-attention@v2.5.9.post1...v2.6.1)

---
updated-dependencies:
- dependency-name: flash-attn
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
@dtrifiro dtrifiro force-pushed the dependabot/pip/flash-attn-2.6.1 branch from 41f74f3 to 7ba8a34 on July 19, 2024 16:23
@dtrifiro dtrifiro added this pull request to the merge queue Jul 19, 2024
Merged via the queue into main with commit 76cce9d Jul 19, 2024
3 checks passed
@dtrifiro dtrifiro deleted the dependabot/pip/flash-attn-2.6.1 branch July 19, 2024 16:39
Labels
dependencies: Pull requests that update a dependency file
python: Pull requests that update Python code
2 participants