Commit history

Author               SHA1        Date         Message
AUTOMATIC1111        806ea639e6  2 years ago  Merge pull request #11066 from aljungberg/patch-1
Alexander Ljungberg  d9cc0910c8  2 years ago  Fix upcast attention dtype error.
AUTOMATIC1111        56bf522913  2 years ago  Merge pull request #10990 from vkage/sd_hijack_optimizations_bugfix
AUTOMATIC            2e23c9c568  2 years ago  fix the broken line for #10990
Vivek K. Vasishtha   b1a72bc7e2  2 years ago  torch.cuda.is_available() check for SdOptimizationXformers
AUTOMATIC            3ee1238630  2 years ago  revert default cross attention optimization to Doggettx
AUTOMATIC            36888092af  2 years ago  revert default cross attention optimization to Doggettx
AUTOMATIC            05933840f0  2 years ago  rename print_error to report, use it with together with package name
Aarni Koskela        00dfe27f59  2 years ago  Add & use modules.errors.print_error where currently printing exception info by hand
Aarni Koskela        df004be2fc  2 years ago  Add a couple `from __future__ import annotations`es for Py3.9 compat
AUTOMATIC1111        1e5afd4fa9  2 years ago  Apply suggestions from code review
AUTOMATIC            8a3d232839  2 years ago  fix linter issues
AUTOMATIC            2582a0fd3b  2 years ago  make it possible for scripts to add cross attention optimizations
Aarni Koskela        49a55b410b  2 years ago  Autofix Ruff W (not W605) (mostly whitespace)
AUTOMATIC            028d3f6425  2 years ago  ruff auto fixes
AUTOMATIC            762265eab5  2 years ago  autofixes from ruff
brkirch              7aab389d6f  2 years ago  Fix for Unet NaNs
FNSpd                280ed8f00f  2 years ago  Update sd_hijack_optimizations.py
FNSpd                c84c9df737  2 years ago  Update sd_hijack_optimizations.py
Pam                  8d7fa2f67c  2 years ago  sdp_attnblock_forward hijack
Pam                  37acba2633  2 years ago  argument to disable memory efficient for sdp
Pam                  fec0a89511  2 years ago  scaled dot product attention
brkirch              e3b53fd295  2 years ago  Add UI setting for upcasting attention to float32
AUTOMATIC            59146621e2  2 years ago  better support for xformers flash attention on older versions of torch
Takuma Mori          3262e825cc  2 years ago  add --xformers-flash-attention option & impl
AUTOMATIC            40ff6db532  2 years ago  extra networks UI
brkirch              c18add68ef  2 years ago  Added license
brkirch              b95a4c0ce5  2 years ago  Change sub-quad chunk threshold to use percentage
brkirch              d782a95967  2 years ago  Add Birch-san's sub-quadratic attention implementation
brkirch              35b1775b32  2 years ago  Use other MPS optimization for large q.shape[0] * q.shape[1]
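
For context on the scaled dot product attention commits above (fec0a89511, 37acba2633, 8d7fa2f67c), the sketch below shows the PyTorch 2.0 API they build on. It is a minimal illustration under assumed tensor shapes, not the repository's actual sd_hijack_optimizations.py code.

```python
# Illustrative sketch of the torch >= 2.0 API behind the "scaled dot product
# attention" commits; the shapes here are assumptions, not the webui's code.
import torch
import torch.nn.functional as F

# (batch, heads, tokens, head_dim) layout, chosen for illustration only
q = torch.randn(2, 8, 77, 40)
k = torch.randn(2, 8, 77, 40)
v = torch.randn(2, 8, 77, 40)

# Upcasting to float32 before attention (cf. "Add UI setting for upcasting
# attention to float32") avoids NaNs from fp16 overflow in the softmax.
out = F.scaled_dot_product_attention(q.float(), k.float(), v.float())

# "argument to disable memory efficient for sdp" corresponds to constraining
# which backend torch may pick; guarded with torch.cuda.is_available() in the
# spirit of the SdOptimizationXformers check commit.
if torch.cuda.is_available():
    with torch.backends.cuda.sdp_kernel(enable_mem_efficient=False):
        out = F.scaled_dot_product_attention(q.cuda(), k.cuda(), v.cuda())
```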