cc0258aea7 check for ampere without destroying the optimizations. again. (C43H66N12O12S2, 2 years ago)
017b6b8744 check for ampere (C43H66N12O12S2, 2 years ago)
cfc33f99d4 why did you do this (AUTOMATIC, 2 years ago)
27032c47df restore old opt_split_attention/disable_opt_split_attention logic (AUTOMATIC, 2 years ago)
dc1117233e simplify xfrmers options: --xformers to enable and that's it (AUTOMATIC, 2 years ago)
48feae37ff Merge pull request #1851 from C43H66N12O12S2/flash (AUTOMATIC1111, 2 years ago)
970de9ee68 Update sd_hijack.py (C43H66N12O12S2, 2 years ago)
26b459a379 default to split attention if cuda is available and xformers is not (C43H66N12O12S2, 2 years ago)
5f85a74b00 fix bug where when using prompt composition, hijack_comments generated before the final AND will be dropped (MrCheeze, 2 years ago)
77f4237d1c fix bugs related to variable prompt lengths (AUTOMATIC, 2 years ago)
4999eb2ef9 do not let user choose his own prompt token count limit (AUTOMATIC, 2 years ago)
706d5944a0 let user choose his own prompt token count limit (AUTOMATIC, 2 years ago)
91d66f5520 use new attnblock for xformers path (C43H66N12O12S2, 2 years ago)
b70eaeb200 delete broken and unnecessary aliases (C43H66N12O12S2, 2 years ago)
12c4d5c6b5 hypernetwork training mk1 (AUTOMATIC, 2 years ago)
f7c787eb7c make it possible to use hypernetworks without opt split attention (AUTOMATIC, 2 years ago)
5e3ff846c5 Update sd_hijack.py (C43H66N12O12S2, 2 years ago)
5303df2428 Update sd_hijack.py (C43H66N12O12S2, 2 years ago)
35d6b23162 Update sd_hijack.py (C43H66N12O12S2, 2 years ago)
2eb911b056 Update sd_hijack.py (C43H66N12O12S2, 2 years ago)
ad0cc85d1f Merge branch 'master' into stable (Jairo Correa, 2 years ago)
88ec0cf557 fix for incorrect embedding token length calculation (will break seeds that use embeddings, you're welcome!) (AUTOMATIC, 2 years ago)
820f1dc96b initial support for training textual inversion (AUTOMATIC, 2 years ago)
ad1fbbae93 Merge branch 'master' into fix-vram (Jairo Correa, 2 years ago)
98cc6c6e74 add embeddings dir (AUTOMATIC, 2 years ago)
c715ef04d1 fix for incorrect model weight loading for #814 (AUTOMATIC, 2 years ago)
c1c27dad3b new implementation for attention/emphasis (AUTOMATIC, 2 years ago)
c2d5b29040 Move silu to sd_hijack (Jairo Correa, 2 years ago)
e5707b66d6 switched the token counter to use hidden buttons instead of api call (Liam, 2 years ago)
5034f7d759 added token counter next to txt2img and img2img prompts (Liam, 2 years ago)