Commit History

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| AUTOMATIC | 2b91251637 | removed aesthetic gradients as built-in | 2 years ago |
| AUTOMATIC | 9286fe53de | make aesthetic embedding compatible with prompts longer than 75 tokens | 2 years ago |
| AUTOMATIC | 7d6b388d71 | Merge branch 'ae' | 2 years ago |
| C43H66N12O12S2 | 73b5dbf72a | Update sd_hijack.py | 2 years ago |
| C43H66N12O12S2 | 786ed49922 | use legacy attnblock | 2 years ago |
| MalumaDev | 9324cdaa31 | ui fix, reorganization of the code | 2 years ago |
| MalumaDev | e4f8b5f00d | ui fix | 2 years ago |
| MalumaDev | 523140d780 | ui fix | 2 years ago |
| MalumaDev | b694bba39a | Merge remote-tracking branch 'origin/test_resolve_conflicts' into test_resolve_conflicts | 2 years ago |
| MalumaDev | 9325c85f78 | fixed dropbox update | 2 years ago |
| MalumaDev | 97ceaa23d0 | Merge branch 'master' into test_resolve_conflicts | 2 years ago |
| C43H66N12O12S2 | 529afbf4d7 | Update sd_hijack.py | 2 years ago |
| MalumaDev | 37d7ffb415 | fix token length, added embedding generator, add new features to edit the embedding before generation using text | 2 years ago |
| MalumaDev | bb57f30c2d | init | 2 years ago |
| AUTOMATIC | 429442f4a6 | fix iterator bug for #2295 | 2 years ago |
| hentailord85ez | 80f3cf2bb2 | Account for when lines are mismatched | 2 years ago |
| brkirch | 98fd5cde72 | Add check for psutil | 2 years ago |
| brkirch | c0484f1b98 | Add cross-attention optimization from InvokeAI | 2 years ago |
| AUTOMATIC | 873efeed49 | rename hypernetwork dir to hypernetworks to prevent clash with an old filename that people who use zip instead of git clone will have | 2 years ago |
| AUTOMATIC | 5de806184f | Merge branch 'master' into hypernetwork-training | 2 years ago |
| hentailord85ez | 5e2627a1a6 | Comma backtrack padding (#2192) | 2 years ago |
| C43H66N12O12S2 | 623251ce2b | allow Pascal onwards | 2 years ago |
| hentailord85ez | d5c14365fd | Add back in output hidden states parameter | 2 years ago |
| hentailord85ez | 460bbae587 | Pad beginning of textual inversion embedding | 2 years ago |
| hentailord85ez | b340439586 | Unlimited Token Works | 2 years ago |
| Fampai | 1824e9ee3a | Removed unnecessary tmp variable | 2 years ago |
| Fampai | ad3ae44108 | Updated code for legibility | 2 years ago |
| Fampai | e59c66c008 | Optimized code for ignoring last CLIP layers | 2 years ago |
| Fampai | 1371d7608b | Added ability to ignore last n layers in FrozenCLIPEmbedder | 2 years ago |
| AUTOMATIC | 3061cdb7b6 | add --force-enable-xformers option and also add messages to console regarding cross attention optimizations | 2 years ago |