True Goon Connoisseur
Discussion of Free and Open Source Text-to-Image Models
Prev: >>105610994

https://rentry.org/ldg-lazy-getting-started-guide

>UI
SwarmUI: https://github.com/mcmonkeyprojects/SwarmUI
re/Forge/Classic: https://rentry.org/ldg-lazy-getting-started-guide#reforgeclassic
SD.Next: https://github.com/vladmandic/sdnext
ComfyUI: https://github.com/comfyanonymous/ComfyUI

>Models, LoRAs, & Upscalers
https://civitai.com
https://civitaiarchive.com
https://tensor.art
https://openmodeldb.info

>Cook
https://github.com/spacepxl/demystifying-sd-finetuning
https://github.com/Nerogar/OneTrainer
https://github.com/kohya-ss/sd-scripts/tree/sd3
https://github.com/derrian-distro/LoRA_Easy_Training_Scripts
https://github.com/tdrussell/diffusion-pipe

>Chroma
Training: https://rentry.org/mvu52t46

>WanX (video)
https://rentry.org/wan21kjguide
https://github.com/Wan-Video/Wan2.1

>Misc
Share Metadata: https://catbox.moe | https://litterbox.catbox.moe/
Img2Prompt: https://huggingface.co/spaces/fancyfeast/joy-caption-beta-one
Archive: https://rentry.org/sdg-link
Samplers: https://stable-diffusion-art.com/samplers/
Txt2Img Plugin: https://github.com/Acly/krita-ai-diffusion
Bakery: https://rentry.org/ldgcollage | https://rentry.org/ldgtemplate
Local Model Meta: https://rentry.org/localmodelsmeta

>Neighbors
https://rentry.org/ldg-lazy-getting-started-guide#rentry-from-other-boards
>>>/aco/csdg
>>>/b/degen
>>>/b/celeb+ai
>>>/gif/vdg
>>>/d/ddg
>>>/e/edg
>>>/h/hdg
>>>/trash/slop
>>>/vt/vtai
>>>/u/udg

>Local Text
>>>/g/lmg

>Maintain Thread Quality
https://rentry.org/debo
>>105615405 (OP)
What's best for face swap? I have flux dev and flux inpainting working on Comfy.
I know this sounds dumb, but is there a lora for a totally nonexistent woman, like a character, but not an anime one. Like someone's personal invented one, not from a game or whatever.
>>105615474
Most AI-generated images are already of women who don't exist. You don't need a lora.
can anon poast a migu or two pls
>>105615474and how did they train this lora on a person who doesn't exist? pure imagination?
Blessed thread of frenship
fill this fucking thread first
>>105611096 >>105611096 >>105611096
fucking retards
>>105615521since you asked nicely
>>105615521that was the spite troll bake tho let it die
Blessed thread of true friendship
>day four
>still no NAG implementation for flux, chroma, or sdxl
o v e r
>>105615560Is NAG supposed to make flux faster or have better prompt comprehension? Or both?
>>105615579It does affect speed I think, but that's not why people are interested
the main reason I want it is because it gives a negative prompt to Flux
>>105615405 (OP)Thank you for baking this thread, anon.
>>105615518Thank you for blessing this thread, anon.
https://huggingface.co/Kijai/WanVideo_comfy/blob/main/Wan21_T2V_14B_lightx2v_cfg_step_distill_lora_rank32.safetensors
chat, is it better than causvid?
>>105615474prompt for a random name and a couple specific features
https://github.com/Zehong-Ma/ComfyUI-MagCache/commit/a9a303729d3108e63b3a52711f5248735742114a
this works on chroma now
>>105615843is this a replacement for teacache?
>>105615859yeah, it's supposed to be better than teacache
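for the anons asking what these cache nodes actually do: teacache/magcache-style accelerators all ride the same trick, they skip the full transformer pass on steps where the output barely changes and reuse the previous step's residual. very rough sketch of the pattern below, the threshold logic and names are made up for illustration, this is not the actual MagCache code:

```python
import torch

def cached_denoise_loop(model, latents, sigmas, cond, skip_threshold=0.1):
    """Simplified residual-caching loop (TeaCache/MagCache-style idea, not the real code).

    On steps where the input has barely moved since the last computed step,
    reuse the cached model output instead of running the transformer again.
    """
    prev_residual = None   # model output from the last fully computed step
    prev_input = None      # latent input at that step, used to estimate change

    for i in range(len(sigmas) - 1):
        x = latents
        if prev_input is not None:
            # crude relative-change estimate between current and last computed input
            change = (x - prev_input).abs().mean() / (prev_input.abs().mean() + 1e-8)
        else:
            change = float("inf")  # always compute the first step

        if prev_residual is not None and change < skip_threshold:
            residual = prev_residual                 # skip the expensive forward pass
        else:
            residual = model(x, sigmas[i], cond)     # full transformer forward
            prev_residual, prev_input = residual, x

        # plain Euler update using the (possibly cached) prediction
        latents = x + (sigmas[i + 1] - sigmas[i]) * residual

    return latents
```
the speed/quality trade-off is basically that threshold, which is why these nodes expose it as a knob.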
>>105615868thank god, kijai getting btfo lately
>>105615890
I wish the NAG guys would do the same for Chroma too, Kijai only made it work on Wan
>>105615757
>https://huggingface.co/Kijai/WanVideo_comfy/blob/main/Wan21_T2V_14B_lightx2v_cfg_step_distill_lora_rank32.safetensors
this is really good
>4 steps
>LCM sampler
>works with I2V (I tested that one with wan i2v 480p)
>took 1:30 min on my 3090
>>105615953
it also works with NAG, dunno if NAG makes it better or worse so far (I noticed that for both of them the luminosity dropped, but it's less obvious with NAG on that one), needs more testing
>>105615953>>105616004need a harder prompt with more movement
>>105616018give me one, my imagination is so bad lol
>>105616031The man grabs his chest before starting to spin around while jumping and then running off into the distance.
>ranfaggot containment thread
>>105616045
>The man grabs his chest before starting to spin around while jumping and then running off into the distance.
kek, is this working at all if you go for Wan at 30+ steps though?
>>105615633
Yeah, a negative prompt would be nice. I used to use it in conjunction with dynamic thresholding but it never really worked well, so I stopped bothering.
>>105616097
I assume it wouldn't work well regardless, but that's the point of using a complex prompt to benchmark something. It seems like currently you need to either keep the prompt to mostly one action, or explain everything in absolute detail and hope it understands it better.
>>105616097
>I assume it wouldn't work well regardless but that's the point of using a complex prompt to benchmark something.
Yes, but the point is to see if opting for this lora maintains the quality of the "normal" Wan settings; I wouldn't expect it to make it better than what Wan can/can't do normally.
>>105615953works well with loras too
https://civitai.com/models/1439393/kamehameha-energy-beam-i-wan-21-i2v-lora
this amount of jiggle is totally acceptable for a distill lora
>>105616171I also tried to add SLG but it didn't change anything
https://github.com/Clybius/ComfyUI-ClybsChromaNodes
NAG for Chroma is here, let's goooo (the node name is ChromaNAG)
>>105616265
>install it
>errors out on gen
Amazing
>>105615405 (OP)update your neighbors list:
>>>/vp/napt
>>105616547Don't post a single "R" gen for a month straight and I'll think about it
https://www.youtube.com/watch?v=RXRy1uSuYmM
>1 step model
>can do a 1 minute long video
damn
>>105616566i dont negotiate with terrorists
>>105616566Those "R" girls are supposedly his OC Rocketgirls.
I am a Pokefag too, but that retard is too obnoxious and unoriginal, and can't even be bothered to make interesting things with the concept other than spamming images that were likely generated with the same prompt he never changes.
There is plenty of Rocketgirls pokemon fanart on the internet he could take inspiration from, but for some autistic reason he is obsessed with making the same multicolored-hair character in a bikini over and over, completely unrelated to the Team Rocket lore other than the red R
anon should apologize to miguposter
>>105610994 miguposter would bake when the main baker was away. we should all be grateful for him; in our darkest moments he was there for us
>>105616643No? Love it or hate it, the faggolage is an /ldg/ tradition and the script to make them isn't hard to install and use. Posting your own or other anon's single-gens is cringe and /sdg/ coded.
>>105616593I hope Alibaba copies Bytedance's homework and delivers us a Wan 3 with a new arch and optimizations.
But I am starting to wonder if they would even bother open sourcing it at all, since the community already gave them plenty of inference optimizations they likely hoped for
>>105616643>>105616625wtf am i reading
also my Rgirls are in a different workflow/style almost every single post kek
i think you need to relearn what "spam" actually is/means
>>105616682
>since the community already gave them plenty of inference optimizations they likely hoped for
not only that, but they got so much free advertisement from us. everyone knows what Wan is now, so people who want to try models in the API will think of going for that model at some point
>>105616643where were you when the thread fell off the catalog
>>105616662where were you when the thread fell off the catalog
>>105616700 >>105616671
>But I am starting to wonder if they would even bother open sourcing it at all
There must always be someone at the top of FOSS, collecting more optimizations and eyes on their project and model. There must always be the next release, if not Wan 3, then something else.
>>105616700>if not Wan 3, then something else.I wouldn't mind an image model, those guys know what they're doing
>>105616593
Pretty neat, can't wait to see the requirements. Someone is working on a 1-minute Wan version but they've been pretty silent: https://github.com/DualParal-Project/DualParal
These threads are proof that this board is retarded and that nobody really understands technology or programming or any of the mechanisms that make the world function.
I got the impression when half the threads were about <product> wars and <eceleb> opinion takes about the latest one-dimensional step forward in terms of performance that all equates to <bigger number> for <raytraced video game>.
I've browsed 10 of these threads and I am entirely convinced that none of you even know what "AI" is on the most basic level when it comes to generating photorealistic images that can fool the human eye and do everything in between. Like, the most basic level. Grandma knows what "AI" is but not the term "model" or "backpropagation", and has never bothered to learn because it's supposed to be off-limits in the land of magic tech-wizards waving their dick over a keyboard.
No one comes here for a nuanced discussion on how these things work, they're just here for tits and whatever random bullshit their mind can imagine, but still.
There's being retarded in one dimension and then there's "retard energy", and these threads are palpably retarded.
You give /g/ some of the most mystifying tech ever created and the depths of their creativity go as far as a bunch of pajeets and boomers on Facebook squinting at a kindle and going "Wow, technology sure has come far" while looking at an image of a corgi shooting a machine gun and wearing a cowboy hat.
>>105616709
>I wouldn't mind an image model
Reminder that Wan is also an image gen model, but it's too big and slopped to be used as such, so people skipped that.
But if they delivered something Seedream/Mogao tier, a big model that knows a fuckton of styles, non-slopped and with excellent image compositions, some non-vramlet anons such as myself would enjoy it
>>105616711
>guy who cries out in pain as he doesn't contribute to the quality he seeks and is, of course, a nogen
pottery. and also what you said is mostly a lie, given that people talk about research here whenever something of note drops, but when you have a fast general, you basically have a "live" chat where most comments will be from people who already know the things they want to know and are mostly talking about the newest bleeding-edge tech or, more likely, the problems they're having making it work
why would anyone talk about basic ML terminology with randoms online instead of just reading papers on it themselves? why would anyone who is heavily into research talk to randoms here instead of the fellow researchers he knows and works with? why would anyone implementing things primarily talk here instead of with other programmers on github?
you are retarded
>using llm bait
>>105616711noooo not muh booba! nooOoOoooo
what type of discussion are you attempting to have here on 4chan exactly?
no one cares what lora you are training retard
no one cares about your workflow
>not until its a finished product atleast ;3
>>105616772anon falls for it every time :-(
>>105615953it looks really good on 720p
>>105616700
>There must always be someone at the top of FOSS
We were betrayed in the past by Stability, then Tencent (hunyuan), and now apparently even BFL (it is starting to look like the Dev version of Flux Kontext will not happen). I wouldn't put it past them that a hypothetical Wan 3 may not be open
>>105616824>then Tencent (hunyuan)why tencent?
>>105616835
Right, but I'm saying someone is always at the top because of the benefits it provides, it doesn't matter who. It's an easy way for a new up-and-coming company, or a big one, to take a portion of the market at no cost, given they can't compete with the top closed models.
>>105616832Their new image gen + image editing model is not open.
Also some anons were seething in past threads that their best 3Dgen model was API-only. I saw they released a new open 3Dgen model, but I don't know how it compares to whatever they have in the API
>>105616824>Flux KontextThe version they want to open is trash anyway
>>105616832
I assume because they didn't open source 3d v2.5
>>105616835
at this point it's getting harder and harder for an upcoming company to release a local model that'll blow everyone's mind to the point that it'll be the only thing we talk about. if they release a video model we won't give a fuck unless it beats Wan, and if it's an image model we won't give a fuck unless it beats Flux/Chroma, the bar is kinda high now
>>105616824I almost forgive BFL since now the fur tuned schnell is looking like it'll become the new meta. I don't entirely forgive them, they made anon work for it, but they will have a hand in saving local after all.
>>105616859
I think they regret releasing an Apache 2.0 licence model (Schnell), because we managed to beat flux dev with it. that's the reason they'll only release Kontext dev: they don't want us to get a model with a cool licence that we could improve anymore, they really closed the gate this time
>>105616769Again, there's specific retardation and then there's just general retardation
These threads are the second one, it's "retarded energy" where a bunch of people scratch their chins and laugh as they come up with an infinite number of combinations of adjectives for prompts like "Keanu Reeves holding a lightsaber and doing a kickflip over a giraffe wearing a tophat"
If you show someone Wi-Fi that has never heard of it before, the response from someone with half a brain is "How is it possible for devices to interact with no visible mechanism? What is this sorcery?"
You show it to /g/ - Technology and they say "aw sick I can stream my hentai without having to get the ethernet cable"
>calm down
No, fuck off. This isn't reddit.
Figure out how to bang enough neurons around in your cranium to contribute something more substantial than these absolute mind-numbingly stupid posts. Please.
This is slop. It is the worst general on /g/.
The worst general on /ck/ is a bunch of walruses talking about the new McChicken going "uhhh I wish I could eat that".
The worst general on /pol/ is a bunch of nincompoops talking about slop breadtuber drama.
It's borderline /b/-worthy. It's related to technology only in the most literal sense.
I can't do anything about it because /qa/ doesn't exist, so keep doing this. I'm just popping in to say that these threads are absolutely braindead and it's not compatible with /g/ acting like some kind of authority on tech.
>>105616852
Sure, it doesn't have to be some random no-name company; there are enough big players who all benefit from opening their models, since not all of them can be at the top in closed models, so if you're falling behind you might as well release the work you've already done as open source.
Also, engineers in big companies can leave and, with some fresh training knowledge, easily create new companies and release a very good model; this has happened many times over already, although usually not open source.
But at the end of the day, the longer it goes without anyone releasing an open model, the more optimizations pile up, and the work required to beat the top open models that already exist gets lower and lower, until more and more companies can easily do it and then release a model to sit on top of the open source community.
But there is also no point in thinking about this given the actual reality and competitiveness of the market makes us get a new big model every few months for most modalities anyway.
>>105616872
yeah you tried too hard with that one, pissing and shitting all over the place while continuing to not contribute anything again. thanks for being the funniest clown in the circus doe, kek
LDG paves the way for everyone else desu. All other AI threads are downstream of this one you realize after awhile desubeit.
>>105616864I'm sure whoever's in charge of the money regrets it, but surely there's at least one chad on that team who knew what would actually happen.
>>105616900>more white men bait in the threadwow
I just hope some autist fine-tunes Chroma on image editing stuff once training is done. Or at the very least makes an inpaint version of the model (like Flux Fill)
>>105617110lmao suck my dick
>>105617110>>105617114Hahaha! This is ranfaggot's safe space, this is why he made /ldg/ in the first place. He couldn't get enough attention in /sdg/ or even in his discord server.
>>105617120surely if you spam that lie enough you can trick newfags into joining your dead shitty thread.
>>105617132you just use the workflow from here
https://rentry.org/wan21kjguide
then you add the distill lora on top of it at strength 1, you go for cfg 1, 4 steps and lcm sampler
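if you'd rather script those settings than click through the graph, they map roughly onto diffusers like this. sketch only: class and argument names are from memory of recent diffusers Wan support so double-check the docs, Kijai's lora is in ComfyUI layout so it may need converting before load_lora_weights accepts it, and there's no exact 1:1 for the lcm sampler here, so this keeps the default UniPC scheduler and only sets the shift:

```python
import torch
from diffusers import WanPipeline, UniPCMultistepScheduler
from diffusers.utils import export_to_video

# T2V pipeline, since that's what the distill lora was extracted from
pipe = WanPipeline.from_pretrained(
    "Wan-AI/Wan2.1-T2V-14B-Diffusers", torch_dtype=torch.bfloat16
)
# shift 8 as in the thread settings (flow/timestep shift)
pipe.scheduler = UniPCMultistepScheduler.from_config(pipe.scheduler.config, flow_shift=8.0)
pipe.to("cuda")

# the distill lora at strength 1.0 (default strength after loading)
pipe.load_lora_weights(
    ".", weight_name="Wan21_T2V_14B_lightx2v_cfg_step_distill_lora_rank32.safetensors"
)

video = pipe(
    prompt="anime girl Miku Hatsune eating a McDonalds cheeseburger.",
    num_inference_steps=4,   # 4 steps instead of the usual 20-30
    guidance_scale=1.0,      # cfg 1: no separate negative pass, which is why NAG is interesting
    num_frames=81,
    height=480,
    width=832,
).frames[0]
export_to_video(video, "miku_burger.mp4", fps=16)
```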
>>105616004>dunno if NAG makes it better or worse so farI think it makes it better
>>105616906schizo website
https://huggingface.co/Kijai/WanVideo_comfy/blob/main/Wan21_T2V_14B_lightx2v_cfg_step_distill_lora_rank32.safetensors
this shit is magic.
use a causvid workflow or any lora workflow. settings: 1.0 strength, cfg 1, shift 8, steps 4, lcm scheduler
anime girl Miku Hatsune eating a Mcdonalds cheeseburger.
>>105617193
Which node are you using for NAG?
>>105617193that one
https://github.com/kijai/ComfyUI-KJNodes
what is NAG and how does it work with this new lora, which seems fine by itself?
>>105617141Why are you so obsessed?
>>105616872>calling this the worst general on g when aicg exists
miku puts on a black leather jacket
open source has done it, fast i2v is now feasible. 80s with interpolating
>>105617186Says t2v though. Obviously it works with i2v, but is he training an i2v specific version that improves quality?
>>105617308
>Obviously it works with i2v, but is he training an i2v specific version that improves quality?
I guess so, and let's not forget that it's an unofficial release and kijai extracted it as a lora. even with all those "what ifs" the quality is still excellent, the self forcing thing is the real deal, no doubt about it
>>105617308it works amazingly well despite being a t2v lora, which is pretty cool. now imagine a proper i2v one. outputs are already great so far, and super fast.
>>105617250
>fast i2v is now feasible.
I expected a small model to get the same quality as Wan, but now that going for a few steps keeps the quality, I think the future is big models that work in a few steps. took some time but they finally figured it out
>>105617250>open source has done it,we're so fucking back!
https://youtu.be/OATUEO0PxLQ?t=40
Are there models or tools designed specifically for very low resolution sprite sheet/sprite work (64x, 32x etc) like designing tiles, item sprites and things like that?
For archiving purposes, this was the split thread
>>105611096 >>105611096 >>105611096
For archiving purposes, this was the split thread
>>105617413Levels of back: PEAK
NAG seems to work for Chroma, but it gives a pretty brutal 30% slowdown in generation time.
>>105617481
>NAG seems to work for Chroma
how did you make it work? I got those errors
>>105616515
>>105617487
I just installed it with ComfyUI manager and restarted comfy. I just don't get why there's only one connection for conditioning instead of both positive and negative
>>105615405 (OP)
>Maintain Thread Quality
>https://rentry.org/debo
Is it fair to say that this Rentry doesn't make sense?
>>105617509
>I just don't get why there's only one connection for conditioning instead of both positive and negative
that's because it's not touching the positive calculation, only the negative one
>>105617516It's the *off site archive link* part of this.
>>105617424>rantroon spamming his safe space threadMany such avatarfags.
>>105617516My first post here btw.
>>105617481
>it gives a pretty brutal 30% slowdown in generation time.
did you notice an improvement in quality?
>>105617540*kisses your forehead* nighty night anony
>>105617481That makes no sense. NAG lets you set cfg to 1, if anything you should see a big speed increase given chroma's default cfg is like 4.5
super fast i2v memes, thanks light lora
>t2v
>still works fine for i2v
>stellar blade comes out for PC
>infinite new mods and AI tools happen
the perfect woman doesnt exis-
"asian girl turning around while in a swimsuit."
>>105617481
>NAG seems to work for Chroma
can you show a screen of your workflow, I can't make it work
What makes e4m3fn_fast use a tiny bit less VRAM than e4m3fn?
I get oom if I try to run wan 720p fp8 on e4m3fn on my 3090, but it works with e4m3fn_fast.
>>105617585
>I get oom if I try to run wan 720p fp8 on e4m3fn on my 3090, but it works with e4m3fn_fast.
go for Q8 my dude, GGUF lets you offload a bit of the model to your RAM, you won't get OOM anymore and the Q8 quality is better than fp8
https://rentry.org/wan21kjguide
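to be clear, the "offload to RAM" part isn't GGUF magic on its own, it's plain layer offloading: keep most of the transformer blocks in system RAM and stream each one to the GPU only for its own forward pass. rough illustration of the mechanism below, not what ComfyUI-GGUF actually does internally, and the module names are placeholders:

```python
import torch.nn as nn

def attach_layer_offload(blocks: nn.ModuleList, device: str = "cuda") -> None:
    """Keep blocks in CPU RAM and move each one to the GPU only while it runs.

    Illustration of layer offloading, not ComfyUI-GGUF's implementation.
    Trades PCIe transfer time for a much smaller peak VRAM footprint.
    """
    for block in blocks:
        block.to("cpu")

        def pre_hook(module, args):
            module.to(device)    # pull this block's weights onto the GPU right before use

        def post_hook(module, args, output):
            module.to("cpu")     # push them back to system RAM afterwards
            return output

        block.register_forward_pre_hook(pre_hook)
        block.register_forward_hook(post_hook)

# usage sketch: attach_layer_offload(video_model.blocks) before sampling;
# activations stay on the GPU, only the weights commute back and forth.
```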
>>105617592Cool. I've got a 12gb 3060 in the machine too, will that node let me offload to that instead of ram for extra speed?
>>105617601yeah you can definitely do that, you have to put "use_other_vram" to "true"
Tried lightx2v lora + NAG. Quality & Motion took a noticeable hit. Too big for me to consider this worth it. Nice if you want to play around with memes though.
>>105617610Nice, thanks anon
>>105617615
I also have a second 12gb vram card, but I'm using it to keep the text encoder instead. I noticed that the difference in speed wasn't that big when offloading to RAM compared to offloading to a second gpu
>>105617614it depends, for me it's super smooth
lora 1.0 strength, cfg 1, shift 8, steps 4, lcm scheduler
interpolated output from rentry workflow:
>>105617614
>Tried lightx2v lora + NAG. Quality & Motion took a noticeable hit.
for me the quality is really close to the real Wan, and let's not forget that:
- it's an unofficial implementation (it's likely the official one will get way better quality)
- KJ extracted it into the form of a lora, what if he made a DoRA instead (better quality)
- it's for T2V but works surprisingly well on I2V, what if the official implementation also gets a specialized I2V version?
in other words, we are far from reaching the full potential of self forcing and I'm already impressed as fuck, this shit is the real deal
>>105617640
how do you add in NAG? I've only used the lora so far for stuff like
>>105617631
>>105617626Thanks, I'll experiment then. I just run text encoder on cpu usually, since I don't change the prompt often so it's not a big deal to wait 15 seconds on the first gen for it to run it on cpu
>>105617644>how do you add in NAG?like this
>>105617196
infinite meme potential cause it's fast now. like 70 seconds.
>>105617631Being smooth isn't the problem. For example, for an image I used, a portion of the guy started morphing into a face and it never did that before out of the 50x gens I did without it.
Then again I'm doing strictly I2V. Perhaps I'll do more tests first to make sure it wasn't just a bad seed.
>>105617651kek I know the luminosity decrease wasn't intended, but that makes a cool effect with that context
>>105617648are you supposed to crank up nag_scale?
>>105617668I didn't touch the default settings, it works fine that way
>>105617465I need to see if I can make SLG work, I'd like to get rid of the distortions on fast movements
On a 3090, it gave me a 720p gen in 5 minutes vs 45-50 minutes with it off. At 480p, the visual quality drop is noticeable compared to a normal gen (more aliasing, for one), but on 720p, it looks pretty damn close to a normal gen.
The motion does take a hit. I don't know how much, but of the couple of gens I've done, I've noticed they're a little more stiff than a normal gen. I'd have to do side by sides to see how much though.
Still... pretty fucking impressive, all round.
>>105617671also when genning at non default size (600x480 vs 832x480) it seems faster, I just scale one side if the original image would be cropped poorly with the default size.
a new era of videoslop is about to descend upon this general
>>105617717just wait until the SDXL fags are able to use it
>>105617705>Still... pretty fucking impressive, all round.and it's just the begining, we haven't reached the peak yet
>>105617640
blue hair anime girl rei ayanami holds up a sign saying "BEST GIRL".
interesting results
outputs are a lot more stiff compared to regular wan, it's biased towards slow motion too
aww
two anime girls hug each other.
>>105617738
>it's biased towards slow motion too
Yeah, noticed that too. Hopefully that can be fixed with a proper implementation.
wait what wan is fast now?
>>105617756yes, like just over a minute for a video now. new lora is super fast.
>>105617186Is there any documentation on what this is exactly and how he achieved it?
>>105617742
not a big fan of the luminosity change, do you have NAG on top of your workflow?
>wan 480p i2v now generates a 5 second video faster than Chroma generates 1 image
seems silly, flux architecture must be way slower than it needs to be
>>105617763I assume it's like previous turbo loras but in this case, it's not just faster but the quality is much better. Causvid for me was faster but the quality was pretty bad. This is MUCH better.
>>105617765
no NAG yet, I just started trying it. but most gens don't have the luminosity shift
>>105617763>Is there any documentation on what this is exactlyof course
https://chendaryen.github.io/NAG.github.io/
>not a single mention of nvidia cosmos
meme general
>>105617763
It's basically this, extracted into a LoRA:
>https://self-forcing.github.io/
Combined with NAG:
>https://chendaryen.github.io/NAG.github.io/
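for the anons asking what NAG actually does: with the distill lora you run cfg 1, so there's no second unconditional pass for classic CFG to push away from, and the negative prompt becomes a no-op. NAG applies the negative prompt inside the attention layers instead and renormalizes the result so the features don't blow up. very rough sketch of my reading of the paper page above; the exact norms, constants and defaults live in the repos, everything below is simplified for illustration:

```python
import torch

def cfg_combine(eps_cond, eps_uncond, scale):
    # classic classifier-free guidance: needs that second negative/unconditional
    # model pass, which cfg=1 distilled models no longer run
    return eps_uncond + scale * (eps_cond - eps_uncond)

def nag_attention_combine(z_pos, z_neg, scale=4.0, tau=2.5, alpha=0.25):
    """Simplified NAG-style guidance applied to attention outputs.

    z_pos / z_neg: attention outputs computed with the positive and the negative prompt.
    Extrapolate away from the negative features, cap how much the norm can grow,
    then blend back toward the positive branch. Constants here are illustrative.
    """
    z_ext = z_pos + scale * (z_pos - z_neg)                  # push away from the negative prompt
    ratio = z_ext.norm(p=1, dim=-1, keepdim=True) / (
        z_pos.norm(p=1, dim=-1, keepdim=True) + 1e-8
    )
    z_ext = z_ext * (torch.clamp(ratio, max=tau) / ratio)    # keep feature magnitudes in check
    return alpha * z_ext + (1 - alpha) * z_pos               # mild blend back toward positive
```
which also lines up with the node only exposing one conditioning input: the positive path is left untouched, and the negative prompt only ever enters through this attention-side term.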
>>105617777
>not a single mention of shitvidia shitmos
Gee, I wonder why that is
>>105617777a bunch of us here tried it and posted about it the day comfy implemented it, anon
consensus was it's ultraslopped
>>105617782make asuka bite the sign
>>105617717it's only just begun
tested on 720p. it's not bad, but the stiffness makes it a no go for me. they need to improve that.
the only thing holding wan back, was gen time. with teacache it was still pretty fast, 400-500 seconds for a meme isn't bad for free.
now it's super fast. faster than 50 step chroma.
>>105617793noodle shit freedom when?
>>105617793Nobody cares about your shit UI when you shill non stop and have your disabled attack dog siege the general for weeks non stop.
Also for the animation anon you have been getting mogged by /ldg/ anons well before you rage quit it for being shit.
Fine, I give up, what is a good inpainting workflow for ComfyUI?
I want to
>Change clothes for other clothes
>Edit clothes, materials and colours
>Fix anatomy mistakes
I want to maintain the artstyle of my images.
I'm not used to comfy but whatever, I want good inpaintings.
>>105617693SLG seems to change the output, but it doesn't have that special effect like it had on regular Wan, maybe there's another layer that has to be targeted this time Idk
heh
>>105617807and even cuter sign:
>>105617806
>dog siege
those are cat ears
>>105617829Why do you even post here?
>>105617837I locally diffused some images, which is the particular topic of this image posting thread
>>105617819
>Selected blocks to skip uncond on: [9]
that's what the console says. since there's no uncond anymore with CFG (we're at CFG 1), maybe that's why SLG has no effect, but the uncond is still used by NAG, so... there should be an SLG targeting NAG or something?
>SAY WHAT AGAIN, MOTHERFUCKER
>Jullien claims for years they will save anime.
>Doxes themselves multiple times for reasons unknown.
>Instead releases a lacklustre wrapper that doesn't compile.
lol, lamo
>>105617803idk. still trying to juggle the plugin crap. I would like to ape how gernov is doing his server so I have to make sure plugins don't pull in a bunch of imgui crap. what will really be good is applying dlss to world model simulations since that's who I've been talking to lately. that and I keep getting the itch to go to llms but I'll keep at diffusion and the main application for now
>>105617757>>105617758can someone pls post a workflow for this?
>>105617856this is really clean desu
kek
blue hair anime girl dives into a swimming pool at the beach.
>>105617889the gens turn out good and the rentry workflow has interpolation + original video. so the interpolated outputs can look very smooth.
>>105617878
>does nothing but bitch, moan and complain
>extremely jealous of an anon that actually takes action when he doesn't like the uncomfyui experience
>instead of being supportive you piss, shit and cry every time he is around
who unironically lives like this?
>schizo hates comfyui
>schizo hates the wip alternative
what doesn't he hate? just gradio?
unexpected but neat output
blue hair anime girl flies high into the sky like superman.
>>105617886https://rentry.org/wan21kjguide/edit#lightx2v-nag-huge-speed-increase
>>105617903so far he only posted dalle and o1 gens so I don't even think he can gen locally
>>105617881
>applying dlss to world model simulations since that's who I've been talking to lately
how the fuck do you network so much?
yes cysthar your samefagging isn't obvious, don't worry
>>105617890
i thought she was gonna bonk her head on the railing. I was prepared to laugh, but didn't
>>105617929
he thinks if he spergs hard enough, he'll be able to manifest a safe space
>>105617913ty gonna try with the nag node, all my rei gens have been just with the lora at 1.0.
any NAG implementations yet for models other than Wan?
>>105617936sadly no, no implementation in any UI for image models yet
an anon here tried having Claude do it based on the paper but it didn't seem to work
>>105617931first test: asuka with a plush doll
>>105617936there is a chroma one floating around but it doesn't really give much of a boost. better than nothing.
>>105617943I lost it when the doll opened its mouth
>>105617936there's one for chroma but I can't make it work
>>105616265
>>105617943another doll, but interpolated:
this lora + nag is magic, and I thought teacache was magic before that. open source is amazing anons
Anyone tried it with RifleXRoPE? 3 extra seconds, 8 seconds in total instead of 5. It really inflated gen time before, but with this lightnag thing, it shouldn't be a problem
>waiting for that kontext model to play with
>this stuff comes out
>i2v is literally faster than high step flux gens
amazing.
okay but it'll never run on a turbo vramlet setup
can anon animate more doros please
>>105617315>>105617319
I seem to remember an "exe to install everything you need with one click" app named Pinokio or something like that
The site was pinokio.computer but it appears to be down
Anyone know what happened with that?
>>105617897 >>105617929
You faggots hate this thread and are so buck broken you have to post here, samefag, and jerk each other off while shilling. It's pathetic, which is why nobody wants to help you, ani. look at the company you keep: backing and using debo for your gay shilling campaigns is one of the many reasons why anons hate you.
You just had to be normal, ani. you didn't have to do half the retarded shit you have done, and you might have had people actually willing to help you instead of begging and seething here.
>>105617959
>open source is amazing anons
China is amazing, they're the ones saving us again and again lmao
https://self-forcing.github.io/
>>105617913>>105617902thanks bros, I'm high as fuck and can barely read.
also HOLY shit is it fast on my 4090
>>105617995this one is snake oil. too much degradation. china fails us as much as they win for us
>>105617992the one and only thing more pathetic than ani is the anon who thinks anyone he disagrees is ani and spergs out about ani constantly
What do you think, people?
Have they learned their lesson or do they need more?
>>105617997>this one is snake oilnuh uh, it's already working great and it's just getting started
>>105617640
>>105617995
yes, china cares about results while scam altman cares about making people pay $1000 a month for 3 prompts on scamAI
>>105618003
>no vid
snake oil salesman. see
>>105617250
>>105617988>pinokio*vomiting noises*
>>105617846
kek, I tried skipping everything and nothing changed, SLG is no more in this new paradigm
>>105618015nuh uh, China has won, keep seething
>>105617465>>105617413
>>105618001Ani and friends will never learn because they suffer from severe mental illnesses which is why Ani is a pariah and can't get help and needs to beg here of all places.
What's even more pathetic is that they like to project all of the bullshit they got caught doing here and wonder why nothing they try succeeds. instead of learning their lesson, they cope by saying that one person can magically be online at all times of the day to prevent them from getting what they want.
>>105618003rentrybro added it to the rentry and he never adds snakeoil
>>105617995I tried to join Chinese AI groups to follow their frontier AI research but you need a Chinese phone number to make a QQ account it seems, it's over
>>105617529Is this like the realization of some mid to late 30s guy that he is no longer the main demographic of 4channel
>>105618040
>rentrybro added it to the rentry and he never adds snakeoil
even KJ God has marked this technique with his seal of approval
https://www.reddit.com/r/StableDiffusion/comments/1lcz7ij/comment/my4nuq2/?utm_source=share&utm_medium=web3x&utm_name=web3xcss&utm_term=1&utm_content=share_button
>They have truly had big impact on the Wan scene with first properly working distillation, this one (imo) best so far.
>>105617529the screenshot in this post is 100% accurate
>>105618061
You're totally reliant on whoever made the package to keep it updated for you, whereas if you do things yourself with self-contained environments/python venvs, it's easy to keep everything up to date and cutting edge
>>105618059KJ will do anything to chase the high of being top dog before Cumfart put him in his place
>>105617997it's not snake oil. the degradation is really only in the motion. As others said, you will get stiff/slow motion from it. Can be useful if you don't need fast movements.
Should probably put a disclaimer somewhere though warning people.
It's faster but there's a pretty obvious loss in quality. Still better than what we had before, I think I'm still gonna stick to full WAN most of the time.
Left is without NAG + self-forcing lora, right is with. left gen took about 12 minutes, right took 2.
>>105618094what if you increase the steps a bit, try to go for 6 steps
https://www.reddit.com/r/StableDiffusion/comments/1lcz7ij/comment/my63d8a/?utm_source=share&utm_medium=web3x&utm_name=web3xcss&utm_term=1&utm_content=share_button
>From my testing, for i2v, the shift should be lowered a lot for self forcing. Even a shift of 1 was fine with 8 steps. Otherwise the source image is changed too much.
chat is it true?
everyone I've seen using self forcing had really bad artifacts. can you point to a quality video and I'll believe you?
>>105617995>China is amazingYes
this shit is so mindblowingly fast
I AM OOOPSCALING
the video gen - image gen speed gap is closing...
so what is the NAG node/stuff doing? how does it work in conjunction with the speed lora? the lora worked fine on its own, curious how it affects outputs.
>>105617982will try, sec
>>105618119flux is too shit of a model to keep us held back. chinabros will demolish it soon I hope
>>105618124got a super cute doro, not a backflip but still neat.
>>105618130so slowwwwwwwwww
>>105618129>chinabros will demolish it soon I hopelet's hope so, I also want a Wan revolution on imagegen
>>105618134
can you add (running, speed-lines)
>>105618134pink hair anime dog jumps high into the air. the camera pans out to show her high in the sky.
>>105617968It works fine with RifleXRoPE. I pretty much use rope all the time now
>>105618138smoother doro:
I got a friend who has a 1050Ti and I just got him ComfyUI,
>>105618142at 129 frames? first try seemed to cause ghosting and looping
>>105617999not really. most anons here are more pathetic than ani. ani actually contributes unlike the rest of you bums
>>105618152Can pre-rtx hardware even run it?
>>105618157it's almost igpu levels of pain
>>105618149what did you change for this one? just the seed?
cute!
we are in a new age of i2v as of today.
https://rentry.org/wan21kjguide/#lightx2v-nag-huge-speed-increase
>>105618104
what does shift even do
>>105618165it's how shifty the video is :^)
>>105618163just another gen, seed is set to randomize.
good doro posting density
>>105618155
>most anons here are more pathetic than ani. ani actually contributes unlike the rest of you bums
You have no idea what I do. I could be a cancer researcher for all you know.
>>105618165it raises the sigma a bit, it's supposed to be a method to improve your result when you have few steps, but if the value is too big you can get some burn out of it, like setting the cfg too high
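to put a number on "raises the sigma a bit": as far as I know this is the SD3/Flux-style timestep shift that ComfyUI applies, it remaps every sigma toward the noisier end of the schedule. small sketch of my understanding, check ModelSamplingSD3 in the comfy source for the exact code:

```python
def shift_sigma(sigma: float, shift: float) -> float:
    """SD3/Flux-style timestep shift as I understand ComfyUI applies it.

    sigma is in [0, 1]; shift > 1 pushes the whole schedule toward higher noise,
    which is why cranking it too far can "burn" the output like an overcooked CFG.
    """
    return shift * sigma / (1 + (shift - 1) * sigma)

# the thread's shift=8 turns a mid-schedule sigma of 0.5 into ~0.89,
# so even a 4-step run spends most of its steps at high noise levels
print(shift_sigma(0.5, 8))  # ~0.889
```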
remember when wan videos took 15 min?
>>105618104been meaning to ask about shift. i couldn't seem to get a cup to fall over without lowering shift to 1, which caused other issues.
caveat, was bouncing between causvid and other speedup loras on 3060 so maybe something else was contributing.
>>105618178they still do if you want maximum quality.
>>105618178like it was yesterday
>>105618181
>was bouncing between causvid and other speedup loras on 3060 so maybe something else was contributing.
self forcing is supposed to work alone, don't associate it with other distilled loras
>>105618155All night I've been getting an error whenever I try to upload images, so I've given up and have accepted my fate as a filthy nogen, at least for now
>comfy C in tabs changes fill based on the gen progress
nice touch comfy anon.
>>105618171is that why you cause cancer around these threads?
doro to the moon!
pink hair anime dog flies very high into the night sky filled with stars and the moon. she leaves a rocket trail behind.
>>105618178
>remember when wan videos took 15 min?
I don't miss that time at all, I welcome this new era with open arms
>>105618190thought it was just me
>>105618192the logo itself is putrid tho
>>105618197
>pink hair anime dog barfs up a chew toy then eats it again
pls
>2 minutes for a 5 sec video on a 3070 ti
vramlet bros are winning
>>105618230reminds me: that magcache node cut chroma gen times in half. must be instantaneous on a fast card.
>>105618230it gave him the mouth of ben affleck's batman
>>105618238didn't it have huge quality loss though?
>>105618183this
it's just the vramlets hypemaxxing this grainy stiff shit
>>105618239
>C'mon Karl, keep pumping those videos, I have a second boat to buy
Truly the king of kong!
did anyone tell >>>/gif/vdg yet
>>105618240helmet is wrong too
we are in a different world today than yesterday.
interwebs died. going to goon before bed. have fun tonight anons!
It's nice. The motion is super smooth though. Like a slopped version of AI video. Wasn't causvid better than this?
>>105615474yes i found some on civitai. some plain realistic women.
>>105618258kill yourself julien
>>105617823dil' pujp_ guper gay fandom still going strong