>>507810933
>FACIAL RECOGNITION CAMERAS CAN’T SEE THROUGH BALACLAVAS
Wanna know something fucked?
Many dyes are completely transparent to IR, which means that a lot of clothing is entirely transparent to IR, but you'd never be able to know without checking with an IR camera.
https://youtu.be/trRHTPnZh0M?t=206
Standard CMOS and (now-rare) CCD sensors can see a little way into the near-IR range, and the only reason they don't in consumer devices is that manufacturers intentionally put an IR-cut filter in front of the sensor.
Military drones could use normal, consumer-grade sensors, just without the IR filter, and could very well see through a blacked-out face mask, as long as the facial recognition model was trained on wider-spectrum source material. Of course, specialized IR sensors only cost a few bucks these days, and there are also full-spectrum sensors that pick up everything from IR through UV inclusive.
Don't trust that something that looks opaque to your eyes is also opaque to a machine.
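You can check this at home with any webcam and a TV remote (most remotes emit near-IR that consumer IR-cut filters only partially block). A minimal sketch of the measurement — the brightness-delta helper is plain numpy, and the demo below feeds it synthetic frames; for a real test you'd capture two frames with something like OpenCV's `cv2.VideoCapture`, one with the remote off and one with a button held down at the lens:

```python
import numpy as np

def ir_delta(frame_off: np.ndarray, frame_on: np.ndarray) -> float:
    """Mean brightness increase between a frame with the IR source off
    and a frame with it on. Anything clearly above sensor noise
    (roughly 1-2 units on an 8-bit scale) suggests the sensor is
    picking up near-IR despite the filter."""
    return float(frame_on.astype(np.float64).mean()
                 - frame_off.astype(np.float64).mean())

# Synthetic demo: a frame that brightens by 10 units once the IR source is on.
off = np.zeros((480, 640), dtype=np.uint8)
on = np.full((480, 640), 10, dtype=np.uint8)
print(ir_delta(off, on))  # 10.0
```

With real captures, a visibly bright purple/white spot where the remote's emitter is means your camera sees IR just fine.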
>>507811427
The basis of all modern LLMs is a text completion model. Literally just highly advanced autocomplete. Public-facing models always have additional training and/or content filters put in front of the text completion, but the "soul" of the model is always just text completion. "Stochastic parrot" is an apt term I've heard used.
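The "advanced autocomplete" point is easy to demo at toy scale. A real LLM predicts the next token with a neural net over a huge context window; this sketch does the same job with a dumb bigram table over single words (everything here is made up for illustration, it just shows the completion loop):

```python
from collections import Counter, defaultdict

def train_bigrams(text: str) -> dict:
    """Count which word follows which: the dumbest possible 'language model'."""
    words = text.split()
    table = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        table[prev][nxt] += 1
    return table

def complete(table: dict, prompt: str, n: int = 5) -> str:
    """Greedy completion: repeatedly append the most likely next word."""
    out = prompt.split()
    for _ in range(n):
        followers = table.get(out[-1])
        if not followers:
            break  # never seen this word lead anywhere; stop
        out.append(followers.most_common(1)[0][0])
    return " ".join(out)

model = train_bigrams("the cat sat on the mat the cat ate the fish")
print(complete(model, "the cat", 3))  # the cat sat on the
```

An actual LLM replaces the lookup table with a transformer and the greedy pick with smarter sampling, but the loop — predict next token, append, repeat — is the same.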
The best way IMO is really just to download some foundational model your computer hardware can run and try playing around with it yourself.
I have no idea what sort of hardware you have access to, but a good place to start might be with a small model like this:
https://huggingface.co/meta-llama/Llama-3.2-3B
Using llama.cpp, any vaguely modern machine could run that on a CPU at bearable speeds, using e.g. a quant like this one:
https://huggingface.co/neopolita/llama-3.2-3b-gguf
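Rough shape of the CLI workflow, treating it as a sketch: the exact .gguf filename inside that repo is a guess (check the repo's file list), and `llama-cli` is the binary that ships with a llama.cpp build or release:

```shell
# Grab a ~Q4 quant from the repo above (filename pattern is an assumption)
huggingface-cli download neopolita/llama-3.2-3b-gguf \
  --include "*q4_k_m*.gguf" --local-dir ./models

# Run it on CPU: -m model path, -p prompt, -n tokens to generate, -t threads
./llama-cli -m ./models/<the .gguf file> -p "Once upon a time" -n 128 -t 4
```

Q4_K_M is a decent default quant for a 3B model; smaller quants get dumber fast.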
Maybe use something like a free Google Colab instance if your hardware really sucks.
See >>>/g/lmg
This shit moves so fast I'm always a little behind.