
Thread 105616701

317 posts 52 images /g/
Anonymous No.105616701 >>105616740 >>105616880 >>105617037 >>105617883 >>105618110 >>105618342 >>105618402 >>105618463 >>105618468 >>105618478 >>105618864 >>105619106 >>105619126 >>105619709 >>105620232 >>105620302 >>105620312 >>105620329 >>105622150 >>105623447 >>105631845
Do crabfags really?
Anonymous No.105616740 >>105616745 >>105619113 >>105622414 >>105627465 >>105632235
>>105616701 (OP)
Now how many in C99?
Anonymous No.105616745 >>105616751 >>105616837 >>105620264
>>105616740
None. You have to write it from scratch.
Anonymous No.105616751 >>105616797 >>105623187 >>105623394 >>105623394 >>105625015
>>105616745
Will it be memory safe?
Anonymous No.105616797
>>105616751
that's up to you
Anonymous No.105616837 >>105616873 >>105616994 >>105618870 >>105623394 >>105623554 >>105631772 >>105631985
>>105616745
Why do cniles LOVE wasting time reinventing the wheel? They wake up and jerk it to their 14838th revision of their personal implementation of a bubble sort that is 1 ten thousandth of a millisecond faster than John Doe's on GitHub. Meanwhile the software they've written it for remains in a dusty pile of unfinished projects. Would it kill you to stop goofing off and finish something?
Anonymous No.105616873 >>105617825
>>105616837
It's in curl, which comes with the os on any decent system. So the correct answer to anons question is 0 deps.
Anonymous No.105616880 >>105619066
>>105616701 (OP)
to be fair URLs have become a complete mess, I wouldn't be surprised if the relevant ISO documents were thousands of pages long, just look at how they encoded unicode with punycode or whatever
Anonymous No.105616887 >>105617037 >>105622045 >>105622058
in JavaScript I merely type new URL(url) and it is parsed
Anonymous No.105616983
It's that simple, huh?
#include <curl/curl.h>

CURLU *h = curl_url();
CURLUcode rc = curl_url_set(h, CURLUPART_URL,
    "https://example.com:449/foo/bar?name=moo", 0);
curl_url_cleanup(h);
Anonymous No.105616994 >>105617850 >>105617883
>>105616837
there's a happy medium between 44 dependencies and doing it yourself
Anonymous No.105617037 >>105618468
>>105616701 (OP)
>>105616887
If URL/path parsing isn't part of your language's standard library at this point, it's either old as fuck, not general-purpose, or a meme language.
Anonymous No.105617825 >>105618085
>>105616873
>comes with the os
Then the correct answer is find /usr/lib -type f | wc -l

Maybe plus same thing but on /usr/include to account for header-only deps
Anonymous No.105617850
>>105616994
There's a happy medium between 1 dependency and doing it yourself. It's 44 dependencies.
Anonymous No.105617883 >>105618070 >>105622874
>>105616701 (OP)
Cargo is just as bad as Node.
Rustfags copied the microdependency model, also the async cancer from JS.
>>105616994
The C world has plenty of general purpose libraries that have everything you need and aren't split into microdependencies.
Anonymous No.105617910 >>105618135 >>105618478 >>105621658
2 cli file browsers with nearly identical functionality. one in c and one in rust.
Anonymous No.105618070 >>105618120
>>105617883
You don't understand why npm is bad
Anonymous No.105618085
>>105617825
I assume you'll be adding that to the dep count for every rust program as well
Anonymous No.105618110 >>105618377
>>105616701 (OP)
my first rust program was one designed to download images from an api. it needed 139 dependencies and took upwards of 50 seconds to link each time. never again
Anonymous No.105618120 >>105618373
>>105618070
Someone leftpad this anon.
Anonymous No.105618135 >>105621658
>>105617910
i found yazi but can you link me to lf?
Anonymous No.105618342
>>105616701 (OP)
Trust the plan, sister.

>t.Q
Anonymous No.105618373 >>105618452 >>105622741
>>105618120
Case in point. You don't understand how leftpad happened in npm and how it couldn't in cargo
Anonymous No.105618377 >>105618398
>>105618110
>*I needed 139 dependencies
Anonymous No.105618398
>>105618377
ok retard
Anonymous No.105618402
>>105616701 (OP)
Oh god, this is worse than NPM.
Anonymous No.105618452 >>105618488
>>105618373
you're saying that it's mechanically impossible for someone to upload some bullshit library that runs into a cascade of laziness until every important library is forced to have it for some reason or another?
Anonymous No.105618463
>>105616701 (OP)
In Python this is just
>from urllib.parse import urlparse
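For what it's worth, a minimal sketch of what that stdlib call actually gives you (using the same example URL as the curl post above):

```python
from urllib.parse import urlparse

# Parse the same example URL the curl post uses.
u = urlparse("https://example.com:449/foo/bar?name=moo")
assert u.scheme == "https"
assert u.hostname == "example.com"
assert u.port == 449
assert u.path == "/foo/bar"
assert u.query == "name=moo"
```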
Anonymous No.105618468 >>105618489
>>105617037
>>105616701 (OP)
You don't have to have a batteries included standard library but your language should at least have relevant first-party libraries which are independent from the standard library.

languages like crablang are meant to be able to run in no-std environments, and there's an argument to be made that we should not obligate the distribution of batteries that might not be used in 10 years.
Anonymous No.105618478
>>105616701 (OP)
nothing wrong with recursive abstraction so long as the code is minimal and audited

>>105617910
>400x bigger
yeah, unjustifiable
Anonymous No.105618488 >>105618518
>>105618452
No, I am saying you do not understand how it happened in npm and how it couldn't in cargo. You also don't understand the difference between sufficient and necessary conditions
Hint: more was needed than "packages had dependencies"
Anonymous No.105618489
>>105618468
For example, Python has http, urllib, requests, httpx... I'm sure there will be more as time goes on.
I wouldn't want something as opinionated as the API of an HTTP library to take a language enhancement proposal to modify.
Anonymous No.105618512 >>105618527
microdependencies are good tbqh
I break out everything possible into mini libs. I'll write a lib with just 50LOC.
user auth that integrates with a main application's DB? its own lib with its own migrations that get incorporated into the main application's migrations. that incorporation logic? another fuckin lib.
Anonymous No.105618518 >>105618531 >>105618692
>>105618488
I'm not the same person, dumbass.
All I'm seeing is retards talking past each other just because one can't understand supply chain security and the other can't recognize developer ergonomics so you just pretend that you're both arguing about whatever's convenient for you at the time
Anonymous No.105618527
>>105618512
based clean coder
you are irreplaceable
Anonymous No.105618531 >>105618557
>>105618518
supply chain security involves personally vetting your applications or relying on a 3rd party to do that vetting for you, then vendoring all of your vetted shit. see `cargo help vendor`
Anonymous No.105618557 >>105618579 >>105618793
>>105618531
so your grand master plan to handle 44 dependencies to parse a string is to get someone else to read the code and perform background checks on the authors of each one?
You could have the resources of IBM and not be able to do that for any non-trivial application
Anonymous No.105618579 >>105618610
>>105618557
welcome to real life nigga
Anonymous No.105618610 >>105618639 >>105618671 >>105618793 >>105622834
>>105618579
in other words you don't have a real strategy and crablang's attitude towards basic shit like not making more work for other people is not encouraging.
Anonymous No.105618639 >>105618702
>>105618610
do you have a real strategy instead? if so, go ahead and point it out, because until you do, code to do things must exist in some place. just because you import 1 lib that does a bunch of different shit does not mean you have reduced your attack surface over importing 30 libs that each are dedicated to one thing.
Anonymous No.105618671 >>105618831
>>105618610
>16+GB of RAM to compile our compiler
>our compiler doesn't work on anything but AMD64
>Need support for other architectures? Too bad. We only support AMD64 and ARM
>We're safer than C! Don't worry about the random code being pulled in from all over the internet at compile time. It's safe we promise
>Stability? What's that?
>We support all these other platforms (note: please don't read fine print where we tell you they're 'tier 2' and don't even work when cross-compiled).
>Want to compile Hello World? That'll be 12GB of RAM and an hour of your time.
>Want to compile actual application? That'll be 25GB of RAM and three hours of your time.
>Do like it? Rent a server farm lmao
>We're going to save FOSS. Every big tech company has at least 2 people sitting on our foundation's board of trustees
>Original software? What's that? We're going to re-write grep in Rust then shame you for not using it
>Please add 3 lines of Rust to your existing project so we can claim people are using it to solve real world problems
>Making 100s of millions of systems obsolete is no big deal. Get with the times gramps
Anonymous No.105618692 >>105620403
>>105618518
I disagree. I originally responded to this.
>Cargo is just as bad as Node.
>Rustfags copied the microdependency model, also the async cancer from JS.
And my position remains the same, that it's not just as bad as node and they only think that because they do not understand why node was bad, so they projected it onto something they understand better - microdependencies. Cargo cult behaviour
Anonymous No.105618702 >>105618766 >>105618851
>>105618639
Your attack surface goes down dramatically when you only use first party libraries.
Is url a first party library just because it's developed by the Servo team? I don't know. Mozilla might not be around forever. That doesn't sound like a promise to me.
Anonymous No.105618733
It's impossible to compile the Rust compiler.
Try it.
Just trust our binaries, goy.
Anonymous No.105618734
>cniles don't understand that ascii is only enough if you are a subhuman
https://www.unicode.org/reports/tr46/
>Initially, domain names were restricted to ASCII characters. A system was introduced in 2003 for internationalized domain names (IDN). This system is called Internationalizing Domain Names for Applications, or IDNA2003 for short. This mechanism supports IDNs by means of a client software transformation into a format known as Punycode. A revision of IDNA was approved in 2010 (IDNA2008). This revision has a number of incompatibilities with IDNA2003.
> The incompatibilities force implementers of client software, such as browsers and emailers, to face difficult choices during the transition period as registries shift from IDNA2003 to IDNA2008. This document specifies a mechanism that minimizes the impact of this transition for client software, allowing client software to access domains that are valid under either system.
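As a concrete illustration of the transformation the quote describes, here's a sketch using Python's built-in "idna" codec (note it implements IDNA2003, not the 2008 revision; the mismatch between the two is exactly the transition pain the quoted report is about):

```python
# Round trip through Python's built-in "idna" codec (IDNA2003).
# A non-ASCII label is encoded to an ASCII "xn--" punycode label
# before it ever hits DNS.
ascii_host = "bücher.example".encode("idna")
assert ascii_host == b"xn--bcher-kva.example"
assert ascii_host.decode("idna") == "bücher.example"
```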
Anonymous No.105618766
>>105618702
I don't get the point of this thread. Literally nothing prevents (you) from writing everything from scratch cnile
Anonymous No.105618793 >>105618891
>>105618557
>>105618610
The amount of units code has been separated into has no impact on its amount. You don't get more pie by slicing it into more pieces.
However, code is functionality, it serves a purpose. If two larger, complex functionalities each contain within themselves a smaller functionality that serves the same purpose in both, then it can be separated out and reused. This is a net reduction in the amount of code.
You can see this pattern everywhere in programming, this is absolute bedrock. This is why cniles here get called nocoders, because they fundamentally do not understand what a function is and what it's for. And you can either be taught it by getting a proper comprehensive education instead of "learn to code" shit, or by intuiting it after writing enough code.
Anonymous No.105618831 >>105618939
>>105618671
>you don't support these platforms!
>we do, at tier 2
>it should be tier 1!
>the difference between tier 1 and 2 is dedicated hardware to run test suites on. How important is tier 1 for you on this platform, in dollar amount?
>woah woah, calm down, it's not THAT important to me
The irony here is that 95% of freetard support claims are at best what would be considered tier 2 by rust, but they simply have no tier 1 equivalent, which you don't understand. So you compare one project's best tier of support with another's best tier without understanding these bests are not the same
Anonymous No.105618851 >>105618891
>>105618702
Your attack surface goes down when you import less code, import from fewer places, or import code from fewer authors. That's it.
Prince Evropa No.105618864
>>105616701 (OP)
What are dependencies?
Prince Evropa No.105618870
>>105616837
I read this in the voice of a man who has psychotic delusions of being a woman.
Anonymous No.105618891 >>105618947 >>105619025
>>105618851
you didn't say anything different from me.

>>105618793
Nobody's talking about or cares about code reuse, sperg. They're talking about the bureaucratic overhead induced by dependencies. Supply chains.
Have you never been subjected to repo scans in your life? Who's the cnile now?
Every single dependency is an opportunity for an event outside of your control to shut you down.
Anonymous No.105618939
>>105618831
You will never be part of OpenBSD
Anonymous No.105618947 >>105618956 >>105618974
>>105618891
I gave an exhaustive list of things impacting attack surface. You gave a shitty incorrect criteria you yourself can't even define.
A bundle of 1000 dependencies amounting to 10k locs made by 1 non-first-party (whatever that means) dev downloaded from their own site
is less attack surface than 2 dependencies amounting to 11k locs made by 2 first-party developers (whatever that means) downloaded from 2 different sites
by my every metric. It is more by your metric though. So yes, I did say something different.
Anonymous No.105618956 >>105619036
>>105618947
>downloading random code you don't audit from the internet at compile time is actually a security feature!
Anonymous No.105618974 >>105618985 >>105619054
>>105618947
first party means made by the crablang maintainers
It doesn't have to be in the standard library
Literally all I asked for was a promise that libraries for basic shit, their dependencies, and their developers undergo a similar level of scrutiny that the actual language does.

You said a bunch of word salad because you're angry and you feel like you need to make a point, but all I'm trying to do here is get across that you're really just not trying to understand people at all.
Crablang users wonder why everyone hates crablang when their evangelists act like blender fanboys prior to 2.8.
Anonymous No.105618985 >>105618993
>>105618974
also no need to reply I'll be dumpstering this designated shitting thread since I can't expect civilized behavior from zealots
Anonymous No.105618993
>>105618985
>lose argument
>leave
>except I can't leave without having the last word
Anonymous No.105619025
>>105618891
Dependencies are code reuse, retard. Look at OP's pic, on the right every line ending in "(*)" is a reused dependency, reused code.
>They're talking about the bureaucratic overhead induced by dependencies
DEPENDENCIES. Not MICROdependencies. See how these words are of different length? That's because they are different words.
You can have a problem with dependencies, but that is not what you were discussing. You were discussing microdependencies. And don't extrapolate every problem with having a dependency as being present in the same quantity when having a microdependency.
Again, cutting more slices doesn't make more pie
Anonymous No.105619036
>>105618956
Anonymous No.105619054
>>105618974
Okay, now that you clarified I can for certain say your measure for attack surface is completely incorrect
Anonymous No.105619066 >>105619110 >>105620257
>>105616880
URLs aren't a "mess", although you can't just make assumptions; you have to actually read the standard. eg you can't just assume a TLD exists, the standard doesn't say every domain must have a dot
Purely parsing a URL is only 25 pages
https://www.ietf.org/rfc/rfc1738.txt

If you want to handle punycode for whatever reason, the RFC for that is 35 pages and includes a C89 implementation
https://datatracker.ietf.org/doc/html/rfc3492
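The no-dot point is easy to confirm with any stdlib parser; a quick Python illustration:

```python
from urllib.parse import urlparse

# Nothing in the spec requires a TLD, or even a dot, in the host.
u = urlparse("http://localhost:8080/index.html")
assert u.hostname == "localhost"
assert u.port == 8080
assert u.path == "/index.html"
```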
Anonymous No.105619106
>>105616701 (OP)
Most of those crates are written by Rust core contributors and many are part of the same projects. All of the icu crates are part of the same project but they're split into small crates so you can just use what you need. Real icu support is extremely hard and most languages other than Rust and Swift handle it terribly.
Anonymous No.105619110 >>105619143 >>105619254
>>105619066
>December 1994
https://url.spec.whatwg.org/
Anonymous No.105619113 >>105619127
>>105616740
imagine trying to shill your language by drawing comparisons to a C version from over 20 years ago. what a joke
Anonymous No.105619126 >>105622715
>>105616701 (OP)
Rust is just a corpo jeetlang pure and simple. It's meant to be constrained enough that the 85IQ H1Bs you hired for cheap can't bring your entire microservice webshit nightmare crashing down with a single bad commit, but unlike whatever flavour of Javashit it still has enough performance that the user doesn't need to wait 30 seconds to load 800 bytes of data (just 5 seconds instead).

Have you ever heard a single advantage of Rust that wasn't rendered pointless by simply not being a retard?
>I-IT'S MEMORY SAFE!
So is my C code. Managing memory really isn't hard.
>Y-YES IT IS BECAUSE IT'S A HUGE PROJECT!
You're working on a fucking smartphone app Ranjeet the only reason it's a huge project is because there are 8 managers for every programmer and 12 hours of meetings for every hour spent actually programming. You need 15 microservices and 500MB RAM to load 5 rounded buttons.
Anonymous No.105619127 >>105628077
>>105619113
There are no significant projects written in anything beyond C99. This is like Java developers that talk about Java 24 when 99.99999999% of Java developers are on Java 11 or lower and have no plans of changing.
Anonymous No.105619143
>>105619110
There’s a lot of Unicode stuff in there. What about URIs?
Anonymous No.105619254 >>105619277
>>105619110
It's a good thing that it's backwards compatible with the 1994 standard and that the people writing it were just wasting everyone's time
Anonymous No.105619277
>>105619254
it's a good thing that you understand backwards compatibility
Anonymous No.105619709 >>105620324 >>105624162
>>105616701 (OP)
Parsing URI is not simple. If you wrote code professionally you would know.
Anonymous No.105620232
>>105616701 (OP)
And?
What does it matter?
I'm serious.
Anonymous No.105620257
>>105619066
>only 25 pages
>only
>25
>pages
kek.

There you go.
>To achieve that thing I need to robustly do the other thing - oh, there's already a library for that - I'll use it so I won't have to do and maintain it myself.
And that's how you get to 44 dependencies.
Anonymous No.105620264 >>105621723 >>105621834 >>105622042
>>105616745
>You have to write it from scratch.
and it won't work either. you forgot that part.
Anonymous No.105620302
>>105616701 (OP)
You don't need any deps for it, you can do it in pure rust.

OP probably used a crate to do that, and this crate used others for performance reasons.
Anonymous No.105620312 >>105620356
>>105616701 (OP)
This is why I can’t take rust seriously.
Anonymous No.105620324
>>105619709
this.
when people say "URL" specification, they can't even agree on what that is:
https://www.more-magic.net/posts/an-appeal-to-whatwg-uri-spec.html
Anonymous No.105620329 >>105620936 >>105623432
>>105616701 (OP)
>cnile is lying again
Anonymous No.105620356
>>105620312
why? because you'd have to take on libidn2.so as a dependency as well?
oh, so you compile in some header-only unicode normalization tables and think you can hide that it's a dependency as well?
Anonymous No.105620403 >>105620413 >>105621579 >>105621694 >>105622283
>>105618692
Microdependencies are objectively bad, retard.
Anonymous No.105620413
>>105620403
Why?
Anonymous No.105620936
>>105620329
>cGOD is lying again
Anonymous No.105621579
>>105620403
Microdependencies are bad when they're from completely different projects by random people across the globe. Microdependencies written by core Rust maintainers that mostly belong to the same project are fine.

Why does it matter to you that the ICU crates, which all belong to the same project, are separated into smaller crates? That's objectively a good thing because you don't need to pull in a bunch of bullshit like unicode normalization tables unless you absolutely need them. It also allows some of them to work in environments that don't have an std available.
Anonymous No.105621658
>>105617910
I thought lf is written in Go, not C.
>>105618135
https://github.com/gokcehan/lf
Anonymous No.105621694 >>105621737
>>105620403
Couldn't disagree more. Reusing code is a first principle. The only issues I ever see people complain about with dependencies is the tools. Dependencies themselves are never the issues, it's always the resolver or compiler that's at fault for being bad, but not every language has that problem.
Anonymous No.105621723 >>105621741 >>105622969 >>105624096
>>105620264
HOW THE FUCK ARE YOU TOO DUMB TO WRITE AN URL PARSER?
AND YOU ARE PROUD OF IT?
ARE ALL RUST FAGS THIS STUPID?
Anonymous No.105621737 >>105621746 >>105621754 >>105622293
>>105621694
>The only issues I ever see people complain about
The most common issue i see people complaining about is those micro-dependencies being able to get hijacked or vanish.
Since you seem to have never seen that complaint, you are either a newfag or a liar.
Anonymous No.105621741 >>105621798 >>105621820
>>105621723
>nocoder thinks urls are just simple ascii strings where components are separated by slashes
dumb retard
Anonymous No.105621746
>>105621737
Sounds like a tooling problem.
Anonymous No.105621754
>>105621737
doesn't happen with cargo
Anonymous No.105621798 >>105621842 >>105622030 >>105622969 >>105624096 >>105624767 >>105628403
>>105621741
>just simple ascii strings
It is not just a simple ascii string, it is a limited subset of ASCII and non-ASCII characters are transmitted in punycode.
If you are too dumb to parse a string, WHY THE FUCK do you believe you have any say in a coding discussion? Parsing a URL isn't some unachievable task that takes years to master.
YOU ARE DUMB AND YOU ARE PROUD OF IT
Anonymous No.105621820 >>105621842
>>105621741
urls are literally ascii, they are required to be.
Anonymous No.105621821 >>105621869 >>105622969 >>105624096 >>105627675
>NOOOO, I NEED A LIBRARY TO PARSE A STRING, IT IS IMPOSSIBLE FOR ME TO DO THAT MYSELF, YOU CAN'T EXPECT FROM ME TO MASTER THAT INCREDIBLE HARD AND COMPLEX TASK OF PARSING A STRING
>IN FACT, THIS TASK IS SO HARD AND COMPLEX THAT THE LIBRARY I USE TO DO IT, ITSELF HAS TO USE A DOZEN OTHER LIBRARIES
Anonymous No.105621834
>>105620264
If anyone ever doubted that Rust fags are the dumbest retards on planet earth, this drives the point home
Anonymous No.105621842
>>105621798
>>105621820
dumb retards
Anonymous No.105621869 >>105621917
>>105621821
Just because you can do something doesn't mean it's a good idea. You'd be silly to rewrite the standard library for each project you make instead of just using the system designed to facilitate code sharing.
What's the joke supposed to be here?
Anonymous No.105621917 >>105621935 >>105621959 >>105622969 >>105624096
>>105621869
>Just because you can do something doesn't mean it's a good idea.
It IS a good idea if you only need to support a specific protocol, which is the case in 99.999% of cases.
Here you got the HTTP URL scheme from the RFC itself:

> The HTTP URL scheme is used to designate Internet resources
> accessible using HTTP (HyperText Transfer Protocol).
>
> The HTTP protocol is specified elsewhere. This specification only
> describes the syntax of HTTP URLs.
>
> An HTTP URL takes the form:
>
> http://<host>:<port>/<path>?<searchpart>
>
> where <host> and <port> are as described in Section 3.1. If :<port>
> is omitted, the port defaults to 80. No user name or password is
> allowed. <path> is an HTTP selector, and <searchpart> is a query
> string. The <path> is optional, as is the <searchpart> and its
> preceding "?". If neither <path> nor <searchpart> is present, the "/"
> may also be omitted.
>
> Within the <path> and <searchpart> components, "/", ";", "?" are
> reserved. The "/" character may be used within HTTP to designate a
> hierarchical structure.

And pro tip: This has kept backwards and forwards compatibility since its inception.
If your parser doesn't translate punycode into unicode? YOU DON'T HAVE TO GIVE A FUCK ABOUT THAT BECAUSE THEIR URLS WILL WORK ANYWAY. Why would you use a library that implements a million things if you only need one?
ESPECIALLY URL parsing is something where doing it yourself pays off, and it takes less time to implement it yourself than the time it took you to write that post.
Unless you are a complete utter idiot of course.
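If you really do only care about the http scheme quoted above, the grammar is small enough to sketch as a regexp. Illustration only, under that anon's assumptions: it deliberately ignores userinfo, IPv6 host literals, fragments, percent-decoding, and everything later specs added.

```python
import re

# Minimal split of an RFC 1738-style http URL into host, optional
# port (default 80), optional path and optional query. A sketch,
# not a compliant parser.
HTTP_URL = re.compile(
    r"^http://(?P<host>[^:/?#]+)"   # host (no userinfo, no IPv6)
    r"(?::(?P<port>\d+))?"          # optional :port
    r"(?P<path>/[^?#]*)?"           # optional path
    r"(?:\?(?P<query>[^#]*))?$"     # optional ?query
)

m = HTTP_URL.match("http://example.com:8080/foo/bar?name=moo")
assert m["host"] == "example.com"
assert int(m["port"] or 80) == 8080
assert m["path"] == "/foo/bar"
assert m["query"] == "name=moo"

# Port and path are optional, so a bare host parses too.
m = HTTP_URL.match("http://localhost")
assert m["host"] == "localhost"
assert int(m["port"] or 80) == 80
```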
Anonymous No.105621935 >>105621983
>>105621917
You sound nuts to be honest. Upset about something for reasons that aren't clear. I don't want to talk with you anymore.
Anonymous No.105621959 >>105621999 >>105622165
>>105621917
https://claroty.com/team82/research/exploiting-url-parsing-confusion
Anonymous No.105621983 >>105621989 >>105621990 >>105624096
>>105621935
I am just telling you that you are a dumb fucking idiot and i am annoyed at you being proud of that.
It is ok to be dumb.
It is ok to be naive.
But it is not ok to shill some manic stuff and pull shit out of your ass for no reason.

Calling an URL parser a "parser" is already an insult to actual parsers, like ones for JSON, markdown, XML or, god protect you, HTML5.
One of the first hobby projects i did when i learned coding over 10 years ago required some XML parsing and i did it myself just because. Meanwhile you are overwhelmed by parsing a URL while behaving as if your word matters in a judgement about languages.
Anonymous No.105621989
>>105621983
Sorry to hear that, good luck with school and whatever is troubling you.
Anonymous No.105621990 >>105622012
>>105621983
>calling a parser a parser is wrong because certain parsers don't parse enough
Anonymous No.105621999 >>105622005
>>105621959
>We examined 16 URL parsing libraries
What a surprise. It's almost as if it is not a smart thing to try to support every little fringe scheme out there.
Don't use a fucking general library if everything you encounter is an http URL.
Anonymous No.105622005 >>105622119
>>105621999
>he didn't even read it
nocoder confirmed
Anonymous No.105622012
>>105621990
By that definition a regexp is a parser. So i wrote thousands of parsers in my life.
Which is what most URL parsing is: a regexp to confirm that it is a valid URL.
Anonymous No.105622030 >>105622102
>>105621798
https://url.spec.whatwg.org/#url-code-points
>and code points in the range U+00A0 to U+10FFFD, inclusive
hmmm
Anonymous No.105622042 >>105622046
>>105620264
The entire internet was written in C and the backend mostly still is
Anonymous No.105622045
>>105616887
Based webchads win again.
Anonymous No.105622046 >>105622067
>>105622042
It shows, with the countless exploits everywhere.
Anonymous No.105622058 >>105623179
>>105616887
If you run that in Firefox, the Rust crate in OP does the parsing.
Anonymous No.105622067
>>105622046
Uh oh, you better log off then
Anonymous No.105622102 >>105622107
>>105622030
Which get percent encoded just like so many other things, so no big deal.

If you are scared of parsing a string, what do you even code? Wouldn't absolutely everything you do be too hard and overwhelming for you?
Are you capable of writing a loop on your own, without being scared of the danger of infinite loops when the conditions are wrong?
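What "percent encoded" means in practice, sketched with the Python stdlib:

```python
from urllib.parse import quote, unquote

# Characters outside the allowed URL subset travel as %XX escapes
# of their UTF-8 bytes; decoding reverses it losslessly.
assert quote("/café") == "/caf%C3%A9"
assert unquote("/caf%C3%A9") == "/café"
```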
Anonymous No.105622107 >>105622134
>>105622102
>get proven wrong
>ad hominem
typical cnile. post your url parser.
Anonymous No.105622119 >>105622129
>>105622005
That article is about idiots who use multiple URL libraries to parse the exact same thing, because they are too scared of the hard task of parsing an URL themselves.
Anonymous No.105622129
>>105622119
post your url parser.
Anonymous No.105622134
>>105622107
what got proven wrong?
Anonymous No.105622150 >>105622172 >>105622194
>>105616701 (OP)
I remember zero day attacks on some browser(chrome?) exactly because of small mistakes in URL parsing.
Are you sure your URL parsing is 100% compliant? This shit isn't easy.
Anonymous No.105622165 >>105622306 >>105622330
>>105621959
>i used a third party library for printing a log to stdout
>that third party library included a million features i knew nothing about
>so that libary parsed ldap urls and fetched and executed them
>i discovered this a decade later
>its a remote execution vulnerability... but it kinda isn't because it is an intentional feature of the library i used
>i reacted to this by using a different library to parse parts of the string before it gets parsed by the logger to filter it
>but that different library implemented the url differently
>so someone could inject urls into the urls and use the library parser to get something to the other parser library used by that logging library
or you could have just printed your log to stdout
Anonymous No.105622172
>>105622150
um actually urls are just ASCII and writing a parser is simple. Just eyeball the RFC from 30 years ago and split the input string and you're good
Segmentation fault (core dumped)
Anonymous No.105622194 >>105622235 >>105622284
>>105622150
damn, if even Google with their billions of USD produced zero day remote code execution exploits, there is totally no hope for us, do you trust the 44 authors behind your url parsing more than google? I sure don't. They don't have the money and skill.

I am also quite certain that you can't find those supposed "zero day attacks", but let reality not stop us.
Anonymous No.105622235 >>105622291
>>105622194
https://www.cve.org/CVERecord?id=CVE-2025-1211
you can find more by using any internet search engine
Anonymous No.105622283
>>105620403
A function is a microdependency
Anonymous No.105622284 >>105622300 >>105622302
>>105622194
log4j was caused by URL parsing confusion.
Anonymous No.105622291 >>105622295 >>105622302
>>105622235
>not chrome
>not a browser
>no "zero day attacks"
>no known attack
>not even a vulnerability, but only the possibility of a vulnerability if someone else uses the library for parsing and does weird checks with it
wow, that huge list of vulnerabilities that happened because some idiot imported a huge parsing library surely speaks in favor of... importing huge parsing libraries for URLs
Anonymous No.105622293
>>105621737
Since you haven't seen that it has already been addressed in this very thread, multiple times over, you are either blind or illiterate
Anonymous No.105622295 >>105622319
>>105622291
>I can't use a search engine
https://issues.chromium.org/issues/40093865
Anonymous No.105622300 >>105622314
>>105622284
log4j was caused by idiots importing shit they didn't need, without knowing what it does.
Anonymous No.105622302 >>105622306
>>105622291
See >>105622284
Anonymous No.105622306 >>105622328
>>105622302
see >>105622165
Anonymous No.105622314 >>105622330
>>105622300
Ok, let's learn something new.

Can you explain how exactly the log4j vulnerability resulted from:
>idiots importing shit they didn't need
Anonymous No.105622319 >>105622337 >>105622340
>>105622295
>if you do this, then an exception is thrown and nothing happens
amazing
Also it's not my fault that you claim a huge "zero day attack" on a browser and, when questioned about it, reply with some non-vulnerability in some weird fringe package.
Anonymous No.105622328 >>105622347 >>105622376
>>105622306
What about it? log4j vulnerability was not "intentional feature of the library". it was due to url parsing confusion.
Anonymous No.105622330 >>105622347
>>105622314
sure, see >>105622165
Anonymous No.105622337
>>105622319
log4j was a real zero day vulnerability
Anonymous No.105622340
>>105622319
>dumb retard can't read
>is convinced he can implement a compliant url parser
nice self own, retard
Anonymous No.105622347
>>105622330
Already replied: >>105622328
Anonymous No.105622353
https://pdw.ex-parrot.com/Mail-RFC822-Address.html
Anonymous No.105622356 >>105622364 >>105622367
Argument:
>don't import libraries that do five million different things if you only have to parse a string
Replies:
>oh yess?? Here look at those vulnerabilities that were caused by people importing libraries without knowing their features and behaviors!!!! This is the reason why you have to use as many libraries as possible!
very compelling, i am now a rust-bro and will vibe code with LLMs and instruct it to import as much as possible
Anonymous No.105622364
>>105622356
post your url parser
Anonymous No.105622367
>>105622356
Which libraries were incorrectly imported by the creators of log4j?
Anonymous No.105622376 >>105622389 >>105622396
>>105622328
Read your own article you posted.
>First, the initial vulnerability exists because retards use a library with ldap features when they simply want to log to stdout
>Second vulnerability because they use an epic URL parsing library before log4j uses a different epic URL parsing library, without them knowing how those libraries even parse

Not a single vulnerability you yap about was caused by someone parsing a string himself. But 100% were caused by parsing libraries. Really weird how that works...
Anonymous No.105622389 >>105622449
>>105622376
post your url parser
Anonymous No.105622396 >>105622449
>>105622376
>Read your own article
I didn't post any article.
>Second vulnerability because they use an epic URL parsing library before log4j uses a different epic URL parsing library, without them knowing how those libraries even parse
How does that make it "intentional feature of the library"? Sounds like url parsing confusion to me.
Anonymous No.105622414
>>105616740
curl
Anonymous No.105622425 >>105622433 >>105622521
Would you be vulnerable to the log4j exploit if you wouldn't have used a library with enabled ldap features when you don't need that?
Nope.

Would you be vulnerable to the log4j exploit if you would have parsed your log strings yourself?
Nope.

Would the patch, done by epic security researchers, have caused another vulnerability if they would have parsed the URL themselves?
Nope.

What would a good developer do?
printf()
Anonymous No.105622433 >>105622719
>>105622425
post your url parser
Anonymous No.105622449 >>105622454 >>105622521
>>105622389
>>105622396
I am sure minecraft REALLY REALLY needs to parse URLs in the log they output to stdout and they REALLY REALLY need those epic ldap features.
It is totally not an issue of importing general libraries with a gorillion features.

Faggots who think that they just HAVE TO include 44 micro dependencies to parse a string are bound to run into such issues. You don't know how those 44 libraries behave. So you don't know how they work together.
Anonymous No.105622454
>>105622449
>he can't parse a simple ascii string
nocoder spotted
Anonymous No.105622521
>>105622425
>>105622449
How does that make the url confusion vulnerability an "intentional feature of the library"?
Why did they fix it and raise an alarm if it was intentional?
Something doesn't add up.
Anonymous No.105622715
>>105619126
Based and chadpilled.
Anonymous No.105622719 >>105624779
>>105622433
[package]
name = "blahaj-parse-oxide"
version = "0.4.1"
edition = "2024"

[dependencies]
url = "2.5.4"

ada-url = "3.2.4"
url-parse = "1.0.10"


#![feature(random)]

use std::random::random;
use std::sync::Arc;

use url::Url as UrlUrl;
use ada_url::Url as AdaUrl;
use url_parse::url::Url as UrlParseUrl;
use url_parse::core::Parser;

#[derive(Debug, PartialEq)]
enum Url {
    UrlUrl(UrlUrl),
    AdaUrl(AdaUrl),
    UrlParseUrl(UrlParseUrl),
}

fn parse_url(input: String) -> Option<Arc<Box<Url>>> {
    let parser = Parser::new(None);
    match random::<u32>() & 3 {
        0 => Some(Arc::new(Box::new(Url::UrlUrl(UrlUrl::parse(&input).unwrap())))),
        1 => Some(Arc::new(Box::new(Url::AdaUrl(AdaUrl::parse(input, None).unwrap())))),
        2 => Some(Arc::new(Box::new(Url::UrlParseUrl(parser.parse(&input).unwrap())))),
        _ => None,
    }
}

fn main() {
    let url = parse_url("https://boards.4chan.org/g/thread/105614169".to_owned());
}
Anonymous No.105622741 >>105622862
>>105618373
Seems he didn't learn his lesson, bignum this anon.
Anonymous No.105622834
>>105618610
>in other words you don't have a real strategy and crablang's attitude towards basic shit like not making more work for other people is not encouraging.
>making more work for other people is actually LE GOOD
Why do you use a programming language? Why don't you write machine code directly in binary?
Anonymous No.105622862 >>105622950
>>105622741
NTA, how is bignum related to any of this?
He is right that the leftpad incident couldn't happen with cargo. Do you even understand why this is the case or are you just a nocoder throwing random shit and seeing what sticks?
Anonymous No.105622871 >>105622883 >>105623597
This is actually the main reason I never got into rust. I followed a tutorial once, and afterwards the project dir had ballooned to like 300MB.
Anonymous No.105622874 >>105622912 >>105622964
>>105617883
>the async cancer from JS.
Let's see how would you accomplish concurrency in embedded C.

>inb4 FreeRTOS bloat
Anonymous No.105622883
>>105622871
Good. People who can't manage their dependencies shouldn't use Rust.
Anonymous No.105622912 >>105622924
>>105622874
async is not concurrency, common misconception
Anonymous No.105622924 >>105622954
>>105622912
What?
Anonymous No.105622950 >>105622973 >>105623078 >>105623581
>>105622862
An NPM package called bignum was used to serve malicious code for scraping credentials.
The point still is that all these cascading microdependencies are a crazy way to do software development.
Anonymous No.105622954 >>105623078
>>105622924
It can be implemented using concurrency, but, in principle, it is not concurrency.
In other words, concurrency is an implementation detail of asynchrony.
Anonymous No.105622964 >>105623078 >>105631912
>>105622874
You mean fearless concurrency, surely.
Anonymous No.105622969 >>105622983
>>105621723
>>105621798
>>105621821
URLs are a well-defined specification. The proper thing to do is have a complete implementation that handles everything properly, instead of making everyone half-ass their own implementation.

>>105621917
That's an example of half-assed. URLs are more complicated than that, and passing in an invalid or malformed URL is how you get exploited, e.g. via phishing.
Anonymous No.105622973
>>105622950
Wrong. An S3 bucket was used to serve malware.
Anonymous No.105622983 >>105623183
>>105622969
>URLs are a well-defined specification
wrong
https://github.com/bagder/docs/blob/master/URL-interop.md
Anonymous No.105623078 >>105623202 >>105631893
>>105622950
>The point still is that all these cascading microdependencies are a crazy way to do software development.
Having fewer, but larger dependencies is more dangerous. XZ happened even though it was supplied by regular package managers. Having no cascading micro dependencies didn't prevent any of this from happening. What's more, it made it even more difficult to prevent attacks like this. Cargo actually pins versions of your dependencies. You can review all your dependencies and be certain you are safe. On the other hand, if you rely on the system's package manager, such a vulnerability can be added retroactively to your application and you can't rely on dependency audits. Additionally, with micro dependencies, the libraries you depend on are likely to be much smaller and easier to audit. They often share dependencies so you do not have to audit hundreds of different linked list implementations. In general, it reduces the amount of code you have to audit significantly, in contrast to using large, batteries-included dependencies that you often see in languages with poor dependency management. Whenever you give people option to efficiently manage dependencies, they will gravitate to using small, specialized libraries because it IS the most sane way to do software development, so called, Unix Philosophy.

>>105622954
>it is not concurrency
[citation needed]

>>105622964
No, I mean just concurrency. You can't have fearless concurrency in C, as in, it is easy to cause UB in C when doing concurrency. FreeRTOS does not give you that either.
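To make the pinning point concrete, this is roughly what exact pinning looks like in a Cargo.toml (crate name and version here are just examples):

```toml
[dependencies]
# "=x.y.z" pins one exact version; a plain "x.y.z" allows semver-compatible updates
url = "=2.5.4"
```

Cargo.lock additionally records the exact resolved version and checksum of every transitive dependency, and `cargo tree` prints the full graph so you can see what you would actually have to audit.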
Anonymous No.105623179
>>105622058
JavaScript fags absolute BTFO.
Anonymous No.105623183 >>105623206
>>105622983
There's an algorithm for parsing them.
https://url.spec.whatwg.org/#concept-basic-url-parser
Anonymous No.105623187 >>105623207
>>105616751
How would you make it memory unsafe? Serious question. I've been coding C++ for years and this has never even been an issue. How are people making things unsafe? What is unsafe? What the fuck are you doing that's unsafe?
Anonymous No.105623202 >>105623236
>>105623078
>Whenever you give people option to efficiently manage dependencies, they will gravitate to using small, specialized libraries because it IS the most sane way to do software development, so called, Unix Philosophy.
Unix Philosophy is when all those libraries are separate processes and you have to convert everything to a text stream and pass it on stdin/stdout. Unix shills try to claim credit for everything. Libraries are the way people did things before the Unix philosophy.
Anonymous No.105623206
>>105623183
try reading
>TWUS: Doesn't specify IDNA 2003 nor 2008
Anonymous No.105623207 >>105623217 >>105623516
>>105623187
NTA, unsafe means that it can manifest observable undefined behavior.
Anonymous No.105623217 >>105623225 >>105623294
>>105623207
wut
99% of the time if you do something dumb it just crashes.
Anonymous No.105623225
>>105623217
Crashing can result from UB ridden code, yes.
Anonymous No.105623236 >>105623253
>>105623202
>separate processes and you have to convert everything to a text stream and pass it on stdin/stdout
Yup, these things also fall under the Unix Philosophy definition.
Anonymous No.105623253 >>105623266 >>105623452
>>105623236
>Yup, these things also fall under Unix Philosophy definition.
That's not how libraries work though. Calling libraries "Unix Philosophy" is lying and also shilling because Unix is a registered trademark of proprietary software.
Anonymous No.105623266
>>105623253
I didn't call libraries "Unix Philosophy"
Anonymous No.105623294 >>105627301
>>105623217
If you're using Python or JavaScript where you have runtime checks and don't catch the exception, it crashes, but in C it does something random and unpredictable and you usually have no idea what happened. It's how N64 games have those crazy glitches that can let you run arbitrary code or turn items into other things or warp you to the end credits.
Anonymous No.105623394 >>105623477
>>105616751
Yes
>>105616751
Yes, it only happens to bad programmers like the gnome team
>>105616837
Is a short script
Anonymous No.105623432 >>105623491
>>105620329
>saar you don't need 100 just 10 to do a simple task like parsing url
Anonymous No.105623447
>>105616701 (OP)
Why not just a regex
Anonymous No.105623452
>>105623253
>and also shilling because Unix is a registered trademark of proprietary software.
yeah i'm sure someone is gonna go out and buy an HP-UX license because of his subtle viral marketing
Anonymous No.105623477
>>105623394
>Is a short script
Post it
Anonymous No.105623482
>Doing One Thing, Well
Except when it comes to source code then you have to do everything yourself and never use anything others have made.
Anonymous No.105623491 >>105623509 >>105623600
>>105623432
Parsing a URL is not simple. You think it's simple because you haven't read the spec yet.
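A toy illustration of one such edge case (a naive sketch, not from any real parser): splitting the authority at the first colon to find the port works for the common case and falls over on an IPv6 literal.

```rust
// Naive port extraction: split the authority at the first ':'.
// Fine for "host:port", wrong for IPv6 literals like "[::1]:8080".
fn naive_port(authority: &str) -> Option<&str> {
    authority.split_once(':').map(|(_, port)| port)
}

fn main() {
    // the common case looks fine...
    assert_eq!(naive_port("example.com:8080"), Some("8080"));
    // ...but an IPv6 host gets cut at the first colon inside the address
    assert_eq!(naive_port("[::1]:8080"), Some(":1]:8080"));
    // and a URL with no port at all yields None
    assert_eq!(naive_port("example.com"), None);
}
```

And that's before percent-encoding, userinfo, IDNA and the rest of the spec enter the picture.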
Anonymous No.105623509 >>105623545 >>105623563
>>105623491
>the spec
"the spec" is what chromium does
Anonymous No.105623516 >>105623537
>>105623207
>observable
>undefined
good morning sir
Anonymous No.105623537 >>105623625
>>105623516
Yes, undefined behavior can be observed. Do you even know what undefined behavior is?
Anonymous No.105623545
>>105623509
Then it's definitely not simple.
Anonymous No.105623554
>>105616837
have you seen any repo listed in "awesome rust"?

half of them are abandoned
Anonymous No.105623563 >>105623633
>>105623509
https://chromium.googlesource.com/chromium/src/+/HEAD/url
-------------------------------------------------------------------------------
Language files blank comment code
-------------------------------------------------------------------------------
C++ 37 1816 2608 10529
C/C++ Header 19 701 1869 2886
Anonymous No.105623581 >>105623869
>>105622950
>The point still is that all these cascading microdependencies are a crazy way to do software development.
Then argue that point. Because you were not. You were arguing cargo is as bad as npm, citing example npm incidents that could not happen with the way cargo functions.
This is why I called you a cargo cultist. You do not understand WHY npm has these problems; all you understand is that it has these problems and that it has microdependencies. So, like the dumb islander who doesn't understand why the plane comes, only that it does and how the people it comes to act, and who assumes that acting the same way is what brings it, you assume microdependencies are why these problems happen
Anonymous No.105623597
>>105622871
Good. No brokie larpers on their stinkpads. Stick to tinkertrannying
Anonymous No.105623600 >>105623666 >>105623677
>>105623491
Parsing anything is simple. Go write a shitty little dsl or 2 and you're golden. You will never have to touch regex or someone else's parsing library again in your life.
Anonymous No.105623625 >>105623659
>>105623537
yes rajesh
do you?
Anonymous No.105623633 >>105623723 >>105628566
>>105623563
>95% tests and shit
yeah
Anonymous No.105623659 >>105623671
>>105623625
>do you?
Of course, are you blind? You asked when unsafe means and I used undefined behavior in the definition. So of course I know the term.
However, considering you do know it as well (at least claim to do so) and then decide to call me an indian, I guess that's your form of admitting a lack of further arguments. All I can answer to that is:

I accept your concession.
Anonymous No.105623666
>>105623600
Parsing stops being easy when the thing you're trying to parse is needlessly convoluted, at least if you need to handle edge cases.
Anonymous No.105623671
>>105623659
>when
what does*
Anonymous No.105623677 >>105623703
>>105623600
Post your url parsing code.
Anonymous No.105623703 >>105623710 >>105623721
>>105623677
No I'm not posting my github/code from there to you data farming faggots so you can dox me lol. Just go learn some basic lexing/parsing techniques and you'll be good.
if you're a poor fag:
https://craftinginterpreters.com/
if you have some cash (or pirate it from Anna's archive):
https://interpreterbook.com/
Anonymous No.105623710 >>105623715
>>105623703
nocoder spotted
Anonymous No.105623715
>>105623710
Is learning a new skill too much for you?
Anonymous No.105623721
>>105623703
Post pastebin with the code.
Anonymous No.105623723
>>105623633
>>95% tests and shit
No it's not. Tests modules are under tests/
Anonymous No.105623869 >>105624029 >>105624645
>>105623581
No, I think having a whole package just to leftpad a string is pretty clear evidence of the problem with microdependencies: a whole mess of packages doing such simple things that can be accomplished with a for loop. This is clearly a problem with these types of ecosystems.
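For what it's worth, the for-loop version really is about this short (a sketch, not the actual npm package):

```rust
// Left-pad `s` to at least `width` characters using `fill`, no dependency needed.
fn left_pad(s: &str, width: usize, fill: char) -> String {
    let missing = width.saturating_sub(s.chars().count());
    let mut out = String::with_capacity(s.len() + missing);
    for _ in 0..missing {
        out.push(fill); // the for loop in question
    }
    out.push_str(s);
    out
}

fn main() {
    assert_eq!(left_pad("7", 3, '0'), "007");
    assert_eq!(left_pad("hello", 3, '0'), "hello"); // already wide enough
}
```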
Anonymous No.105623958
>yet another thread in which cniles get absolutely BTFO by rust trannies
seriously embarrassing
Anonymous No.105623970 >>105623992 >>105624020 >>105624052 >>105624066 >>105624073
Rustrannies are too incompetent to copy picrelated without splitting it into 9999 packages btw.
Anonymous No.105623992
>>105623970
tooling issue
https://crates.io/crates/cargo-onefile
>Cargo Onefile is a Rust tool that generates a single file containing all the source code of a Rust project
Anonymous No.105624020
>>105623970
Why would you put 44 or whatever crates into one big source file? Are you stupid?
Anonymous No.105624029
>>105623869
>This is clearly a problem to do with these types of ecosystems.
NTA, how exactly does this issue translate to Cargo and what are the real world consequences of it?
Anonymous No.105624052 >>105624066
>>105623970
>https://github.com/ada-url/ada
>gay unicorn
kek
Anonymous No.105624066
>>105623970
>>105624052
-------------------------------------------------------------------------------
Language files blank comment code
-------------------------------------------------------------------------------
C++ 15 764 1521 15180
CMake 1 8 1 58
Anonymous No.105624073 >>105624121
>>105623970
None of the cniles can ever explain why separating a library into multiple crates is bad. Let's talk specifically about the icu crates included in the OP. Why would you want that to be one larger crate? The way it's organized now is far superior because you can use most of it in no std environments and/or in places where no allocation can occur. You can also avoid unicode normalization tables if you don't need them. You don't even gain anything by combining it all into one crate because you can already get that functionality by depending on the parent workspace crate directly.
Anonymous No.105624089 >>105624096
Do cniles genuinely think parsing url is splitting an ASCII string by a delimiter?
I cannot understand their primitive way of thinking
Anonymous No.105624096 >>105624115
>>105624089
yes, see >>105621723 >>105621798 >>105621821 >>105621917 >>105621983
Anonymous No.105624115
>>105624096
kek
who said cniles can't entertain
Anonymous No.105624121 >>105624126
>>105624073
Yeah bro I totally want bloat like URLs in bare metal environments, especially in Rust...
Anonymous No.105624126
>>105624121
Yes, that's extremely useful actually.
Anonymous No.105624162
>>105619709
>Parsing URI is not simple.
and? the parsing is done by the library itself and none of the dependencies help with the parsing
Anonymous No.105624645 >>105624771
>>105623869
Then why bring up bignum? It does not meet your criteria of "packages to do such simple things that can be accomplished with a for loop"
how is that relevant to npm at all? a leftpad-sized dependency doesn't require the dependency management capabilities of npm
Again. You do not understand WHAT you're against. You've moved the goalposts multiple times. You're searching in the dark for a valid justification for your dislike of cargo. There probably isn't any and you've been mindbroken by /gpol/ into being obsessed with tranny dick
Anonymous No.105624767
>>105621798
So you can bang out a flawless parser for any language on the spot? I mean it's just "string parsing" right?

Any idiot can write a url parser that sort of works, but unless you're okay with it breaking on edge cases you should use a standard library.
Anonymous No.105624771
>>105624645
Okay, so you have two problems now. Extremely small packages to do easy things and supply chain attacks. Congrats I guess?
As for your last sentence, lol possibly even lmao.
Anonymous No.105624779
>>105622719
So you don't care that it just returns None 25% of the time?
Anonymous No.105624859 >>105624896 >>105625011
Why hasn't a resident cnile posted their ez no-import url parser?
Anonymous No.105624896
>>105624859
Because it depends on the url, retard.
Anonymous No.105624952
If your URL parser isn't used in a major browser it's almost certainly shit and probably has massive problems. Therefore basically all URL parsers outside of the chromium C++ and Firefox Rust url parsers are piles of shit that will fuckup massively on edge cases.
Anonymous No.105625011
>>105624859
Because cniles don’t really code. They’re busy doing K&R exercises.
Anonymous No.105625015 >>105625397
>>105616751
Of course. All arrays are fixed size, any URL too long is rejected, and that is good because it is probably some kind of DoS attack.
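As a sketch of that approach (the 2048 cap is an arbitrary choice for illustration, not from any spec):

```rust
// Reject oversized input up front so everything downstream can assume
// a bounded length; this also caps per-request allocation.
const MAX_URL_LEN: usize = 2048; // arbitrary cap chosen for this sketch

fn accept_url(input: &str) -> Option<&str> {
    if input.len() > MAX_URL_LEN {
        return None; // treat as hostile rather than truncating
    }
    Some(input)
}

fn main() {
    assert!(accept_url("https://example.com/").is_some());
    assert!(accept_url(&"a".repeat(MAX_URL_LEN + 1)).is_none());
}
```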
Anonymous No.105625397
>>105625015
based static allocator
Anonymous No.105627301 >>105627329 >>105627450 >>105627508 >>105628452
>>105623294
>but in C it does something random
No it doesn't. If you access memory you're not supposed to, the OS kills the app.
Anonymous No.105627329 >>105627369
>>105627301
wrong
Anonymous No.105627369 >>105627381
>>105627329
Anonymous No.105627381 >>105627417 >>105627501
>>105627369
https://cwe.mitre.org/data/definitions/125.html
Anonymous No.105627417 >>105627425
>>105627381
Nice theory for college pajeets but in the real world your code isn't going to magically jump out of the vector you're using. Most of the time the only real issue is accidentally reading an unassigned variable and getting garbage, but just use a sanitiser to check for them or just don't be a pajeet.
Anonymous No.105627425 >>105627438
>>105627417
>real world
https://www.cve.org/CVERecord?id=CVE-2014-0160
Anonymous No.105627438
>>105627425
> It resulted from improper input validation (due to a missing bounds check) in the implementation of the TLS heartbeat extension
Like I said, just don't be a pajeet. Thankfully C++ has bounds checks on everything in the STL.
Anonymous No.105627440 >>105627452 >>105627477 >>105627484 >>105628512
I unironically think the awful dependency management of C/C++ is a good thing. It leads to people only pulling in dependencies which are actually useful, and you never get instances of pulling a library which ends up transiently pulling 4 other things and so on. Ultimately, external dependencies are a security risk but it comes down to trust. No one is actually completely auditing their external libraries, and even if they are, real backdoors or just other accidental vulnerabilities aren't something you just catch with a "code audit". It's much easier for me to just trust libcurl than it is to trust the 100+ micro dependencies from ~100 authors.

I think there is probably a middle ground (I've found many Python projects with zero to very few dependencies, for example) but when dependencies are too easy to add and the language lacks a functional stdlib you get node and rust.
Anonymous No.105627450 >>105627488 >>105627603
>>105627301
>If you access memory you're not supposed to the OS kills the app.
Anonymous No.105627452
>>105627440
Because pajeets don't code they just put lego bricks together. I hate web devs so much it's unreal.
Anonymous No.105627465 >>105627505
>>105616740
When I was in college I wrote a URL parser in C drunk one evening for fun. No dependencies except for the standard library.
Anonymous No.105627477
>>105627440
>security risk
Security is a surface level thing. *YOU* are responsible for the behavior of the composite software system and when you depend on other software you delegate that responsibility. If you delegate it to strangers you don't know what your software will do. It might do something dumb. It might do something dangerous.
Anonymous No.105627484 >>105627555
>>105627440
>It's much easier for me to just trust libcurl than it is to trust the 100+ micro dependencies from ~100 authors.
This kind of thinking led to the XZ attack. No amount of auditing would prevent your program from getting a vulnerability like this added retroactively.
No, you can't rely on well known libraries to be safe. If you want to ensure your program is not exploitable, YOU should audit all your dependencies. And Rust's version pinning and the relatively small size of its dependencies make that much easier.
Anonymous No.105627488 >>105627505
>>105627450
> support diversity
> let's you modify out of bounds array
Checks out. Just don't be a troon.
Anonymous No.105627501 >>105627512
>>105627381
>HEY EVERYONE! DID YOU KNOW PEOPLE ARE BAD AT KEEPING TRACK OF INDEX INVARIANTS!???! YEAH SOMETIMES PEOPLE FORGET Z < X
Anonymous No.105627505 >>105627522
>>105627465
Post the code.

>>105627488
>say something wrong
>get corrected
>start screeching something about sexuality
Why is this so common among cniles?
Anonymous No.105627508 >>105627522
>>105627301
The OS doesn't always catch it.
Anonymous No.105627512
>>105627501
Who are you quoting?
Anonymous No.105627521 >>105627530 >>105628566 >>105629813 >>105632697
It's honestly sad reading posts from people who think that writing their own URL parser is some kind of impossible task. I mean, it's not something I'd write in an hour, but like, the golang implementation of url parsing is all in one ~1000 line file, probably closer to 500 once the comments are stripped out. So yes, it is legitimately quite easy. It just shows how many people have self-actualized their own incompetence, thinking it's actually something hard to do.

Also, all the vulnerabilities posted have literally nothing to do with URL parsing -- it's all the same issues with C strings which have existed in all C code that takes in any user input.
Anonymous No.105627522
>>105627508
>>105627505
Correct this dumb fuck.
Anonymous No.105627530 >>105627582
>>105627521
>all the vulnerabilities posted have literally nothing to do with URL parsing
log4j was caused by url parsing confusion
Anonymous No.105627555 >>105627618 >>105627624
>>105627484
You are an idiot if you think trust has nothing to do with modern security. For any non trivial code, it's impossible to audit it for security / accidental vulnerabilities with 100% confidence, especially in a language like C.

And that is completely ignoring everything else that goes into building a program. Have you audited the source code for your compiler? How about whatever tool you are using for dependency management? How can you know those aren't doing something malicious?
Anonymous No.105627582 >>105627605
>>105627530
No it wasn't.
Anonymous No.105627603 >>105627620
>>105627450
>accesses memory he's clearly allowed to access
i dont know what point you're trying to make here anon
Anonymous No.105627605 >>105627636
>>105627582
It was. It parsed urls slightly differently in two different places, one being the validator and the other being the code that actually used the url. This allows an attacker to bypass validation and trigger RCE via JNDI with a specially constructed url.
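The general pattern is a parser differential. A toy sketch (hypothetical hand-rolled parsers, not the actual log4j code) where the validator and the code doing the lookup extract different hosts from the same string:

```rust
// Validator's idea of the host: everything after "://" up to '/', '#' or '?'.
fn validator_host(url: &str) -> &str {
    let rest = url.split_once("://").map_or(url, |(_, r)| r);
    rest.split(|c| c == '/' || c == '#' || c == '?').next().unwrap_or(rest)
}

// Lookup code's idea of the host: everything after the last '@' in the authority.
fn lookup_host(url: &str) -> &str {
    let rest = url.split_once("://").map_or(url, |(_, r)| r);
    let authority = rest.split('/').next().unwrap_or(rest);
    authority.rsplit('@').next().unwrap_or(authority)
}

fn main() {
    let url = "ldap://allowed.example#@evil.example/obj";
    // the allow-list check passes against the harmless host...
    assert_eq!(validator_host(url), "allowed.example");
    // ...while the actual connection goes somewhere else entirely
    assert_eq!(lookup_host(url), "evil.example");
}
```

Two parsers, one string, two hosts; that disagreement is the vulnerability, regardless of whether either parser is "wrong" in isolation.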
Anonymous No.105627618
>>105627555
dont forget your file system, your os, every piece of firmware you rely on, etc.
Anonymous No.105627620
>>105627603
>>>i dont know what point you're trying to make here anon
>>but in C it does something random
>No it doesn't.
It does something random as presented.
Anonymous No.105627624 >>105627630
>>105627555
>it's impossible to audit it for security / accidental vulnerabilities with 100% confidence, especially in a language like C.
Yes you can and code audits are common in safety critical software.
Anonymous No.105627630
>>105627624
audits are common, but they definitely don't catch 100% of vulns
Anonymous No.105627636 >>105627649
>>105627605
What's your point? It parsed the URL just fine, accurately, and completely safely. The fact that the URL lead to a malicious object isn't the URL parser's problem.
Anonymous No.105627649 >>105627656 >>105627678
>>105627636
>What's your point?
That
>all the vulnerabilities posted have literally nothing to do with URL parsing
is wrong because
Anonymous No.105627656 >>105627678
>>105627649
because...
>log4j was caused by url parsing confusion
Anonymous No.105627675
>>105621821
ok, go spend hours reading up on the specs for internet urls and writing and testing your own implementation instead of using an existing one
what's the point of foss if you aren't going to use shared code?
Anonymous No.105627678 >>105627699
>>105627656
>>105627649
Please pay attention. There was no confusion at any point. The library added RCE as a feature, and it worked exactly as they had it implemented.
Anonymous No.105627699
>>105627678
>The library added RCE as a feature, and it worked exactly as they had it implemented
[citation needed]
Anonymous No.105628045
It's kind of amazing how quickly rust is dying. It was so promising
Anonymous No.105628077 >>105628232
>>105619127
even if that were true, in C you would just use curl
Anonymous No.105628232 >>105628313
>>105628077
No I wouldn't, curl is a bloated piece of shit just like the modern web; when I need webslop, I use python.
>b-b-but
when I want to write network code in C, I make my own protocol for a very specific problem and its solution, fuck you.
Anonymous No.105628313 >>105628333 >>105629669
>>105628232
i was going by the logic of using dependencies instead of making your own custom solution, due to the people ITT insisting that writing a URL parser is way too complicated for a person to do.
even in the case that they're correct, in C you would use one dependency and be done with it. No bloatmaxxed quadruple-digit dependency hell required
Anonymous No.105628333 >>105628626
>>105628313
it's just cargo add url though
what;s the problem?
Anonymous No.105628403 >>105628421
>>105621798
Some programmers don't know anything other than do_thing(), mostly because they didn't learn DSA or can't use it in practice
Anonymous No.105628421 >>105628455
>>105628403
post your url parser
Anonymous No.105628452
>>105627301
Wrong. If the CPU supports this feature at all (remember, muh poor table ass enby), and paging is on, and if you're in user mode or SMAP is on, and you access a page you don't have permission for, then you enter an interrupt and the kernel decides what to do next.
Anonymous No.105628455 >>105628474 >>105628590
>>105628421
Feel free to validate your mediocrity with the fact that I'm unable to write an URL parser in five minutes; I won't waste a few hours proving you wrong either
Anonymous No.105628474
>>105628455
>I'm unable to write an URL parser
as expected of a cnile
Anonymous No.105628512
>>105627440
>only pulling in dependencies which are actually useful
>you never get instances of pulling a library which ends up transiently pulling 4 other things and so on
You instead get instances of pulling a library which simply bundles the 40 subdependencies it would have into itself. Then you pull a second one and it does the same, except if you actually separated it out 20 of their dependencies would be common with the first library.
>It's much easier for me to just trust libcurl than it is to trust the 100+ micro dependencies from ~100 authors.
You simply pulled 1 dependency from ~1000 authors. Dependencies are not equivalent, you don't get more pie by cutting it into smaller slices.
Anonymous No.105628566
>>105627521
What's sad is reading retards who think complexity is determined by the number of lines. Or clueless retards like >>105623633 that see a fuckton of tests covering one area and think it means that area is simple
Post your url parser so we can post an edge case it doesn't handle properly
Anonymous No.105628590
>>105628455
This thread has been up for a day, we'll wait. If it expires you can make a new one when it's done, don't worry we'll find and bully you
Anonymous No.105628626 >>105628654
>>105628333
bad ragebait here's your (you)
don't reply to me
Anonymous No.105628654
>>105628626
>he can't articulate his problem
either post your url parser or clearly state what the problem is.
until then, you have outed yourself as a nocoder.
Anonymous No.105629186
Imagine working at a job with a cnile spastic who thinks that their unique take on a URL parser is what is needed to move the overall project forward
Anonymous No.105629669 >>105629673 >>105632017
>>105628313
>writing a URL parser is way too complicated for a person to do
Well, post your attempt. Let's see how well it conforms to the standard.
Anonymous No.105629673 >>105629693
>>105629669
>the standard
what standard
Anonymous No.105629693 >>105630055
>>105629673
https://url.spec.whatwg.org/
Anonymous No.105629813
>>105627521
The Golang URL parser doesn't support the entire modern spec and fails on tons of edge cases that browsers need to support. It's also slow as fuck.
Anonymous No.105630055 >>105630105 >>105630835
>>105629693
chrome doesn't honor this standard
what's the point?
Anonymous No.105630105
>>105630055
>what's the point?
The purpose of a standard is to establish a consistent set of criteria, guidelines, or specifications that ensure quality, safety, interoperability, and efficiency in products, services, and processes.
Anonymous No.105630835
>>105630055
huh? but I thought a standard is a magic spell that engraves rules into reality and enforces consistency over all implementations
Anonymous No.105631400 >>105631810 >>105632031
https://www.youtube.com/watch?v=xC8qfXxAhAw
>watch some yt
>oh an RCE in ASUS motherboards
>thanks to two different URL parsing fuckups you can RCE any executable on a target system with ASUS's (installed by default) shitware
>4 days ago
AHAAHAHHA
IT WAS ONE OF YOU FUCKING CNILES WASN'T IT
DID YOU GET FIRED AND COME HERE TO BITCH THAT URL PARSING IS TRIVIAL ACTUALLY? NAH PROBABLY LOST THE INTERNSHIP HUH?
Anonymous No.105631772
>>105616837
>just put the security of your code in other people's hands (troons working for the cia)
no thanks
Anonymous No.105631810
>>105631400
Lmao
Anonymous No.105631845 >>105631859
>>105616701 (OP)

Rust binaries are hundreds of times larger than C's, I don't see the point in such a bloated language
Anonymous No.105631859
>>105631845
They are not if you use the same linking strategy.
Anonymous No.105631893 >>105631912
>>105623078
>You can't have fearless concurrency in C
Rustfags sure love using meaningless buzzwords
Anonymous No.105631912
>>105631893
The first one to use this term ITT was >>105622964
I merely replied to him using the same term.
Anonymous No.105631985 >>105632009
>>105616837
>t.webscripter who never wrote actual software
opinion discarded, webshitters need to stay in their lane while the adults run the world
Anonymous No.105632009 >>105632029
>>105631985
>writes and discusses URL parsers
>noo, I'm not a webshitter noooo u
Anonymous No.105632017 >>105632031
>>105629669
>Let's see how well does it conform to the standard
Does your application need to parse all standard URLs?
Anonymous No.105632029
>>105632009
I don't write userspace software so Idgaf
Anonymous No.105632031 >>105632103
>>105632017
No, but small differences in parsing can result in CVEs like log4j or >>105631400
Anonymous No.105632103 >>105632150 >>105632338
>>105632031
Log4j was a result of people adding JNDI features into a logging framework, which were of use to absolutely nobody except for the people who figured out a way to exploit it. Parsing errors have never been the root cause of this vulnerability and I'm not sure why you're trying to prove otherwise, considering it's common knowledge.
And this right here:
>small differences in parsing can result in CVEs
is about as sophisticated of a claim as
>integer arithmetic can result in CVEs
Just because you can't advertise your code as "fully compliant" with some RFC doesn't mean you can't implement a subset of it correctly. This is especially true when it comes to parsers since it is trivial to narrow down general production rules by introducing additional constraints.
Anonymous No.105632150 >>105632236
>>105632103
This isn't true.
Parser mismatch is how you get smuggling: strings with content after the NUL in C shitware, HTTP header parsers reading and treating requests differently, possibly over-reading into other content in the stream, etc.

Either your parser is fully correct or it's highly likely to be a vulnerability in the making. Postel's law is retarded.
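Here's what that NUL mismatch looks like in practice. A quick sketch (Python standing in for both sides; the function names and the `.jpg` check are made up for illustration):

```python
# Hypothetical parser-mismatch smuggling: a length-aware validator and a
# NUL-terminated consumer see different strings in the same byte buffer,
# so a check on one view doesn't protect the other.

def validator_sees(name: bytes) -> bytes:
    # Length-aware code (most managed languages) sees the whole buffer.
    return name

def c_consumer_sees(name: bytes) -> bytes:
    # C-style code stops at the first NUL, like strcpy/fopen would.
    return name.split(b"\x00", 1)[0]

payload = b"evil.exe\x00.jpg"

# The validator approves: the full string "ends with .jpg".
assert validator_sees(payload).endswith(b".jpg")

# But the C side opens "evil.exe" -- the real name was smuggled past the check.
assert c_consumer_sees(payload) == b"evil.exe"
```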
Anonymous No.105632235
>>105616740
About 3 if you don't count standard library stuff.
https://packages.debian.org/sid/libglib2.0-0
Anonymous No.105632236 >>105632713
>>105632150
It is, your parser simply supports a proper subset of a grammar. Feel free to bring up real-world examples where a) there are 2 different URL parsers in a system b) both accept some valid URLs and reject all invalid URLs and c) it can be exploited
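To make the "proper subset" point concrete, a sketch of such a validator (the charset and structure constraints here are deliberately narrow and made up; only the claim that every accepted string is a valid RFC 3986 URL matters):

```python
import re

# Hypothetical "proper subset" acceptor: everything it accepts is a valid
# URL, but it rejects far more than a full parser would, so there is
# nothing exotic left for a second parser to disagree about.
SUBSET = re.compile(
    r"https://"                          # scheme fixed
    r"[a-z0-9](?:[a-z0-9.-]*[a-z0-9])?"  # plain host, no userinfo/port
    r"(/[A-Za-z0-9._~/-]*)?"             # plain path, no query/fragment
)

def accepts(url: str) -> bool:
    return SUBSET.fullmatch(url) is not None

assert accepts("https://example.com/foo/bar")
# Userinfo, fragments, percent-escapes: simply refused.
assert not accepts("https://user@example.com/")
assert not accepts("https://127.0.0.1#.evil.example/")
```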
Anonymous No.105632338 >>105632479 >>105632507
>>105632103
>Log4j was a result of people adding JNDI features into a logging framework
No. Log4j still uses JNDI and no longer has this vulnerability. Inclusion of JNDI was not the source of this vulnerability. It manifested because there were differences in how URLs are parsed, which allowed an attacker to bypass validation and pass malicious links to achieve RCE using JNDI.

>Parsing errors have never been the root cause
The root cause of this was URL parsing confusion. This is arguably not the same thing as parsing errors.

>is about as sophisticated of a claim as
>>integer arithmetic can result in CVEs
arithmetic is not a bug. But if your program does integer arithmetic in different ways, for example by using different algorithms in different places, it could cause integer arithmetic confusion, and that could result in a vulnerability.
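The disagreement behind the log4j allowlist bypass (CVE-2021-45046) is reproducible in a few lines. A sketch, with a made-up naive splitter standing in for the second parser:

```python
from urllib.parse import urlsplit

# Two parsers, one URL, two different hosts.
url = "ldap://127.0.0.1#.evil.example/a"

# Parser 1: spec-aware, treats '#' as the fragment delimiter.
strict_host = urlsplit(url).hostname

# Parser 2: naive, grabs everything between "//" and the next "/".
naive_host = url.split("//", 1)[1].split("/", 1)[0]

assert strict_host == "127.0.0.1"
assert naive_host == "127.0.0.1#.evil.example"
# If the allowlist checks strict_host ("localhost, fine") but the lookup
# code resolves naive_host, the attacker's server gets the connection.
```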
Anonymous No.105632479 >>105632515
>>105632338
>Inclusion of JNDI was not the source of this vulnerability
>it manifested … using JNDI
Whoever’s running this bot: please increase its context window and reasoning capabilities. What you have now is quite frankly an embarrassment
Anonymous No.105632507 >>105632533
>>105632338
There was no "parsing confusion" and I don't know why you're trying so hard to pretend that there was. Log4j allowed dynamic lookups based on user-provided input AS A FEATURE. The fix to this vulnerability was to disable it by default. This is the third time ITT someone has to mention it just because you're this desperate to win an argument.
Anonymous No.105632515 >>105632529
>>105632479
>>Inclusion of JNDI was not the source of this vulnerability
>>it manifested … using JNDI
Yup
Log4j still uses JNDI and no longer has this vulnerability, therefore inclusion of JNDI was not the source of this bug. URL parsing confusion was.
Anonymous No.105632522
actual retard holy shit
Anonymous No.105632529 >>105632541 >>105632713
>>105632515
>log4j removed a feature and the bug went away, therefore the feature wasn’t the cause of the bug
Bot or 43 IQ, pick exactly one
Anonymous No.105632533 >>105632713
>>105632507
>There was no "parsing confusion"
https://media.telefonicatech.com/telefonicatech/uploads/2021/1/149144_Exploiting-URL-Parsing-Confusion.pdf
https://claroty.com/team82/research/exploiting-url-parsing-confusion
Anonymous No.105632541
>>105632529
Not an argument
Anonymous No.105632570
>make a new dependency that packages all the 44 dependencies

problem solved
Anonymous No.105632697
Meanwhile in Go, literally everything to do with networking and the Internet is included in the standard library.
>>105627521
Time is money. The less time I spend writing a URL parser the better.
Anonymous No.105632713 >>105632826
>>105632533
This is one article since they have the same content word-for-word. You're trying way too hard.
It also agrees that the root cause was allowing JNDI lookups inside user-controlled strings.
It also gets everything wrong and contradicts itself, just like you did, by claiming that the whitelisting fix didn't work because "multiple parsers were used" while the actual reason was that one of them was straight up wrong and couldn't even delimit the authority part based on trivial rules defined in RFC 3986.

>>105632529
I think it's the latter. Bots don't drop arguments when they get owned and I'm still waiting for his response to >>105632236.
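For reference, RFC 3986 Appendix B literally ships a regex that delimits the five components, authority included; it really is that mechanical. A Python sketch using that exact regex:

```python
import re

# The regex from RFC 3986 Appendix B, verbatim.
RFC3986 = re.compile(r"^(([^:/?#]+):)?(//([^/?#]*))?([^?#]*)(\?([^#]*))?(#(.*))?")

def split_uri(uri: str):
    m = RFC3986.match(uri)
    # groups 2, 4, 5, 7, 9 = scheme, authority, path, query, fragment
    return m.group(2), m.group(4), m.group(5), m.group(7), m.group(9)

assert split_uri("https://example.com:449/foo/bar?name=moo") == (
    "https", "example.com:449", "/foo/bar", "name=moo", None
)
# '#' ends the authority, so a host with a '#' in it splits cleanly:
assert split_uri("ldap://127.0.0.1#.evil.example/a")[1] == "127.0.0.1"
```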
Anonymous No.105632826
>>105632713
>This is one article since they have the same content word-for-word.
The article and paper explain how URL parsing confusion works and how it caused the log4j vulnerability.

>You're trying way too hard.
Not an argument.

>It also agrees that the root cause was allowing JNDI lookups inside user-controlled strings.
These lookups are still possible. The vulnerability was caused by URL parsing confusion, which was fixed later on. JNDI lookups were not changed.

>claiming that the whitelisting fix didn't work because "multiple parsers were used"
No, the fix didn't work because there were differences in how these parsers work, and this tricked the validator into accepting a URL that was then interpreted differently by the part that executes JNDI. This is so-called URL parsing confusion.