A city whose citizens mostly drive is less independent than a city whose citizens mostly ride bicycles. Bicycling infrastructure is orders of magnitude cheaper to maintain than the equivalent for heavier, motorized vehicles. It's not just the roadways: cars also need service stations, tire shops, parking lots, and garages. Gasoline cars additionally need fuel distributed to stations all over the place, plus emissions testing. All of these things take up lots of space because motor vehicles are big.
All that bicycles really need are a (much narrower) right of way and some cheap pavement. Maintenance can be done all at home, even in a small apartment. The apparent independence available to motor vehicle drivers is an illusion afforded by massive private and public investment.
> collect severance and unemployment --- I was not aware this is law in California.
Unemployment benefits in California are capped at $450/week, and you only get 26 weeks of it. It's helpful, but doesn't even cover housing costs for many individuals, let alone for families.
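For scale, the figures above work out as follows (a quick sketch of the stated caps; the $2,000/month rent is an illustrative assumption, not a California statistic):

```python
# Back-of-the-envelope check on the CA unemployment figures cited above.
weekly_cap = 450    # stated max benefit, $/week
max_weeks = 26      # stated benefit duration cap

max_total_benefit = weekly_cap * max_weeks     # total payout over ~6 months
monthly_benefit = weekly_cap * 52 / 12         # average monthly equivalent

illustrative_rent = 2000                       # hypothetical monthly housing cost
shortfall = illustrative_rent - monthly_benefit

print(max_total_benefit)   # 11700
print(monthly_benefit)     # 1950.0
print(shortfall)           # 50.0 -- benefits barely cover rent alone here
```

Even under this generous assumption, the full benefit roughly matches rent and leaves nothing for other expenses, which is the comment's point.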
I don't think there's a state law requiring severance. It's often offered by the employer if the terminated employee agrees to sign an NDA.
Both California and federal law require 60 days' notice for mass layoffs. The CA WARN Act has more protections for workers than the federal WARN Act, e.g., it also applies to part-time workers.
In California, it is illegal to condition severance paid in lieu of the required WARN notice on the terminated employee signing an NDA. A company attempting to enforce such an NDA would likely face judicial sanction in court (including paying the former employee's legal fees) and fines from the CA Labor Department as well.
The California WARN act effectively requires 2 months severance for large layoffs at large companies (or 2 months notice, but companies almost always prefer severance).
For what one anecdote is worth: through casual use I've found a handful of annoying UI bugs in Claude Code, and all of them were already reported on the bug tracker and either still open, or auto-closed without a real resolution.
It sounds like what makes the pipeline in the article effective is the second stage, which takes in the vulnerability reports produced by the first stage and confirms or rejects them. The article doesn't say what the rejection rate is there.
I don't think the spammers would think to write the second layer, they would most likely pipe the first layer (a more naive version of it too, probably) directly to the issue feed.
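The two-stage structure being discussed can be sketched like this (a toy illustration, not the article's actual pipeline: the string-matching "stages" stand in for what would really be model calls, and all function names are hypothetical):

```python
# Toy sketch of a two-stage triage pipeline: stage 1 over-reports candidate
# vulnerabilities, stage 2 independently re-checks each one and only
# forwards confirmed findings. The heuristics are deliberately crude
# stand-ins for model-driven analysis.

def stage_one_scan(code_chunks):
    """First pass: flag anything that merely looks suspicious (noisy)."""
    return [chunk for chunk in code_chunks if "cpy" in chunk]

def stage_two_confirm(candidate):
    """Second pass: stricter check; reject findings with a visible bound."""
    return "sizeof" not in candidate

def pipeline(code_chunks):
    candidates = stage_one_scan(code_chunks)
    confirmed = [c for c in candidates if stage_two_confirm(c)]
    rejected = len(candidates) - len(confirmed)
    return confirmed, rejected

chunks = [
    "strcpy(dst, src);",                   # unbounded copy: real finding
    "strncpy(dst, src, sizeof(dst));",     # bounded: stage-1 false positive
    "puts(msg);",                          # never flagged
]
confirmed, rejected = pipeline(chunks)
print(confirmed)   # ['strcpy(dst, src);']
print(rejected)    # 1
```

The point the comment makes is that a spammer piping stage 1 straight to a bug tracker ships both flagged lines; the second stage is exactly what filters out the false positive before a human sees it.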
* Carlini's team used new frontier models that have gotten materially better at finding vulnerabilities (talk to vulnerability researchers outside the frontier labs, they'll echo that). Stenberg was getting random slop from people using random models.
* Carlini's process is iterated exhaustively over the whole codebase; he's not starting with a repo and just saying "find me an awesome bug" and taking that and only that forward in the process.
* And then yes, Carlini is qualifying the first-pass findings with a second pass.
I guess the broader point I wanted to make is about the people responsible for the deluge of LLM-reported bugs and security vulnerabilities on countless open-source projects (not only on curl): they weren't considerate or thoughtful security researchers, they were spammers looking to raise their profile with fully automated, hands-off open source "contributions". I would expect that the spammers would continue to use whatever lowest common denominator tooling is available, and continue to cause these headaches for maintainers.
That doesn't mean frontier models and tooling built around them aren't genuinely useful to people doing serious security research: that does seem to be the case, and I'm glad for it.
> Now consider the poor open source developers who, for the last 18 months, have complained about a torrent of slop vulnerability reports. I’d had mixed sympathies, but the complaints were at least empirically correct. That could change real fast. The new models find real stuff.
The slop reports won't stop just because real ones are coming in. If the author is right, open source maintainers will still have to deal with the torrent of slop, on top of triaging and identifying the legitimate vulnerabilities. Obviously, this is just another role for AI models to fill.
Is that really true? My layman's understanding was that only ~10-20% of the calories in a typical American diet come from crops that need pollinators; grains (which also feed livestock), legumes, root vegetables, and leafy greens can mostly be grown without them, via self-pollination or wind pollination.
I mean, of those that do require insect pollination: the apple/pear family, almonds/cherries/plums, cucumbers/melons, and some others in seed production (carrots). There are only a few cases where non-honeybee pollinators are needed, like tomatoes in greenhouses (outdoors, wind is enough).