Yes. Does OpenSSL ring any bells? How about PHP and its ecosystem? How about Sendmail? Or GNOME?
There have been OSS projects 'push'ed into widespread use by the popularity of a few strong personalities and herd mentality, despite questionable design. As someone with a long history with Perl, I'd lump the entire Perl language and ecosystem into that category, along with CGI and FastCGI, which existed far longer than they should have before someone put their foot down and made a better open-source application interface.
Developer adoption is by no means a process in which 'he who has the most money wins'. There are plenty of examples of big failures to thrust commercial designs into the public space, and plenty of examples of shitty OSS designs getting uptake; I'm sure if you asked Linus, he could go on a long rant.
As I mentioned with Carmack's mini-GL/MCD model, new forks often get traction if they solve very real customer problems. In Carmack's case, he was developing on NeXT hardware for PCs, and PC consumer cards didn't run OpenGL; they typically ran either DirectX or a proprietary driver API. If you are in the video game market, multiplatform portability is a given, so forking OpenGL down to just the high-performance subset games needed, one that could run on consumer hardware, was obvious in retrospect. It solved a real problem for game developers, who quickly latched onto it, and for second- and third-tier 3D accelerator vendors, who couldn't get devs to use their proprietary APIs but could benefit from the easy-to-implement mini-GL.
I have no clue why they want to fork or subset libc, but before dismissing it out of hand, consider that maybe there's a genuinely compelling reason for it; and if there isn't, it'll have about as much shelf life as the proprietary Unix forks did.
But using ad hominem critique in the absence of a concrete proposal seems out of place to me.
It's not really ad hominem; Android itself is a massive pile of hacks (the zygote process, Project Treble, Binder, etc.) that, while it worked for Google in fixing their issues, is far from ideal. Also, what API has replaced FastCGI and CGI in general? Hopefully not in-process modules.
What's replaced it is running services or application servers that contain an embedded HTTP handler, and putting a load balancer in front of it.
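To make that model concrete, here's a minimal sketch in Python: the application process embeds its own HTTP handler (the stdlib `http.server` here, purely as a stand-in for whatever framework a real service would use) and listens on a port of its own; in production a load balancer or reverse proxy would sit in front of it, instead of a web server spawning CGI processes per request.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import threading
import urllib.request

class Handler(BaseHTTPRequestHandler):
    """The application's embedded HTTP handler."""

    def do_GET(self):
        body = b"hello from the app process\n"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, format, *args):
        pass  # silence per-request logging for this demo

# Port 0 asks the OS for any free port; a real service would use a
# fixed port registered with the load balancer.
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Simulate a client (in production, the load balancer) hitting the service.
resp = urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/")
body_text = resp.read().decode()
print(resp.status)
print(body_text)
server.shutdown()
```

The key difference from CGI is lifecycle: the process is long-lived and owns its own socket, so there is no per-request fork/exec and no in-process coupling to the web server.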
Also, rather than guilt by association (Android et al, which was done under different circumstances compared with the hundreds of other libraries Google has released), why not actually hear out the concrete proposal and then criticize it?
Saying no one in category X can ever build anything good is just illogical.
Have you used OpenSSL? It sucks. The API returns strange and inconsistent error codes, error reporting is terrible, and strange non-orthogonalities abound. It is a lesson in how not to design an API. The only reason it got traction is that it was first and no one bothered to do better.