
You have the wrong impression of Google Translate. Translate launched in 2006 and it sucked: maybe 60% accuracy at best, with janky, incoherent sentences. Then in 2016 it revolutionized the entire research field of language translation. Seriously. Accuracy jumped to around 94% after Jeff Dean joined the project and spurred the team to train against all languages at once. Translate is now better than most high school students after several years of study. I consider Translate one of the top-5 accomplishments Google has ever had:

https://translatepress.com/is-google-translate-correct/

Another breakthrough is image recognition. Did you know you can type "motorcycles" into an image search and many of the resulting pictures have ZERO text associated with them? They really are mapreducing those pictures, running image recognition trained against a 17M open-source image library to assess the "motorcycle-ness" of each one. They started talking about image recognition in the early 2000s, but it really began happening in the late 2010s, after they hired Fei-Fei Li, the mother of image recognition ...

https://profiles.stanford.edu/fei-fei-li



Translate did not suck in 2006; it was a big leap over the competitors of the time. It was really the first time a company had launched productized statistical machine translation at all. Before that, there was AltaVista's Babel Fish, which was rules-based.

Going neural definitely made a big difference, but internally the infrastructure referred to Translate as "statml" for years (maybe it still does), because just using statistics and having any training at all was a big deal back then.



