Hacker News

Isn't this typical of the "Fake it until you have product/market fit, then automate" ethos?


Pitch:

"x.ai – artificial intelligence that schedules meetings"

Reality:

"A group of low-paid humans (that may or may not have been background checked) will read your emails to help you schedule meetings. They will probably not use this information in any way other than intended."

Those feel like two different products that I would make fundamentally different decisions about.


> that may or may not have been background checked

That's funny. I'll pass the background check and still trade on the inside information. I'm short your house right now.


Maybe, in the VCs' minds, a "group of low-paid humans" is equivalent to "AI."


Shouldn't be much different. In either case you have no idea what the external person/AI will do with your data.


Yes, and I think that works in many areas and many products.

But when you have confidential information in the mix - especially stuff that might have SEC implications - it changes the game.

A few years ago, I did a contract with a company that had a system that deleted all email more than a year old. While the official answer was that it was "to save space and improve network performance," I suspect the unstated reason was to prevent fishing expeditions.

If your email is being cc'd outside the org and read by actual humans, that introduces some awkward problems... and may force people to admit the actual reason for the policy. ;)


Interesting, and probably very true.

Most email deletion policies are to protect companies from after-the-fact lawsuits. :-)



