Google, which has an ambitious plan to address climate change with cleaner operations, came nowhere close to its goals last year, according to the company’s annual Environmental Report, released Tuesday.
Imagine paying an unholy amount of money to actively fuck everyone over, including yourself. Especially yourself, like, holy shit…
What is it with big companies and desperately wanting to go under? It’s like they’re taking “too big to fail” as a challenge, the only rule being that you can’t just shut down or sell.
Quarterly finances kinda answer that. Jumping onto the AI bubble brings in investors, makes your company highly valued, and gives managers fat quarterly and annual bonuses. It doesn’t matter if the company or the whole industry goes under in the future, because those bonuses have already been collected.
This has always been true, but it seems to have sped up over the past few years. There’s so little concern for longevity or making a quality product. Yeah, it’s a flaw in capitalism, but I’m wondering why it took so long to surface.
I can kinda answer that.

There’s a term used in tech called “empire building”: managers and execs promote their little slice of the company to preserve and grow their own careers. At a certain level, it means someone who leads a division like AI has enough influence to say “let’s put AI into search”.
The sad thing about tech is that at a certain level, an executive outranks the customer in dictating what is best for a product. Data and stats can tell you whatever story you want to promote, so at Google HQ they’re probably worried about the negative press, but they’re looking at “successful” numbers of questions answered by AI and patting themselves on the back. Both the search and AI execs look good because they delivered something, and they’ll likely get a nice bump in rep from their bosses.
The thing with empires is that they fall. Not overnight, and maybe not with the same emperor, but they do fall.
“Data and stats can tell you whatever story you want to promote”
Seen this so many times at my work. There’s some bone-headed decision and the people in charge are like “look guys, we ran the numbers”. But the methodology is messed up somehow, or they ignored or misinterpreted the numbers while pretending they were following the data, or it doesn’t bear out in the real world, etc.
When data and common sense disagree, you’d better be damn sure of the data.
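A hypothetical illustration of how “we ran the numbers” can mislead (all figures below are invented, not from any real report): aggregate a metric one way and a feature looks like a win; segment the same data and the story flips. This is the classic Simpson’s paradox.

```python
# Invented example: an "AI answers" feature looks great in aggregate,
# but the picture flips once you segment by question difficulty
# (Simpson's paradox). All numbers are made up for illustration.

questions = {
    # segment: (ai_answered, ai_correct, human_answered, human_correct)
    "easy": (900, 810, 100, 95),   # AI: 90% correct, humans: 95%
    "hard": (100, 30, 900, 360),   # AI: 30% correct, humans: 40%
}

ai_total = sum(v[0] for v in questions.values())
ai_correct = sum(v[1] for v in questions.values())
human_total = sum(v[2] for v in questions.values())
human_correct = sum(v[3] for v in questions.values())

# Aggregate accuracy: AI 840/1000 = 84%, humans 455/1000 = 45.5%.
# Slide-ready "win" -- yet the AI is worse in *every* segment; it just
# answered mostly easy questions while humans got stuck with the hard ones.
print(f"AI aggregate:    {ai_correct / ai_total:.1%}")    # 84.0%
print(f"Human aggregate: {human_correct / human_total:.1%}")  # 45.5%
for name, (a_n, a_c, h_n, h_c) in questions.items():
    print(f"{name}: AI {a_c / a_n:.0%} vs human {h_c / h_n:.0%}")
```

The aggregate number is technically true, which is exactly why it’s dangerous: nobody has to lie, they just have to not segment.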
It’s called “data-chauffeured”: instead of following the data, you tell it where it needs to take you.
Driven-data design, not data-driven.