This might be a less popular opinion on a site like HN, but I'm of the opinion that CEOs don't do a whole lot.
Maybe at small startups they are more involved, but the larger the company, the less I think CEOs or other C-suite types actually do.
While I also think ChatGPT is over-hyped and largely incorrect in what it says, I would answer your question with a "yes". ChatGPT is perfectly capable of writing/delivering speeches at MS Build or whatever.
> but I'm of the opinion that CEO's don't do a whole lot.

Rather, you just don't understand what CEOs of large public companies actually do. You're comparing them to earlier-stage CEOs, who can be more hands-on.
When running a public company of a quarter million people, the CEO's responsibility starts to look more like an asset manager's, responsible for a $4 trillion book.
And no - nobody wants that role replaced by an LLM.
Just invest in a reasonably diverse index fund (or a few). This is actually the optimal, drama-free way to go for most.
In the long run nobody outperforms consistently anyway. We all get hit by market events.
You may be giving CEOs much more credit than is due. And for all that they actually do, the outperformer is a rarity, not the norm.
An LLM could certainly fill this role, particularly when trained on all the MBA nonsense education in the world. It wouldn't be the end of the world and it wouldn't be substantially better or worse. But it would be cheaper.
I'm sorry, I must be missing something. Which companies make up the index funds if (most) CEOs liquidated their companies and invested in index funds? And how would they liquidate at anything close to their valuation without being priced based on their future expectations?
I don’t think they meant it literally. They were responding to the comment that their job was “like” managing a portfolio of investments. And in that respect the strategy of diversifying “like” with an index fund seemingly appealed to the commenter.
Jeff Bezos once talked about how his job depends on the quality of his decisions. He might not make very many decisions, but they are very high impact. Delegating this high-impact decision making to AI, which often makes random low-quality decisions, sounds like a bad idea.
The CEO also needs to sell those decisions to the organization to get buy-in on the vision and carry it out. How inspired will the organization be by the direction of an AI? No one in an organization will care about this stuff more than the CEO (if they are decent). An AI can’t care, so why would anyone else?
An AI may be making an ultimately random choice (prove the CEO isn't), but its actual options are weighted on statistical grounds from much wider sources of data than a human can knowingly handle.
I say knowingly because the sum total of information accumulated in just a month or two of human activity eclipses what even today's LLMs hold.
And CEO decisions are frequently flawed because the information coming up from below is heavily filtered.
Perhaps a crowd-sourced (employee-sourced) decision-making process would be best, with the wisdom of crowds.
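At its simplest, that wisdom-of-crowds idea is just aggregation of independent estimates. A minimal sketch, with invented numbers and no claim that any company works this way; the median is chosen so a few wild guesses don't dominate:

```python
from statistics import median

def crowd_estimate(estimates):
    """Aggregate independent employee estimates into one decision input.

    The median is robust to a few wildly wrong guesses, which is the
    usual argument for wisdom-of-crowds aggregation over a single
    executive's call.
    """
    if not estimates:
        raise ValueError("need at least one estimate")
    return median(estimates)

# e.g. four employees independently estimate a project's cost (made-up figures);
# the one outlier barely moves the result
consensus = crowd_estimate([90, 100, 110, 400])
```

The catch, of course, is that this only works when the estimates are actually independent, which the "strong filter from below" mentioned above tends to destroy.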
But his decision quality is why Amazon is full of fake and low-quality garbage these days, isn't it?
Maybe, but Bezos hasn't been CEO in over 4 years. Amazon seems to have doubled down on the race to the bottom under Jassy.
The main responsibility of the CEO of a large company is to set the company's culture and make top-level decisions about what the company does and does not choose to do. Depending on the company, they may bring relationships with executives, investors and experts inside and outside the company.
Everything else is delegated to lower level executives and staff.
> the larger the company, the less I think that CEOs or other C-Suite types actually do
What, specifically do you think they do?
Yes. AI is actually perfectly suited for the CEO role, as it's a big generalizer. The biggest generalist in a company is the CEO.
I have no idea why so many companies are focusing on replacing entry level work with AI; the real alpha is at the top of the org chart.
> I have no idea
I have an idea. Turkeys don't vote for Christmas.
What if it's positioned as a "thought partner" for CEOs, and eventually the board asks "hey, do we really need the human?"
It's still not good enough to manage a vending machine: https://www.anthropic.com/research/project-vend-1
Probably by design? A simple prompt and some tools?
Bro, I have to continuously prompt and plan even to get a reasonably sane piece of code written with Claude. I'm not surprised this failed. It's like they wanted it to.
I doubt it was by design. It's still an autocomplete machine in the end. Anthropic tries to do agents better than anyone, and part of this was to see where it would fail.
There are plenty of blind spots in a real-world situation. It was trained to answer questions and score well on exams, not to negotiate or evaluate the value of a tungsten cube, and so on. There weren't enough scams in the training set. And given how rare training data from a CEO's perspective is, I expect it would perform even worse there.
It's not just vending machines, it does poorly in games with full wikis too.
The role of CEOs is to receive praise for every achievement, but also blame when things go wrong. It's a high-risk, high-reward position that most people can't handle.
Since it's not possible to blame an AI, no, ChatGPT can't be used to replace the CEO of Microsoft.
This. The problem lies with blame attribution. We humans are accustomed to thinking about everything from that perspective. As Yoshua Bengio points out on the Machine Learning Street Talk podcast: "social system hasn't evolved to keep up with the technology".
To even ask this question shows no understanding of what a CEO of a large corporation does.
1) Investor relations: Yes, an AI could answer questions about the financials on the quarterly earnings call. With a lot of careful handling you might even get it to do this in a way that wouldn't hallucinate and lead to shareholder lawsuits and SEC enforcement action. But would it go out for lunch with a bunch of fund managers and convince them to keep their investment when you've had a bad quarter? No. Would it take a 3-hour meeting with investment bankers to talk through how to recapitalize or refinance debt? No. If you're talking about a startup CEO, is it going to convince a16z, SVF, etc. to invest? Nope. If you think it will, you've never done any of those things. It's like trying to capture lightning in a bottle, not a process you can automate. For example, how did Adam Neumann convince Steve Cohen to invest in WeWork? (I have this on the authority of a friend who was one of about 10 people there.) He showed up late and drunk to a small dinner at Steve Cohen's apartment and got everyone there doing tequila shots. This is not something an AI is capable of.
2) Corporate strategy: Yes, you can get it to generate meaningless gibberish in PowerPoint, so you might think corporate strategy would be covered, but consider just a few of the big calls Satya Nadella has made (off the top of my head) in the last couple of years: 1) betting big on OpenAI, 2) intervening to save Sam Altman's ass after he was fired, 3) basically giving up on the "console wars", ceding the hardware victory to Sony, shutting down a bunch of game studios while telling the world how tough the climate is even though you've just had the most profitable years in your history, betting big on Game Pass, etc. Gonna do that? No it won't.
3) Managing the top executive team. Is it going to do this? Of course not.
Not saying an AI couldn’t possibly do a much better job than current CEOs in some hypothetical world where CEOs do different things from what they do now, but in our world, there is literally none of the black magic bs that CEOs pull to get corporations to be worth obscene bucks that an AI could do.
> 3) basically giving up on the "console wars", ceding the hardware victory to Sony, shutting down a bunch of game studios while telling the world how tough the climate is even though you've just had the most profitable years in your history, betting big on Game Pass, etc.
I’ve only seen this referred to as a strategic failure. You seem to be declaring otherwise. What’s the upside for Microsoft?
I'm not saying any of those decisions was necessarily good. They're just the kinds of things CEOs do. I personally think Microsoft dropped the ball spectacularly on this: from the perspective of the consumer, there is objectively less choice and the outcomes are worse, and gamers and game devs have suffered tremendously as a result. However, Microsoft achieved their objective, which was to be able to say that games revenue gross margin went up.
1) It would use tool calls to dispatch other people to do those things.
2) Trained on MBA knowledge, it would know which tools to call and when.
3) This is the only valid point and a weak one at best considering AI girlfriends and people developing AI psychosis due to close relationships with their AI. This part is coming.
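Points 1) and 2) above amount to an agent with tool calls. A minimal sketch of what that dispatch layer might look like, with invented tool names and a hard-coded router standing in for the model's tool choice (a real agent would let an LLM pick from a tool schema):

```python
# Hypothetical sketch of an "AI CEO" dispatching work via tool calls.
# All names here are made up for illustration; nothing below reflects
# an actual product or API.

def brief_investor_relations(arg: str) -> str:
    # stand-in for delegating the earnings call to the IR team
    return f"IR team briefed for {arg} earnings call"

def dispatch_deal_team(arg: str) -> str:
    # stand-in for sending humans to handle a negotiation
    return f"Deal team dispatched to negotiate with {arg}"

# tool registry the model would choose from
TOOLS = {
    "earnings_call": brief_investor_relations,
    "negotiation": dispatch_deal_team,
}

def ceo_agent(task: dict) -> str:
    """Route a task to a delegated tool; unknown tasks escalate to humans."""
    tool = TOOLS.get(task["kind"])
    if tool is None:
        return "escalate to the board"  # no tool fits; a human decides
    return tool(task["arg"])
```

Whether an LLM would reliably pick the right "tool" in high-stakes, ambiguous situations (a drunk dinner with Steve Cohen has no tool schema) is exactly the point under dispute.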
The top executive team could BS the AI CEO using prompt injection in their reports, so no.
I remember reading an article that concluded "if someone were to be replaced, it would be the shot callers"
So yes, in theory.
My first thought was "but an AI is not responsible", but how responsible is a CEO? They seem to get away with a lot. I could be wrong though.
No because it would be too well informed to make biased (company focused/favouring) decisions. (Joking)
But it does leave me wondering how would we know if it hasn't already happened in some company?
Sure, they could. Would it be effective? Probably not.
"I can replace desk meat like you with an Eliza helix and a white noise generator." (Quoted from memory from Howard Tayler's wonderful Schlock Mercenary web comic.)
You have been marketed to. Congrats.
Practically, it's far easier to simply ask this at your next all-hands when they take questions about AI. To make it meta: use ChatGPT to phrase it as a pointlessly long-winded question so HR/marketing can't filter it out until it's too late ;)
Clippy for CEO!
No