I installed forgejo on my home server and never looked back. The only problem I face is when hosting an app on DigitalOcean App platform, or vercel etc. They only connect to GitHub.
> DigitalOcean App platform[…] only connect to GitHub
They also support deployments from GitLab (so long as you're using the gitlab.com-hosted instance and not a self-hosted GitLab instance). If you've deployed your own self-hosted forge, then you can connect DigitalOcean App Platform to it by using gitlab.com as a bridge—register an account on gitlab.com once and instruct your self-hosted forge to replicate copies to gitlab.com. You don't really need to actually use GitLab.
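If your self-hosted forge doesn't do this for you (Forgejo and Gitea have push mirroring in the repo settings, if I remember right), even a dumb scheduled job gets you the same bridge. A rough sketch, with the repo path and remote URL as placeholders:

    import subprocess

    SRC = "/srv/forgejo/repos/me/myapp.git"      # local bare repo (placeholder path)
    DEST = "git@gitlab.com:me/myapp-mirror.git"  # bridge repo that DigitalOcean can see

    def mirror() -> None:
        # --mirror pushes all refs (branches and tags) and prunes deleted ones,
        # so the gitlab.com copy stays an exact replica of the self-hosted repo.
        subprocess.run(["git", "-C", SRC, "push", "--mirror", DEST], check=True)

    if __name__ == "__main__":
        mirror()  # run from cron, a systemd timer, or a post-receive hook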
Having said that, considering that DigitalOcean is in the business of selling IaaS/PaaS, it's loony that they don't let you connect to, say, your own self-hosted Forgejo running on their infrastructure…
(Indeed, considering how many people would like to self-host their own forge but how few people want to actually set up and do admin for it, it's loony that DigitalOcean doesn't pick up, say, Forgejo and/or an alternative and offer a sharply discounted (e.g. $20/year) quasi-managed one-click deployment option with first-class support for connecting to their App Platform.)
All of the reasons to avoid GitHub are also reasons to avoid the Digital Ocean App platform and Vercel. I use Digital Ocean, but just the VPSs. Don't let yourself get vendor locked in with these middle men, retain control and shoot for the most universal level of the stack you can.
It's just a step. I will eventually move to coolify, just haven't had time to set it up. But the problem stands: coolify also doesn't connect to forgejo.
I have never developed on/for Apple platforms, so I have no clue. Apple makes setting up development so hard that I wonder what motivates developers to jump through all the hoops.
> Apple makes setting up development so hard, I wonder what motivates developers to jump through all the hoops.
Money to be made. And they have (had) a nice API for most development needs. The actual distribution is arduous though, mostly around the Review process.
The issue is every AI coding tool integrates as a "GitHub App" (OAuth, PATs, webhooks etc.) first, over other code forges. This load is coming through their 3rd party app integrations. I bet the web/git volume isn't getting smashed as much.
I had to begrudgingly use GitHub over my preference GitLab to use some 3rd party AI features.
The fix for GitHub is to charge or rate-limit some of these 3rd-party integrations and come up with an equitable arrangement.
I would not be surprised if AI commits are the culprit. There is no way any service would cope with a constant stream of unfettered commits by sleepless always-on agents. Ironically, this same strategy seems to be what GH/MS (and other big companies) are evangelizing - and therefore dying by their own hand (in a way).
AI commits are definitely causing the recent issues with their platform, as GitHub has seen an unprecedented amount of traffic since the introduction of OpenClaw. They are likely showing us a future problem that other sites have not experienced yet, as the tools have not matured in a way that lets people adopt them toward their eventual vision of managing someone's entire digital life.
That being said, Microsoft in general, in relation to GitHub, has at least historically caused more issues. Outages on GitHub have become so commonplace that I genuinely think people have simply gotten used to them. The recent round was bad enough that people felt strongly enough to make their own downtime status page: https://mrshu.github.io/github-statuses/. Whether or not you agree with how they gathered their data, there is a feeling in the community that Microsoft is not being transparent about these issues.
I left Github because of some very strange activity. I had a new folder named "feature" added to one of my repos. At the time I had failed to turn off the AI integrations, so I figured that was the problem.
There is no way I'm going to let a VCS put code into one of my repos without my asking for it or consenting to it. Full stop. I moved all my significant code to codeberg but kept the github account, so my username doesn't get squatted.
> Don't they inject malware/adware into your build artifacts?
Aw, c'mon! They did this for about 4 months, ending in 2015! Prior to that they had Windows installers which did the same, but that also only lasted a few months.
It's now 2026. Exactly what software did you host on sourceforge from 2011 to 2015? Because I hosted my GPL stuff there, and I moved away because I was affected, and yet I am not concerned that they will do that again.
Agree with Gitlab as an enterprise alternative. Beautifully boring and safe for complex teams and permissions. Also has good enough Terraform support, and a nice workflow for hosting Docker images.
People used GitHub's free infrastructure for over a decade without complaining. Now AI-generated spam and massive amounts of low-quality code are increasing costs everywhere, and suddenly GitHub is the bad guy for acting like a business. Criticizing centralization is fair, but pretending GitHub gave nothing to open source is just dishonest. The alternatives of today are probably going to be flooded by the low-quality AI-generated code of tomorrow...
There is a self-hosted version of OneDev; it's OSS.
But there is also an enterprise version (I have one) with very nice plugins that most other forges still don't have (like the web terminal to debug actions/CI/CD).
Last week, Robin released a very nice feature for vibe coders. AWESOME
It will be a long way before GitHub dies, but it is definitely sinking. Slop is killing it. I think Microslop, 'xcuse me, Microsoft, realises this too, but there is nothing they can do now that they committed to AI fully. I feel sad for the GitHub engineers, because they write pointless blog entries nobody believes anymore. Meanwhile existing services erode in quality. It's like in a submarine. You have one hole. You manage it. Well, more and more holes pop up the following days. We know where this is headed then ...
I have lost count of how many times something has gone down on GitHub since I started documenting it in this comment chain [0]. I also predicted 6 years ago [1] that going all in and centralizing everything onto GitHub was really not a good idea if you need stability, or need to push a critical fix when GitHub Actions doesn't work.
Now, are you going to finally self host or should we continue to expect another outage on GitHub?
This time, there is no CEO of GitHub to help us. It is Copilot and Tay.ai that are still struggling to maintain GitHub.
Everyone wants to pin this on the Microsoft acquisition or incompetence, but it seems pretty clear to me from the material GitHub has posted that AI has 10xed the amount of code being committed to GH, which has downstream effects everywhere - CI, Actions, code ingestion, everywhere. The author pins it on weird things like MS Copilot, which kind of feels like he’s listing off things he doesn’t like rather than causal factors. This is ignoring the 800 pound gorilla in the room.
The graph in TFA shows the downtime pattern starting in January 2020. OpenAI released GPT-3.5 in November 2022 (basically December), and LLM/agentic coding didn’t really kick off in the way you’re describing until 2024, but really in 2025.
How can that explain the terrible uptime for the ~4 years post acquisition before all the AI stuff you’re talking about started?
The graph is not accurate, because GitHub's historical downtime data is not accurate.
For example, here is a Hacker News story about GitHub being down on July 28th 2016: https://news.ycombinator.com/item?id=12178449
Here's GitHub's historical uptime graph (on which this chart is based), saying there was no recorded downtime that day, or in fact that entire month: https://www.githubstatus.com/uptime?page=40
GitHub launched a new status page Dec 2018[1]. It doesn't appear as if any history before Oct 2018 was ported over.
[1] https://github.blog/engineering/infrastructure/introducing-t... [2] https://web.archive.org/web/20181211191456/https://www.githu...
Looks like it's inaccurate by under-reporting, not over-reporting. So their downtime was likely worse!
We don't have enough data to confirm if it's over or under reporting. This sample size of 1 is enough to prove the data is not perfectly accurate, but it's not enough to prove a skew bias in the data either way.
That's fair. We don't know.
I am making an assumption that if Microsoft saw a lot of false positive outages they would fix that, but might drag their feet if there was an outage that didn't get properly recorded (assuming it's automatic to begin with; it might be that a human needs to remember to update it).
Oh please, show me a company that has ever over reported their downtime. That's silly.
Or things didn't change much at all except Microsoft forced them to be more honest in their reporting.
See, I can just as easily make up a story that explains the chart.
That graph has bugged me since it went viral. The methodology is horseshit: https://github.com/DaMrNelson/github-historical-uptime
Just dumping HARs from devtools from a status site that hallucinates 100% uptime when it has no data. For example, all GitHub services had 100% uptime in June 1996: https://www.githubstatus.com/uptime?page=200
The graph gives GitHub Actions 100% uptime before it launched to GA in November 2019. That factors into the average uptime for every month on the graph before that. It's fully horseshit.
The subjective experience I and others report is that GitHub feels like it has gotten significantly worse over the last few months. If you look at the month over month view of "Uptime history" in the cited link[1], it confirms this: it's been sub-90 (even sub-80 last month) essentially since the start of this year (i.e. when GitHub says that commit activity 10xed). Go back even a year and it's all in the high 9s.
I honestly can't explain the discrepancy between the graph in the article and the month over month stats on the same page, but the latter tracks with both my own subjective experience of GitHub and their own internal metrics.
[1]: https://mrshu.github.io/github-statuses/
I think it's just a case of brain drain, followed by reckless AI adoption, both of which drove the quality down.
The graph in the article is a lie, because GitHub's "historical data" is a lie.
https://www.githubstatus.com/uptime?page=3000
According to it, GitHub had 100% uptime from June to August 1996.
Yeah, I had the exact same response after reading the post. I mean, I'm all for jumping on the Microsoft hate train, but not if it misses the elephant in the room. Let's say the _perfect_ GitHub replacement spawns tomorrow. What's preventing the same infrastructure challenge of millions of lines of AI-generated code from destroying it?
I think centralized code hosting is pretty much going to get killed by AI. Just like it's doing to social media.
> I mean, I'm all for jumping on the Microsoft hate train, but not if it misses the elephant in the room.
That elephant didn’t even exist yet for the first few years of poor uptime shown in the graph in TFA… I don’t really disagree if we’re talking about the recent uptime issues, but how does that explain the years 2020-2023?
It doesn't. It just means if they were having problems before, they've now been made significantly worse by AI (on the free tier). All I'm saying is that the problem is bigger than, "Microsoft sucks."
Saas code hosting seems to be the problem here. If companies self hosted, they could deal with the scaling problems themselves.
> Saas code hosting seems to be the problem here. If companies self hosted, they could deal with the scaling problems themselves.
If all companies did this, there'd be no free tier on Github. You get the free tier because the SaaS customers are subsidising the free tier.
> I think centralized code hosting is pretty much going to get killed by AI. Just like it's doing to social media.
Private corporate codebases are a poor fit for GH because they don't benefit from public social graph effects. And the typical codebase isn't so large as to be technically challenging to deal with using OSS tools. I'd guess they make up a substantial share of revenue.
But once the reliability is called into question, self-hosted or smaller alternatives start to look good. Although there's some trickiness there if you want to be super cautious about making sure you can get to your code+infra in case of a vendor incident, especially if you're cloud based.
Because if you were building GitHub from scratch today you wouldn't build it the same way, and you would benefit from many of the technological advancements of the last (nearly) two decades.
I don't even like AI much, and this still seems to me like yet another instance of people blaming AI for normal mismanagement and failure.
>What's preventing the same infrastructure challenges of millions of lines of AI-generated code destroying it?
There's something called "rate limits" that engineers not working for GitHub have probably heard of; it's this crazy idea that you should limit the load on your infra in order to avoid downtime. GitHub is not the first free service to ever have to deal with bots.
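It's not exotic, either; a textbook token bucket is a handful of lines. A rough illustrative sketch (obviously not GitHub's actual implementation, just the standard technique):

    import time

    class TokenBucket:
        """Toy token-bucket limiter: allow bursts up to `capacity`,
        refill at `rate` tokens per second, reject when empty."""

        def __init__(self, rate: float, capacity: float):
            self.rate = rate
            self.capacity = capacity
            self.tokens = capacity
            self.last = time.monotonic()

        def allow(self, cost: float = 1.0) -> bool:
            now = time.monotonic()
            # Refill proportionally to elapsed time, capped at capacity.
            self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= cost:
                self.tokens -= cost
                return True
            return False  # caller would respond with HTTP 429 here

    # e.g. one bucket per client IP or per integration:
    bucket = TokenBucket(rate=5, capacity=20)  # ~5 req/s sustained, bursts of 20
    print(bucket.allow())                      # True until the bucket drains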
of all the awful things AI is doing and will be doing to society, killing centralized code hosting and social media will be its shiniest moments; both deserve to die painful deaths
Yes, the terrible sin of ... Hosting code where people can find it
I can’t remember the last time I looked for a project specifically on GitHub. I always come there via a link from another site.
hosting code where people can find it is the reason LLMs can write code, so we kind of screwed ourselves there…
How did people do it before github? Did everyone write everything with peek and poke?
> How did people do it before github? Did everyone write everything with peek and poke?
I've been sharing GPL projects since 1999. We didn't need peek and poke (Both of which I have also used further in history...), but we managed nevertheless.
Prior to github I shared software on sourceforge (and others). Prior to that I published stuff on Freshmeat.
Prior to that I downloaded games others shared (not open source) on Happy Puppy.
Prior to that I used usenet to find and download games, shareware, etc.
Prior to that I used ftp to (IIRC) ftp.sunsite.edu, ftp.nic.fi, and others.
Prior to that I got news of new releases using Gopher.
Finally, prior to that, I actually did use peek and poke to write software :-/
If github went away, and centralised repos went away, we'd still have something...
Private individuals would keep their code locally and share snapshots of it using whatever file sharing or hosting option was available.
Companies hosted their own CVS or, later, SVN servers.
Sourceforge
Why is centralized code hosting getting killed? I'm running an open source project, >99% of the code is AI generated, and I could not do this without GitHub. AI-generated source code needs a place where AIs and people can collaborate. I'm expecting GitHub to be hugely successful, but mostly for an AI audience.
Because it's centralized. Your project pays the price for every unrelated project that's getting overloaded.
I'm sure the underlying infra is not a single server, so this is mostly a period where they have to adapt to higher loads due to AI becoming actually usable in the last 8 months. It's basically proof how well AI works these days. Give it a few months so they can scale and it'll get better. Remember Twitter fail whale? Growth pains that can and will be solved.
> It's basically proof how well AI works these days. Give it a few months so they can scale and it'll get better. Remember Twitter fail whale? Growth pains that can and will be solved.
GitHub's problems can technically be solved, but that doesn't mean they can be solved in a way where the economics still work out.
If AI use is 10x-ing the amount of infrastructure costs for GitHub but not 10x-ing the amount of money Microsoft brings in from GitHub then there is certainly no guarantee they will bother to solve these issues adequately.
And I'd be shocked if the revenue side of things isn't lagging way behind the extra usage post-AI-era, both because a lot of the new use is probably on the GitHub free tier, and because even on the paid tier most usage (other than CI/Actions, AFAIK) are on a fixed subscription cost per user regardless of how much you are slamming their servers and it is unclear how much they can raise that price without current enterprise users fleeing.
Twitter had a clearer goal that aligned with the financials... support more people stably, show more ads. Things are less clear with GitHub's business model where the free tier is a loss leader for the paid tier but the expansion in usage is likely to balloon the free tier usage at a far faster rate than the paid tier usage.
Also (and this part is admittedly far more speculative) if AI labs are to be believed this is still early days for AI usage and we'll still see massive usage growth over the next few years. If GitHub is already having existential trouble at the beginning of the curve, what hope do they have to scale up with their current business model if AI usage actually does ramp up exponentially?
> And I'd be shocked if the revenue side of things isn't lagging way behind the extra usage post-AI-era, both because a lot of the new use is probably on the GitHub free tier, and because even on the paid tier most usage (other than CI/Actions, AFAIK) are on a fixed subscription cost per user regardless of how much you are slamming their servers and it is unclear how much they can raise that price without current enterprise users fleeing.
I'd guess most of the costs incurred to GitHub outside of Actions as part of the enterprise flat-rate tier are a fraction of what enterprises are paying for AI in order to incur those costs in the first place.
If a company has to pay $5 extra to GitHub for every $100 of extra AI spend due to that AI use creating disproportionate load, I've got a hard time imagining that GitHub will be the thing that gets fled from.
As far as the free tier goes, it seems like there should be a path to making prohibitively-cost-incurring usage models high-friction. (e.g. limit the free Actions minutes that you get to a certain number per month.) As long as the limits are roughly proportional to the actual costs incurred, there's not too much risk of people fleeing to a competing service, because the only way a competing service would be able to undercut the costs is by taking steep losses themselves, which isn't much of a business model in order to attract people's code repositories.
Yeah, the monetization bit is challenging. I'll ask my agent to click some of the ads GitHub serves it ;-)
But getting this infrastructure right is crucial for a future where most of the code is AI generated. GitHub puts Microsoft in a good position to experiment and learn how to optimize GitHub (Enterprise) for the future.
Nate B Jones on YouTube, https://youtu.be/FDkvRl1RlT0?si=AEYlUchm_oalMSzf, argues that Atlassian might be an interesting acquisition for Anthropic, as it provides most of the context that AI at enterprises needs. When executed well, GitHub Enterprise can offer Microsoft the same value: the context AI needs in the future.
> But getting this infrastructure right is crucial for a future where most of the code is AI generated.
That's not the problem. The revenue model they have is based on a certain amount of usage from the people who do not pay (you, for example), and a certain amount of usage from the people who do pay (enterprises).
If you 100x your usage, then they need 100x the infra, which means they need 100x the revenue.
At that sort of usage enterprises would rather self-host, and github would be left with only the free users, who are almost all like you now - hammering their servers but not paying for it.
If you self-host, for $5/m you can have your own VPS, but that doesn't really solve the problem as much as you'd think - those are all shared vCPUs, so you can't hammer them all the time either, because then the provider has to increase their infra as well so that fewer accounts share a single CPU.
Either way, if you want to generate code with AI at the speed that an agent can, you'll have to pay for it one way or another.
Also, one thing the numbers they published show is that the bits that are growing 10x YoY (and which they expect to get “worse”) are all the things that you get “unlimited” mileage off (even if you're a paying customer): repos, commits, PRs.
Things that have “usage based billing” (like Actions minutes) grow closer to 2x YoY.
When there's a dollar amount attached, people don't 10x, because it's not worth it. They splurge when it's cheap, and unlimited.
Well either Microsoft finds a way, or Anthropic will. I'm sure they'd love to host all these projects with all the source and context. Maybe they should buy GitLab, or Atlassian.
> Well either Microsoft finds a way, or Anthropic will.
Just what sort of nonsense is this? Neither of them are going to operate at a loss.
Why are you so convinced that they'd be happy to continue spending money on you and getting none in return?
> But getting this infrastructure right is crucial for a future where most of the code is AI generated.
If that is the future, then source code hosting will be the least of our worries. The entire industry will collapse because the software will stop working.
> AI-generated source code needs a place where AIs and people can collaborate. I'm expecting GitHub to be hugely successful, but mostly for an AI audience.
Are you paying them in proportion to the resources they expend on you?
There's this thing called "sustainability", and every company needs to have it. Github cannot continue on the current trajectory where every AI-bro wants to run an agent that generates 1000s of lines of code per hour and dozens of commits per hour... while providing that for free to a few dozen million users who won't pay.
That being said, Microsoft does have an opportunity here - AI-bros are willing to pay $200/m to burn tokens so Github should offer a plan for Copilot, say $400/m, that includes a repo.
If they don't ban AI agents on free tiers, they are going to be out of business soon.
GitHub hasn't changed in any positive way since the acquisition. A decade is a long time, and it shows.
GitHub Actions, Copilot. Oh, and that ugly AI search I'm unable to disable. Migration to Azure.
Yes, Microsoft managed to ruin the network effect. Outages? The straw that broke the camel's back.
3 months post Microsoft acquisition, GitHub expanded the free plan to include unlimited private repos.
The next year they removed the limitation on collaborators on private repos for free users.
In the last 4 years they’ve significantly improved their project management tools. I think a lot of teams can make do with GitHub Projects, they’re pretty decent.
Who knows if any of these are directly because of Microsoft or not. But there have naturally been material improvements to GitHub in the years after being bought by Microsoft.
> GitHub hasn't changed in any positive way since the acquisition.
It's more like any positive actions they have taken are being outright dismissed or forgotten. They removed several restrictions that GitHub had on private accounts, as well as GitHub Actions. Aside from the downtimes, the GitHub of today is fantastic compared to pre-acquisition GitHub.
I'm loving it. Running an open source project that's mostly AI generated, I don't have to think about version control, building and testing my app, running AI code review, hosting my docs website, the API and CLI that let Claude Code interact with everything, etc.
It provides huge value for anyone running an open source, AI-generated project.
How on earth is Actions a downside?
I think they meant all the security holes that have been popping up and that there is no interest from Microsoft to fix them.
I like to think that Microsoft is trying to run GitHub on Windows in their Azure cloud, and that every time GitHub is down it's because "someone updated the Windows servers GH runs on and had to reboot everything".
While I'm 99% sure it is not true, it makes me sleep better at night. And giggle a little when it goes down.
They definitely do something with Azure. Stuff related to GitHub Actions runs is hosted on something.windows.net, which I believe is Azure.
Even if this is true: Microsoft owns an entire cloud platform. They have enormous codebases of their own and they employ ~200k people. It’s just not an excuse, especially because they consciously made decisions such as making private repositories free.
A big part of the problem IS the Microsoft acquisition. They forced them to move to Azure, which is terrible.
Around 8 years ago I was working for a company that they also acquired, and they also forced us to move to Azure. Performance was terrible and our system just wasn’t working there as it should. A few years later our service was dead and all customers were moved to one of their Office products.
Totally agree. People are saying Microsoft this, Microsoft that with their Microsoft hate, but they ignore the fact that the AI trend is making GitHub worse, and that GitHub is trying to fix it.
If that's the case, we should also see the exact same pattern on Gitlab, Bitbucket, etc. Do we?
GitHub has been basically the default for free public git hosting for a long time. I was curious what bitbucket has and it looks like the free tier is so limited, I can't imagine a lot of people hosting vibe coded open source there.
10x of nothing is nothing.
What is easier to 10x? A tent or a flat?
10x the code? Easy solution. Throttle unpaid customers or put a quota.
Either way, paid customers should not be affected.
Don't you think Microsoft ought to have thought a bit more about scale? They're not just innocent bystanders here. GitHub Copilot is a first class citizen of GitHub and so of course a lot of private enterprises are going to be using the thing that's bundled with the other thing.
Pray tell where are they going to get memory from?
They're part of the circular AI finance economy, I'm sure they can figure it out.
Yes, I posted the same observation 3 months ago. https://news.ycombinator.com/item?id=46877226
"Yes, it (AI) will kill open source—at least as we know it. I’m convinced that GitHub and GitLab will eventually stop offering their services for free if the flood of low-quality, "vibe-coded" projects—complete with lengthy but shallow documentation—continues to grow at the current rate."
Why have they not simply asked the 800lb gorilla to solve this problem for them?
Gergely's newsletter claims it's more like 2.3x.
MS isn't solely to blame for the AI increase, but they are certainly part of the problem, including their integration of copilot into Github.
I’m with you here. Further: Even though I disagree with it, “GitHub down, Microsoft bad” is a defensible take, but we’ve seen it ad nauseam at this point.
This would make sense if GitHub themselves cited increased traffic or load shedding as their root cause, but most of their incidents from the last month seem to cite misconfigured infrastructure or operational mistakes.
The author mentions this and links an article that expands on it
The 800 pound gorilla in the room being a $3T company that also happens to be one of the largest cloud providers?
C'mon.
For upstarts, individuals, artists and idealists, Github was a means to reach and distribute code reliably to a large number of people on the planet. Is that true today? Will it ever?
97% of code coming in is AI slop. It's owned by an evil, rent seeking corp. Reliability is a flaming dumpster fire. And everything you commit there will be used to train more AI.
Github _is_ sinking.
If load has increased so much so rapidly then GitHub should be rate limiting as needed instead of basically letting people DoS them.
Github had lots of outages even before AI was introduced.
And why is it wrong? The logic is there:
- Microsoft committed to AI.
- AI slop is increasing the costs for maintaining/running GitHub.
- GitHub is sinking.
This is interconnected. I can think of numerous other ways how this would be handled. But Microsoft went the AI slop way already. There is no way back for them.
We want to thank you for your heroic service in our defense, sir. We really need people like you who know which side they're on.
Microsoft investors
I went to look at a repo on Github today. Clicked on the "xxx commits" link to see the commit history, and got told I've hit a secondary rate limit and need to wait.
I'm the only person on this network that would even look at Github, and my connection has a dedicated IP, no CGN.
The only real way to browse the site is to be logged in.
They will gradually authwall everything they can. Just look at linkedin.
Wouldn't this break Go and other build systems (npm?) that pull packages from github by default? Not that I endorse the practice, but will Microsoft really kick out such a big class of users?
It does break it; from experience, authorizing the pulls with a bot user fixes it.
In the case where the build happens from a GitHub Action, there are standard built-in credentials (workflow permissions).
https://docs.github.com/en/rest/using-the-rest-api/rate-limi...
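For what it's worth, the difference a credential makes is easy to check against the documented /rate_limit endpoint. A rough, stdlib-only sketch (the GITHUB_TOKEN env var stands in for whatever bot or PAT credential you use):

    import json
    import os
    import urllib.request

    def core_rate_limit(token: str = "") -> dict:
        """Query GitHub's /rate_limit endpoint and return the 'core' bucket."""
        req = urllib.request.Request(
            "https://api.github.com/rate_limit",
            headers={"Accept": "application/vnd.github+json"},
        )
        if token:
            req.add_header("Authorization", f"Bearer {token}")
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)["resources"]["core"]

    # Unauthenticated requests get a very small hourly budget per IP;
    # authenticated ones (a bot user's PAT or a workflow's GITHUB_TOKEN) get far more.
    print(core_rate_limit())
    print(core_rate_limit(os.environ.get("GITHUB_TOKEN", "")))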
Can't count the times a "nuget restore" in our CI fails with 401, just to succeed on a 2nd attempt a few seconds later. Seems like the IP range is somehow flagged, so there's definitely a downside to it.
sadly even that isn't an option for me. i spent half an hour yesterday trying to create a github account. i couldn't. my @proton.me got rejected. captchas take several painful minutes to complete. and even when I did manage to create an account (at least the page displayed a success message), it got disabled the instant I logged in for "TOS violation". i wish i was joking, but i literally cannot create a github account. a few years ago this would have seemed crazy. but here we are.
i'm stuck having to use google (another pain in the ass) for discovering codebases that contain specific snippets. but some repo contents (such as wikis) are not exposed at all to search engines
Exactly the same here. I get that regularly.
If you're on the desktop, refresh the page cache by using Ctrl + Shift + R
The page will load correctly
Yeah, this is just typical techbro gaslighting. There is no rate-limit and hasn't been for years (it's just default deny), but they refuse to change the wording to reflect that.
Would you care to cite your source that GitHub does not apply rate limits to unauthenticated requests?
The parent's experience, which mirrors my own: on a clean residential IP that hasn't sent any traffic, I hit that "rate-limit" on my first request to the commits list view.
So there is no rate-limit, it's a default deny for unauthenticated requests... which could be fine but at least update the error message to reflect that.
It's a rate limit of 0 RPS to that endpoint
“GitLab - enterprise grade, meaning it’s bloated and confusing but it’ll impress your boss. This could be the choice if you need multiple meetings to make the choice.“
lol!
We use gitlab at work, and I have to say that it is disappointing. The UI is so complicated to do the simplest things (e.g. to approve a MR you need to click a button that is actually a menu; the diffs are difficult to read; the 'To-do list' includes MRs that were already merged (how is that actionable?)) and it seems that they're struggling to turn around improvements quickly. The issue around the 'To-do list' including MRs that were already merged was raised years ago.
I also have to say that I'm surprised about the backlash against Bitbucket. I find the UI incredibly simple and clear, as do all of the new joiners. With Script Runner you can do some pretty amazing things. It handles huge repos well too.
> to approve a MR you need to click a button that is actually a menu
It's not really any better on Github. Why do I have to click on "Files changed" to approve a PR? Overall I would say it is on par with Github. Worse in some ways; better in others.
Try to find the issue boards... it's a mess. And expensive.
Funny but not really true. It's not really any more bloated or confusing than Github. It's not really "enterprise grade" software. If you want that look at Jira or anything Microsoft produces.
For $5 a month I can host a server and put a bunch of projects on there. Yeah, I don't have a million stars on my repos but it works for what I need and I can give access to whoever I want.
I'm not sure what to make of the graph.
On the one hand the acquisition of GitHub may have caused the availability to be worse.
On the other hand, the 100.00% availability before the acquisition looks suspicious, wondering if it's not just the status page being better updated.
(I'm aware of the recent availability problems with GitHub, but on the graph the problems start in 2020 and don't seem to worsen significantly)
It sort of feels like no major open source repository can possibly be left well enough alone. I remember how SourceForge went down the drain; it's a real pity to see the same happen with GH.
Side note: I read the URL as "dBus hell". We've all been there m8
No m80 it's a nushell based on decibel units dBu Shell
I often think about how I’d do it if I ran my own company.
I would really like to see what it would be like doing all code reviews over email. The repo would just be a simple vps-style server with git-only ssh access, there’d be a particular for-review/ branch namespace for code to be reviewed, and CI would just be a bot waiting for branches to show up and would mark refs as good or not by just annotating/tagging them. It could reply in the email thread with results too.
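Roughly what I mean for the CI part, as a toy sketch; the branch namespace, tag names, repo path, and `make test` are all placeholders, not anyone's actual setup:

```sh
#!/bin/sh
# Toy CI loop: watch a bare repo for branches under for-review/, build each
# new commit, and record the verdict by tagging it. All names are illustrative.
REPO=/srv/git/project.git

while true; do
  for ref in $(git --git-dir="$REPO" for-each-ref --format='%(refname:short)' 'refs/heads/for-review/*'); do
    sha=$(git --git-dir="$REPO" rev-parse "$ref")
    # Skip commits that already have a verdict tag.
    if git --git-dir="$REPO" tag -l "ci-*-$sha" | grep -q .; then
      continue
    fi
    workdir=$(mktemp -d)
    if git clone -q -b "$ref" "$REPO" "$workdir" && (cd "$workdir" && make test); then
      git --git-dir="$REPO" tag "ci-pass-$sha" "$sha"
    else
      git --git-dir="$REPO" tag "ci-fail-$sha" "$sha"
    fi
    rm -rf "$workdir"
    # A real bot would also reply in the review email thread here.
  done
  sleep 60
done
```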
The mailing list would have a web archive viewer, naturally. That’s how you could look at old reviews. There’s tons of existing solutions for this, and it’s just html.
Chat would be on IRC with bots to archive the channels. Easy as hell.
The whole thing (except maybe the CI runners which need beefier hardware) could be done on a very cheap server.
GitHub is waaay over engineered for what you need to run a software project. Look at the Linux kernel, they just use a simple mailing list, and it’s debatably the most successful software project of all time.
Issue/bug tracking is scarier though. Because I’d probably want to yak shave my own solution and get too involved with that and not even focus on what the company does. Maybe it could be a bug tracking software company?
Ideally I would want the code review to be versioned as well with easily accessible history. That is, I would like to see the exact lines which a comment pertains to and when they were changed and switch back and forth. While e-mail is probably good enough as a protocol to exchange this data, the e-mail client is not a good way to view it in my opinion. Maybe we need a decentralised code review system as well.
GitHub is struggling because AI-boosted coding increased the number of commits 14x in the past year, and the pace is still accelerating. The site is having trouble keeping up.
Github's COO confirms it here: https://x.com/kdaigle/status/2040164759836778878
Platform activity is surging. There were 1 billion commits in 2025. Now, it's 275 million per week, on pace for 14 billion this year if growth remains linear (spoiler: it won't.)
GitHub Actions has grown from 500M minutes/week in 2023 to 1B minutes/week in 2025, and now 2.1B minutes so far this week.
So we're pushing incredibly hard on more CPUs, scaling services, and strengthening GitHub’s core features.
I like the "written by human" banner at the bottom - that's a first for me and will be glad to see others adpot similar.
> Written by human - All opinions are my own and not those of a large language model. Everything I write is one hundred percent human. Because I care!
Living in Eastern Europe has its perks. I hardly ever notice the big GitHub outages because of the time zone difference.
I'm also happy with how generous their free hosting and actions are.
So, what's the actual, realistic alternative? The one that also supports open source projects? Ironically, GitLab is costlier than GitHub, and not without its own faults, but that's "maybe" the only other alternative here. Anything else?
I just installed Gitea. It seems decent.
It absolutely is.
The only concerns are if it were exposed to the public internet and scale. For personal stuff? It's spectacular.
Codeberg, Sourcehut, or self hosted Gitea.
Have you read the submission? Half of it is a list of alternatives.
I've been running a self-hosted Forgejo. Extremely responsive and I've been really happy with it.
How many PRs from other people have you received on your self-hosted Forgejo instance?
None and I don't want to.
If I did I would host it on a VPS and make it public.
> How many PRs from other people have you received on your self-hosted Forgejo instance?
They're free to email me a diff :-)
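For anyone who hasn't actually done the email-a-diff dance, the plain-git version is roughly this (the address and base branch are placeholders):

```sh
# Contributor: turn commits since origin/main into a mail-able patch series.
git format-patch origin/main -o outgoing/
# ...or send them directly if git send-email is configured:
# git send-email --to=maintainer@example.com origin/main

# Maintainer: apply the patches from the mailbox, keeping authorship intact.
git am outgoing/*.patch
```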
Jokes aside, the era of community-built software is coming to an end. There is no place in the world now for a repository of open source projects.
I installed Forgejo on my home server and never looked back. The only problem I face is when hosting an app on DigitalOcean App Platform, Vercel, etc. - they only connect to GitHub.
> DigitalOcean App platform[…] only connect to GitHub
They also support deployments from GitLab (so long as you're using the gitlab.com-hosted instance and not a self-hosted GitLab instance). If you've deployed your own self-hosted forge, then you can connect DigitalOcean App Platform to it by using gitlab.com as a bridge—register an account on gitlab.com once and instruct your self-hosted forge to replicate copies to gitlab.com. You don't really need to actually use GitLab.
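Forgejo and Gitea expose push mirroring in the repository settings; if you'd rather keep it in plain git (say, a cron job on the forge host), the equivalent is roughly this, with the remote name and repo path as placeholders:

```sh
# One-time setup: add the gitlab.com bridge as an extra remote (placeholder names).
git remote add gitlab-bridge git@gitlab.com:yourname/yourproject.git

# Run periodically (cron, or a post-receive hook on the forge) to keep
# the gitlab.com copy identical to the canonical repo:
git push --mirror gitlab-bridge
```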
Having said that, considering that DigitalOcean is in the business of selling IaaS/PaaS, it's loony that they don't let you connect to, say, your own self-hosted Forgejo running on their infrastructure…
(Indeed, considering how many people would like to self-host their own forge but how few people want to actually set up and do admin for it, it's loony that DigitalOcean doesn't pick up, say, Forgejo and/or an alternative and offer a sharply discounted (e.g. $20/year) quasi-managed one-click deployment option with first-class support for connecting to their App Platform.)
All of the reasons to avoid GitHub are also reasons to avoid the Digital Ocean App platform and Vercel. I use Digital Ocean, but just the VPSs. Don't let yourself get vendor locked in with these middle men, retain control and shoot for the most universal level of the stack you can.
It's just a step. I will eventually move to Coolify, I just haven't had time to set it up. But the problem stands: Coolify also doesn't connect to Forgejo.
I’m in a similar boat, I abandoned ship for Gitea years ago (prior to forgejo fork) and have no regrets.
For things that require GitHub I’ve been able to mirror repos there and get things working. Keeping code in sync is annoying though.
Similar situation with Apple's Xcode Cloud.
I've never developed on/for Apple platforms, so I have no clue. Apple makes setting up development so hard, I wonder what motivates developers to jump through all the hoops.
> Apple makes setting up development so hard, I wonder what motivates developers to jump through all the hoops.
Money to be made. And they have (had) nice APIs for most development needs. The actual distribution is arduous though, mostly around the Review process.
GitHub still doesn't support SHA-256 git repos (https://github.com/orgs/community/discussions/12490) even though its competitors (GitLab, Codeberg) have had that for ages now.
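For context, git itself has had this for a while; it's the hosting side that lags. Creating a SHA-256 repo locally is just (repo name is a placeholder):

```sh
# Create a repository using SHA-256 object names (supported since git 2.29,
# but still not accepted by github.com on push).
git init --object-format=sha256 my-sha256-repo
cd my-sha256-repo
git rev-parse --show-object-format   # prints "sha256"
```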
"If Linux can be maintained by sending patches to an email mailing list, “doesn’t work at scale” arguments are skill issues."
Agreed. Sick of the bloatware.
The issue is that every AI coding tool integrates as a "GitHub App" (OAuth, PATs, webhooks, etc.) first, before other code forges. This load is coming through their 3rd-party app integrations. I bet the web/git volume isn't getting hammered nearly as much.
I had to begrudgingly use GitHub over my preference GitLab to use some 3rd party AI features.
The solution for GitHub is to charge or rate-limit some of these 3rd parties integrations and come up with an equitable solution.
I would not be surprised if AI commits are the culprit. There is no way any service would cope with a constant stream of unfettered commits by sleepless always-on agents. Ironically, this same strategy seems to be what GH/MS (and other big companies) are evangelizing - and therefore dying by their own hand (in a way).
Yes this is confirmed. Github activity surged about 10x in the past year: https://x.com/kdaigle/status/2040164759836778878
Platform activity is surging. There were 1 billion commits in 2025. Now, it's 275 million per week, on pace for 14 billion this year if growth remains linear (spoiler: it won't.)
GitHub Actions has grown from 500M minutes/week in 2023 to 1B minutes/week in 2025, and now 2.1B minutes so far this week.
So we're pushing incredibly hard on more CPUs, scaling services, and strengthening GitHub’s core features.
AI commits are definitely causing the recent issues with their platform, as GitHub has seen an unprecedented amount of traffic since the introduction of OpenClaw. They are likely showing us a future problem that other sites haven't experienced yet, since the tools haven't matured to the point where people can adopt them toward that eventual vision of managing someone's entire digital life.
That being said, Microsoft's stewardship of GitHub has, at least historically, caused more issues. Outages on GitHub have become so commonplace that I genuinely think people have simply gotten used to them. The recent round was bad enough that people felt strongly enough to make their own downtime status page: https://mrshu.github.io/github-statuses/. Whether or not you agree with how they gathered their data, there is a feeling in the community that Microsoft is not being transparent about these issues.
I left Github because of some very strange activity. I had a new folder named "feature" added to one of my repos. At the time I had failed to turn off the AI integrations, so I figured that was the problem.
There is no way I'm going to let a VCS put code into one of my repos without my asking for it or consenting to it. Full stop. I moved all my significant code to Codeberg but kept the GitHub account, so my username doesn't get squatted.
One shift in perspective I had was realizing that github is not just a code forge, but a social network.
AFAIK Sourceforge is alive and kicking, and it has an "Import from Github" feature that has been available for years.
Don't they inject malware/adware into your build artifacts?
> Don't they inject malware/adware into your build artifacts?
Aw, c'mon! They did this for about 4 months, ending in 2015! Prior to that they had Windows installers which did the same, but that also only lasted a few months.
It's now 2026. Exactly what software did you host on SourceForge from 2011 to 2015? Because I hosted my GPL stuff there, I moved away because I was affected, and yet I am not concerned that they will do it again.
Agree with GitLab as an enterprise alternative. Beautifully boring and safe for complex teams and permissions. It also has good enough Terraform support, and a nice workflow for hosting Docker images.
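As a rough idea of the Docker image part, this is what a typical .gitlab-ci.yml build job ends up running, using GitLab's predefined CI variables (this assumes the project's container registry is enabled and the job runs on a Docker-capable runner):

```sh
# Log in to the project's container registry with job-scoped credentials,
# then build and push an image tagged with the commit's short SHA.
docker login -u "$CI_REGISTRY_USER" -p "$CI_JOB_TOKEN" "$CI_REGISTRY"
docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" .
docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"
```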
Who cares about bugs when their developer velocity has increased 5x!
I wasn't expecting the outages to be nearly the same even before the 2023 AI inflection point.
Anyone would buckle right now. Microsoft just sucks more at it.
I think they went too far with AI internally. Complete collapse in quality of internal engineering practices.
People used GitHub's free infrastructure for over a decade without complaining. Now AI-generated spam and massive amounts of low-quality code are driving up costs everywhere, and suddenly GitHub is the bad guy for acting like a business. Criticizing centralization is fair, but pretending GitHub gave nothing to open source is just dishonest. Today's alternatives are probably going to be flooded by the same low-quality AI-generated code tomorrow...
onedev onedev onedev
I still don't see this tool mentioned when the topic is forges. It is a fantastic tool. Seriously, you should really consider it!
I'm intrigued as to why you're getting downvotes here, I'm vaguely interested in onedev.
Edit: Though I do think they're mad for not offering a hosted version, especially right now while GitHub resentment is riding high.
There is a self-hosted version of OneDev; it's OSS. But there is also an enterprise version (I have one) with very nice plugins that most forges still don't have (like the web terminal for debugging actions/CI/CD).
Last week, Robin released a very nice feature for vibe coders. AWESOME.
Related:
Ghostty is leaving GitHub
https://news.ycombinator.com/item?id=47939579
Before GitHub
https://news.ycombinator.com/item?id=47940921
Days without GitHub incidents
https://news.ycombinator.com/item?id=48012022
GitHub Actions is the weakest link
https://news.ycombinator.com/item?id=47933257
GitHub Copilot is moving to usage-based billing
https://news.ycombinator.com/item?id=47923357
It will be a long while before GitHub dies, but it is definitely sinking. Slop is killing it. I think Microslop, 'xcuse me, Microsoft, realises this too, but there is nothing they can do now that they have committed fully to AI. I feel sad for the GitHub engineers, because they write pointless blog entries nobody believes anymore. Meanwhile, existing services erode in quality. It's like being in a submarine: you have one hole, you manage it. Then more and more holes pop up over the following days. We know where this is headed then ...
I have lost count of how many times something has gone down on GitHub since I started documenting it in this comment chain [0], and since predicting, 6 years ago [1], that going all in and centralizing everything on GitHub was really not a good idea if you need stability or need to push a critical fix while GitHub Actions isn't working.
Now, are you finally going to self-host, or should we continue to expect another outage on GitHub?
This time, there is no CEO of GitHub to help us. It is Copilot and Tay.ai that are still struggling to maintain GitHub.
[0] https://news.ycombinator.com/item?id=37395238
[1] https://news.ycombinator.com/item?id=22867803
Why do I keep seeing people blaming Tay.ai? That was a one-off Twitter chatbot that was shut down a decade ago.
Extinguish