The quant fund use case is the most interesting angle here. WARN filings have the rare property of being legally mandated with specific timing (60-day advance notice), which makes the signal horizon predictable in a way that most alternative data is not.
The big caveat: compliance is uneven. Companies under 100 employees are exempt, and there is a documented pattern of employers paying WARN Act penalties retroactively rather than filing -- especially in fast-moving situations where 60 days advance notice is operationally inconvenient. So the signal has systematic gaps at exactly the moments of highest market interest.
Have you looked at coverage rates vs. announced layoffs (e.g., correlation with Challenger Gray reports or JOLTS)? That gap number is basically the signal noise floor for any quant strategy built on this data.
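For illustration, the coverage ratio could be computed like this. The monthly totals below are made up; real Challenger or JOLTS figures would go in their place:

```python
# Sketch: estimate WARN coverage against an external benchmark of
# announced layoffs. All numbers are hypothetical placeholders.

warn_monthly = {"2025-12": 41_000, "2026-01": 38_500, "2026-02": 27_100}
benchmark_monthly = {"2025-12": 92_000, "2026-01": 110_000, "2026-02": 76_000}

def coverage_ratio(warn, benchmark):
    """Fraction of benchmark-announced layoffs visible in WARN filings."""
    months = warn.keys() & benchmark.keys()  # only compare overlapping months
    w = sum(warn[m] for m in months)
    b = sum(benchmark[m] for m in months)
    return w / b if b else float("nan")

print(f"WARN coverage: {coverage_ratio(warn_monthly, benchmark_monthly):.1%}")
```

One minus that ratio is roughly the blind spot any strategy built on this data has to live with.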
These are great insights @vicchenai. I made this on a whim. I asked Gemini for suggestions, it presented a few, and I picked this one because there didn't seem to be good players out there. There are layoffdata and warntracker and the like, but those were built with hand coding :) it probably took them quite a while to get done. This is a challenging thing - 50 states with different data formats that have to be polled daily. Data cleaning is another challenge. Historical data is yet another. Plus all the blogs, reports, data presentation, and API endpoints.
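To sketch why the 50-state problem is painful: each state needs its own parser mapped onto one common record shape. The column names below are invented for illustration, not the site's actual schema:

```python
# Illustrative per-state normalization: one parser per state feeds a
# shared record type. Field names are hypothetical examples.
from dataclasses import dataclass

@dataclass
class WarnRecord:
    state: str
    company: str
    notice_date: str  # ISO 8601
    employees_affected: int

def parse_california(row: dict) -> WarnRecord:
    # CA hypothetically publishes these column headers
    return WarnRecord("CA", row["Company Name"], row["Notice Date"],
                      int(row["No. of Employees"]))

def parse_new_jersey(row: dict) -> WarnRecord:
    # NJ hypothetically uses entirely different ones
    return WarnRecord("NJ", row["COMPANY"], row["EFFECTIVE DATE"],
                      int(row["WORKFORCE AFFECTED"]))

PARSERS = {"CA": parse_california, "NJ": parse_new_jersey}

def normalize(state: str, rows: list[dict]) -> list[WarnRecord]:
    return [PARSERS[state](r) for r in rows]
```

Multiply that by 50 states, each with its own quirks, and the appeal of automating the build becomes clear.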
If all of that can be done in a weekend without writing a single line manually, the sky is the limit, no? Anybody with an idea can make these things happen. I am not expecting it to make any money - it was a learning project - but I do see some value for certain people.
Insurance brokers would benefit if they are first to know, so they could target the laid-off people. Recruiters the same, and definitely hedge funds, short sellers, and quants. Gemini tells me data is the new oil! I am convinced.
> and there is a documented pattern of employers paying WARN Act penalties retroactively rather than filing -- especially in fast-moving situations where 60 days advance notice is operationally inconvenient.
Oh, I have a solution for that. Don't just go after WARN Act penalties. Go after offenders with the hammer called the SEC and market-manipulation regulations. That kind of stuff really hurts.
This dataset looks interesting but the site doesn’t instill a lot of confidence in data integrity.
On the Charts page the selected time range is 12/01/2025 to 02/28/2026 and shows 106,603 employees affected. But the horizontal bar chart with state level data shows numbers in millions. For example, CA has more than 2 million and IL has more than 1.7 million employees affected. Then the layoff map at the bottom shows only layoffs in Texas.
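A basic invariant check would surface this kind of mismatch: for any date range, the per-state figures should sum to the headline total. A minimal sketch (numbers illustrative):

```python
# Consistency check: state-level breakdowns must add up to the headline
# total for the same date range. A dashboard failing this should not ship.

def totals_consistent(headline_total: int, state_totals: dict,
                      tolerance: int = 0) -> bool:
    return abs(sum(state_totals.values()) - headline_total) <= tolerance

# A consistent breakdown of the ~106k headline figure
print(totals_consistent(106_603, {"CA": 60_000, "IL": 30_000, "TX": 16_603}))  # True

# The reported behavior: state bars in the millions vs. a ~100k headline
print(totals_consistent(106_603, {"CA": 2_000_000, "IL": 1_700_000}))  # False
```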
Agreed, this doesn't pass my smell test. A November 2026 report describes layoffs in NJ in the past tense: https://warnfirehose.com/blog/2026/11/week-1
The layoffs in that report are not listed in NJ's own WARN notices: https://www.nj.gov/labor/assets/PDFs/WARN/2026_WARN_Notice_A...
Thank you, I will have these addressed. There are many data-quality issues and I am still working through them.
Thank you @malshe. I will review and get it addressed. The site is not perfect at all, and I will fix these issues.
You can think about LLM-generated UIs/apps the same way you think about LLM-generated responses. It's a bunch of garbage, but if you know what you're looking for, you might find something useful.
This doesn't seem to work at all for stats-related apps/sites though, since you can't judge the accuracy of what's being presented. If the site claims it'll "take you to space," you don't take that literally, you just treat it as another AI artifact. But with numbers, you have no way to tell what's accurate and what's just made up.
That's such a great point. What if I were (somehow) able to guarantee that the data presented here exactly matches the state notices? Would that be helpful?
> It's a bunch of garbage, but if you know what you're looking for, you might find something useful.
If you mean an LLM can be a brainstorming and hypothesis machine, and you have prior expertise to evaluate the proposals, then I can see that value. (Maybe that's what you meant, of course.)
But prior expertise is absolutely necessary. Otherwise we make ourselves victims of mis/disinformation. People say the Internet is a cesspool of mis/disinfo, yet nobody thinks it could affect them - we're all too smart, of course (no really, I'm the exception!). [0]
> This doesn't seem to work at all for stats-related apps/sites though, since you can't judge the accuracy of what's being presented.
I don't see the difference. If it's obvious nonsense, in numbers or in text, it's detectable. Everything else, see above.
[0] Research shows that thinking is a big reason people get fooled, and better educated people are easier to fool.
Great perspective @mmooss. There is a lot to grasp, but I am paying attention. Do you think there is value in this data if it is presented accurately, in a timely manner, and with good analytics?
Interesting, though a lot of the UI seems broken. For my state I see some notice dates in the future; it's not explained whether this is when the filing takes effect or an incorrect filing date, as the column is just "Notice Date".
Some of the entries pull up a page that says "Failed to load company data: No company name provided in URL" from the state-specific view (e.g., any link on https://warnfirehose.com/data/layoffs/california ). Has a vibe-coded feel to it.
I saw a lot of "Purchase dataset for city details" in places which was annoying. Wondering how much processing is being done on the base dataset to justify the pricing. Could you explain a bit on the normalization/cleaning process?
Definitely vibe coded. It follows the same generic Claude UI patterns for a data-oriented website. Not necessarily a bad thing per se if it's still curated and tweaked with human taste. And of course validated to work :)
Hi @skadamat - I am not very experienced with Claude Code and don't know what it did previously, but the new Opus 4.6 is great. I think I need to work on claude.md; that really seems to be the soul of Claude. I've spent quite a lot on API costs, the Claude plan, and my own time, but I feel that can easily be cut to a quarter with a solid claude.md plan.
@cobertos The UI issues should now be addressed. However, I did do extensive testing myself and everything seemed to work for me. I am sold that vibe coding is a thing, seriously. On your question about normalization/cleaning: I have gone through multiple iterations - the data is definitely not clean at all. However, Claude wrote a cleaning function (instead of cleaning records on demand, there is one function that gets updated with every new pattern so it can be reused).
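For what a pattern-accumulating cleaner might look like (the rules below are illustrative examples, not the site's actual list):

```python
# Sketch of a reusable cleaning function: a growing table of
# normalization rules applied in order to raw company names.
# Each newly discovered messy pattern becomes one more rule.
import re

CLEANING_RULES = [
    (re.compile(r"\s*\(.*\)$"), ""),                       # drop trailing parentheticals
    (re.compile(r",?\s*\b(inc|llc|corp)\b\.?$", re.I), ""), # strip legal suffixes
    (re.compile(r"\s+"), " "),                             # collapse whitespace
]

def clean_company_name(raw: str) -> str:
    name = raw.strip()
    for pattern, replacement in CLEANING_RULES:
        name = pattern.sub(replacement, name)
    return name.strip()

print(clean_company_name("Acme   Widgets, Inc. (HQ)"))  # -> Acme Widgets
```

The advantage over ad-hoc fixes is that every rule is applied uniformly across the historical backfill as well as new daily pulls.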
On the pricing, to be honest I have zero hope of making money. It's just there because I wanted to integrate the site with Stripe for payments. At the same time, if you look at competitor sites, they seem crap lol - plus look at their pricing. The site does have some operating costs, and I will have to recover them if I can, if I want it to be self-sustaining - but only if it is of value to someone. I am trying to make it valuable. Please share any ideas you have; I would appreciate it :)
Thank you for checking the work and for all the comments, which are very valuable. The site does quite a lot, to be honest, so please take some time to explore more :)
- It has over 15k individual landing pages for search engines - a dedicated page for each city, state, company, and county. These pages are very rich in how they look haha
- Example: New Jersey page -> https://warnfirehose.com/data/layoffs/new-jersey
- Data is exportable to multiple formats including json-d and Parquet - I had never heard of Parquet before.
- The site has MCP (Model Context Protocol) built in!
- It accepts payment via multiple methods including PayPal, Apple Pay, Amazon Pay, card, etc. Refunds and everything built in.
- It also has an invisible admin dashboard where I can see everything (all metrics, signups, payments, etc.)
- The reports are crazy - they are genuinely better than someone writing at the Wall Street Journal, seriously. You would have to check them out to see.
- The API endpoints.
I am convinced the software engineer role is going to go away by the end of this year. We are turning into builders, not coders. All SaaS apps are going to take a big hit because anyone can build them better for a fraction of the cost - just think about it.
- It's all about ideas, and anyone can do it.
Great site, thank you. Just curious: I looked up my company (more than 40k employees worldwide, across many US states) and I am not seeing the layoffs that colleagues have experienced. This is probably expected, as I'm likely missing some criteria. Do all layoffs have to have a WARN notice, or are there mechanisms/criteria that let companies lay people off without filing these notices?
I agree with both comments. There could be many reasons. The fact that there are only 12M records over the last 30 years or so across all 50 states means it definitely doesn't represent all layoffs - maybe 10%.
Many companies pay severance in lieu of giving notice.
If the layoff is small enough, the requirement isn't triggered.
First of all, nice idea and good execution!
The problem is WARN only fires when companies actually fire people. AI displacement doesn't really work that way, at least not yet. Customer support new hires dropped 65% over eight quarters (https://www.saastr.com/customer-support-hiring-has-fallen-65...). Those roles just stopped being posted.
How do I filter by Company Name?
You should just be able to go to Data -> All Records and type the company name.
By the way, if the company is public, it brings up the stock ticker, an SEC link, and all layoff-related news, plus all historical WARN notices from that company.
Anyone remember this gem from the "dot com" crash? http://fuckedcompany.com/
lol :) I love the name! Maybe the name alone is worth a thousand haha
Now we need a usable website that uses this API to show the latest layoffs in, for example, CA.