Also, don't forget to set up an RSS or Atom feed for your website. Contrary to the recurring claim that RSS is dead, most of the traffic to my website still comes from RSS feeds, even in 2̶0̶2̶5̶ 2026! In fact, one of my silly little games became moderately popular because someone found it in my RSS feed and shared it on HN. [1]
From the referer (sic) data in my web server logs (which is not completely reliable but still offers some insight), the three largest sources of traffic to my website are:
1. RSS feeds - People using RSS aggregator services as well as local RSS reader tools.
2. Newsletters - I was surprised to discover just how many tech newsletters there are on the Web and how active their user bases are. Once in a while, a newsletter picks up one of my silly or quirky posts, which then brings a large number of visits from its followers.
3. Search engines - Traffic from Google, DuckDuckGo, Bing and similar search engines. This is usually for specific tools, games and HOWTO posts available on my website that some visitors tend to return to repeatedly.
[1] https://susam.net/from-web-feed-to-186850-hits.html
Please also enable CORS[1] for your RSS feed. (If your whole site is a static site, then please just enable CORS site-wide. This is how GitHub Pages works. There's pretty much no reason not to.)
Not having CORS set up for your RSS feed means that browser-based feed readers won't be able to fetch your feed to parse it (without running a proxy).
1. <https://enable-cors.org/>
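For anyone unsure what that looks like in practice, here is a minimal sketch for nginx; the feed path and the wildcard origin are assumptions, not anything from this thread (a feed is public anyway, so `*` is usually fine):

```nginx
# Hypothetical nginx snippet: let browser-based feed readers fetch the feed.
# The path /feed.xml is an example; adjust to your own feed location.
location = /feed.xml {
    add_header Access-Control-Allow-Origin "*" always;
}
```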
If you want to get a red line, you need to use red ink. If you use blue ink, you'll get blue lines. And I can draw you a cat. (I'm no artist, but I can give it a try.) But it won't be a line anymore. A line and a cat: those are two different things.
RSS is my preferred way to consume blog posts. I also find blogs that have an RSS feed to be more interested in actually writing interesting content rather than just trying to get views/advertise. I guess this makes sense—hard to monetize views through an RSS reader.
It's funny: back in the Google Reader days, monetizing via RSS was quite common. You'd publish a truncated version to RSS and force people to visit the site for the full version, usually just in exchange for ad views. Honestly, while it wasn't the greatest use of RSS, it was better than most paid blogs today, which are ad-wall, pop-up, pay-gate nightmares of UX.
Now that browser developers did their best to kill RSS/Atom...
Does a Web site practically need to do anything to advertise its feed to the diehard RSS/Atom users, other than use the `link` element?
Is there a worthwhile convention for advertising RSS/Atom visually in the page, too?
(On one site, I tried adding an "RSS" icon, linking to the Atom feed XML, alongside all the usual awful social media site icons. But then I removed it, because I was afraid it would confuse visitors who weren't very Web savvy, and maybe get their browser displaying XML or showing them an error message about the MIME content type.)
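For what it's worth, the `link` element mentioned above is the whole trick; a minimal sketch (feed URL and title are placeholders):

```html
<!-- Feed autodiscovery: readers look for this in <head>. -->
<link rel="alternate" type="application/atom+xml"
      title="Example Blog (Atom)" href="/feed.xml">
<!-- For an RSS 2.0 feed, the type would be application/rss+xml. -->
```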
I use RSS Style[1] to make the RSS and Atom feeds for my blog human readable. It styles the XML feeds and inserts a message at the top about the feed being meant for news readers, not people. Thus technically making it "safe" for less tech savvy people.
[1]: https://www.rss.style/
Browsers really should have embraced XSLT rather than abandoned it. Now we're stuck trying yet again to reinvent solutions already handled by REST [1].
[1] https://tonysull.co/articles/mcp-is-the-wrong-answer/
XSLT is the solution of domain specialists and philosophers. Abandoning it is the vote of the market and market interests, the wisdom of crowds at work. This is the era of scale, not expertise; enjoy the fruits.
Effectively no one was using XSLT at any point (certain document pipelines or indie hackers like Paul Ford being the exceptions that proved the rule). Browsers keep all kinds of legacy features, of course, and they could well have kept this one, and doing so would’ve been a decision with merit. But they didn’t, and the market will ratify their decision. Just as effectively no one was using XSLT, effectively no one will change their choice of browser over its absence.
It's hard to judge usage when browsers stopped maintaining XSLT at the 1.0 spec. V1.0 was very lacking in features and is difficult to use.
Browsers also never added support for some of the most fundamental features to support XSLT. Page transitions and loading state are particularly rough in XSLT in my experience.
Blizzard used to use it for their entire WoW Armory website to look people up. They converted off it years ago, but for a while they used XML/XSLT to display the entire page.
RSS.style is my site. I'm currently testing a JavaScript-based workaround that should look just like the current XSLT version. It will not require the XSLT polyfill (which sort-of works, but seems fragile).
One bonus is that it will be easier to customize for people that know JavaScript but don't know XSLT (which is a lot of people, including me).
You'll still need to add a line to the feed source code.
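For the current XSLT version, that line is a processing instruction right after the XML declaration; a sketch with a placeholder stylesheet path (the JS-based replacement's exact line may differ):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="/feed.xsl"?>
<!-- ...the rest of the feed as before... -->
```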
> message at the top about the feed being meant for news readers
There's no real reason to take this position. A styled XML document is just another page.
For example, if you're using a static site generator where the front page of your /blog.html shows the most recent N posts, and the /blog/feed.xml shows the most recent N posts, then...?
Shout out to Vivaldi, which renders RSS feeds with a nice default "card per post" style. Not to mention that it also has a feed reader built in as well.
Isn't it ironic that browsers do like 10,000 things nowadays, but Vivaldi (successor to Opera) is the only one that does the handful of things users actually want?
I don't use it myself because my computer is too slow (I think they built it in node.js or something). But it makes me happy that someone is carrying the torch forward...
With the lack of styling, I'm sorry to say I didn't notice the RSS icon at first at all. Adding the typical orange background to the icon would fix that.
For a personal site, I'd probably just do that. (My friends are generally savvy and principled enough not to do most social media, so no need for me to endorse it by syndicating there.)
But for a commercial marketing site that must be on the awful social media, I'm wondering about quietly supporting RSS/Atom without compromising the experience for the masses.
Is there any reason today to use RSS over Atom? Atom sounds like it has all the advantages, except maybe compatibility with some old or stubborn clients?
Based on my own personal usage, it makes total sense that RSS feeds still get a surprising number of hits. I have a small collection of blogs that I follow and it's much easier to have them all loaded up in my RSS reader of choice than it is to regularly stop by each blog in my browser, especially for blogs that seldom post (and are easy to forget about).
Readers come with some nice bonus features, too. All of them offer style normalization, for example, and native reader apps support offline reading.
If only there were purpose-built open standards and client apps for other types of web content…
This is what I use. It’s on macOS too and amazing on both. Super fast, focused, and efficient.
It’s by far the best I’ve tried. Most other macOS readers aren’t memory managing their webviews properly which leads to really bad memory leaks when they’re open for long periods.
iCloud sync is a nice feature too. I use the Mac app mostly for adding feeds and the iOS app for reading. Anytime I read an interesting web post, I pop its URL into the app to see if it has an RSS feed.
Same question, but for Android and desktop / laptop too. I've hardly used RSS before, in fact, and I don't know why, even though I first learned about it many years ago. But after reading this thread, I want to.
The question is whether you have this traffic from RSS client crawlers that pre-loaded the content or from real users. I'm not pro killing RSS, by the way, but genuinely doubtful.
> The question is whether you have this traffic from RSS client crawlers that pre-loaded the content or from real users.
I have never seen RSS clients or crawlers preload actual HTML pages. I've only seen them fetch the XML feed and present its contents to the users.
When I talk about visitors arriving at my website from RSS feeds, I am not counting requests from feed aggregators or readers identified by their 'User-Agent' strings. Those are just software tools fetching the XML feed. I'm not talking about them. What I am referring to are visits to HTML pages on my website where the 'Referer' header indicates that the client came from an RSS aggregator service or feed reader.
It is entirely possible that many more people read my posts directly in their feed readers without ever visiting my site, and I will never be aware of them, as it should be. For the subset of readers who do click through from their feed reader and land on my website, those visits are recorded in my web server logs. My conclusions are based on that data.
> I have never seen RSS clients or crawlers preload actual HTML pages
Some setups, like ttrss with the mercury plugin, will do that to restore full articles to the feed, but it's either on-demand or manually enabled per feed. Personally I don't run it on many feeds, other than a few more commercial platforms that heavily limit their feeds' default contents.
Presumably some of the more app-based RSS readers have such a feature, but I wouldn't know for certain.
I do not deliberately measure traffic. And I certainly never put UTM parameters in URLs as a sibling comment mentioned, because I find them ugly. My personal website is a passion project and I care about its aesthetics, including the aesthetics of its URLs, so I would never add something like UTM parameters to them.
I only occasionally look at the HTTP 'Referer' header in my web server logs and filter them, out of curiosity. That is where I find that a large portion of my daily traffic comes via RSS feeds. For example, if the 'Referer' header indicates that the client landed on my website from, say, <https://www.inoreader.com/>, then that is a good indication that the client found my new post via the RSS feed shown in their feed aggregator account (Inoreader in this example).
Also, if the logs show that a client IP address with the 'User-Agent' header set to something like 'Emacs Elfeed 3.4.2' fetches my '/feed.xml' and then the same client IP address later visits a new post on my website, that is a good indication that the client found my new post in their local feed reader (Elfeed in this example).
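As a rough illustration of that kind of filtering, here is a hedged sketch that counts click-throughs from feed services in a combined-format access log; the log path, regex, and service list are assumptions, not the commenter's actual setup:

```python
# Hypothetical sketch: count page hits whose Referer points at a feed service.
import re
from collections import Counter

# Matches the request, referer, and user-agent fields of a "combined" log line.
LINE = re.compile(r'"[A-Z]+ (?P<path>\S+) [^"]*" \d+ \S+ '
                  r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"')
FEED_SERVICES = ("inoreader.com", "feedly.com", "newsblur.com")

hits = Counter()
with open("access.log") as log:
    for line in log:
        m = LINE.search(line)
        if m and any(s in m.group("referer") for s in FEED_SERVICES):
            hits[m.group("path")] += 1

for path, count in hits.most_common(10):
    print(f"{count:6d}  {path}")
```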
I find browsing and discovering fun. So, after years of lurking I decided to make my own directory. It is called Top Four (https://topfour.net).
A /top4 page is a personal webpage where you can share your definitive ranked list of your top 3 favorites and an honorable mention in a specific topic, such as movies, albums, snacks, games, or anything else you feel strongly about. Or read the announcement: https://peterspath.net/blog/project-top-four/
We followed this practice at a Non-Profit I volunteered for some years ago. For us, it was motivated by a few reasons:
- we trained the community around us to look to our website first for the most recent news and information
- we did not want a social media platform to be able to cut us off from our community (on purpose or accident) by shuttering accounts or groups
- we did not want to require our users have accounts on any 3rd party platforms in order to access our postings
- but we still wanted to distribute our messaging across any platforms where large groups of our community members frequently engaged
Another aspect of our process that was specific to our situation and outside of POSSE - we only posted one topic/issue/announcement per blog post. We had a newsletter that would summarize each of these. Many organizations like ours would post summaries of many things to a single blog post, basically the same as the newsletter. However, this was cumbersome. For example, if someone in the community had a question, it was much clearer to link to a single post on our site that answered the question AND ONLY answered that question. It made for much better community engagement, better search engine indexing, cleaner content management, and just a better experience for everyone involved.
One of the biggest steps down in Facebook history was their removal of RSS syndication. There was a time in the past when you could subscribe your Facebook account to external RSS feeds. The entries in those feeds would create new content on your "Facebook wall". This essentially let you use any third party that supported RSS to publish content into your Facebook feed.
Facebook removed that feature. The effect of this was that people had to create content within Facebook instead of outside it. This reoriented the flow of content creation so that it must originate inside of Facebook, removing the ability to use FB as a passive consumer of content created in a workflow where the creators chose the entire flow.
IMHO this is one of the biggest steps down ever in FB history. It was one of the biggest attacks on the open web, and I'm sad to say that it mostly worked, and the internet at large is worse as a result.
That's why I wonder if, deep down, Zuck realizes the walled slop garden he's ultimately created instead of what it looked like he'd set out to create 18-20 years ago.
The problem is, despite all of the slop and garbage and surveillance and everything, Facebook is still actually valuable for many people.
People use it to keep in contact with relatives and friends, I follow work groups, my mother took COPD therapy through Facebook and chatted with relatives in other countries. I think Hacker News has been so radicalized against social media and "algorithms" that they forget most people's relationship with social media is entirely mundane.
Speaking of industry-wide shifts, how many companies has FB fucked up by proxy?
I refer to the video metrics scandal. How many things, like video autoplay, has everyone felt obliged to copy because Zuckerberg (who seems to care about nobody) made FB into a fraudulent company?
Remember when Facebook was an application development platform? And people built businesses on that, and then they just kind of stopped allowing that? Good times.
I guess it happens when engineers stop driving decisions and the finance people take over. Won't be too good for the company's valuation if people can access the content elsewhere.
I guess that's why Discord is also locked down as much. They have community content that is inaccessible anywhere else but Discord.
Yeah but Zuck has always been a nasty piece of work. He wasn't "just young" and "grew up" when he wrote those IMs. See: (to list just one) the constant copying of Snapchat
Did he/didn't he steal? Dunno, though there's a fair few bits of evidence in the various lawsuits (Winklevoss, Greenspan)
But if you didn't know about any of that you could make some inferences. Like his neverending "ooh, shiny new thing, want" (and then lie to people along the way, trick Indians into signing up for your internet.org thing)
I was willing to take his side on a few things because the political situation is genuinely unclear and the public has been misled but then right after he wrote the open letter to Jim Jordan about censorship coercion (which was a real problem, I want to see more tech companies talking about it) he does this https://news.ycombinator.com/item?id=42651178
It's kind of like he has raw bits of intelligence but doesn't quite know how to piece it together and besides "his" inventions (even FB) are put together (largely?!) by other people
>But if you didn't know about any of that you could make some inferences. Like his neverending "ooh, shiny new thing, want" (and then lie to people along the way, trick Indians into signing up for your internet.org thing)
Yeah, that trying to trick Indians into that thing (IIRC, it was also called Free Basics or something like that, to sound attractive, prolly) became a big issue in India at the time, I remember, although I didn't delve deep into the matter. I think a group of leading Indian freedom activists took on FB in the media and petitioned the government, and it resulted in the whole scheme collapsing.
I don't like this engineers vs. finance people / MBA divide that I see parroted a lot on HN. And obviously, it's parroted by engineers.
Like, all engineers are saints and the other side are all sinners. What crap. Get real, guys. There are all kinds of multicolored and multidimensional people.
Having been on all 3 or 4 or 5 of these sides :) (dev, sysadm, manager, consultant, ...), I have seen that.
Grow up, folks, and enjoy life in all its richness.
I must say, it's delightful to see this on the front page of HN.
A lot of people have been following indieweb POSSE principles for almost 15 years: publishing on their own site and syndicating elsewhere. I built my own platform for it that I used for 11 years, but you can use just about anything.
What's superb about the indieweb principles is that they're as simple as the web itself. It's worth digging into microformats, webmention, micropub, and the other lightweight standards the community has nurtured. It's all really good work that will become even more useful as more people turn away from centralized social media and AI-saturated services towards human websites. The indieweb is a slow burn but a really vibrant, growing, human community.
This strategy is an alternative to PESOS (Publish Elsewhere, Syndicate (to your) Own Site) [0]. I really like this read on the indieweb website; it explains well why to adopt this strategy for federation and emphasizes that "Friends are more important than federation", something a lot of nerds and hackers forget when defending their ideals.
You can have both! POSSE to post multiple places. PESOS to pull in anything posted directly in other places, i.e. anything that didn't originate from a POSSE post.
I really like this philosophy. I've been using it for a couple of years now - everything goes on my personal site, then I post links on Mastodon, Bluesky and Twitter and sometimes (if I remember to do so) LinkedIn, plus copy and paste it all into a Substack email every week or so.
I really need to automate it though - hard on Twitter and LinkedIn but still pretty easy for Bluesky and Mastodon.
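For the Mastodon side, the whole job can be a tiny script; a hedged sketch, assuming the `feedparser` and `requests` libraries and an app token with write access (instance, feed URL, and dedup logic are placeholders):

```python
# Hypothetical sketch: announce the newest feed entry on Mastodon.
import feedparser
import requests

FEED_URL = "https://example.com/feed.xml"   # your Atom/RSS feed
INSTANCE = "https://mastodon.social"        # your instance
TOKEN = "..."                               # app token with write:statuses scope

entry = feedparser.parse(FEED_URL).entries[0]
status = f"{entry.title}\n\n{entry.link}"

# In a real setup you would record posted entry IDs to avoid duplicates.
requests.post(
    f"{INSTANCE}/api/v1/statuses",
    headers={"Authorization": f"Bearer {TOKEN}"},
    data={"status": status},
    timeout=30,
).raise_for_status()
```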
Have you looked at https://posseparty.com/ as a possible option? Supports integrations with those platforms and more, and "all" it needs is an Atom feed!
Ooh I hadn't seen that. I'm still hung up on character limits - I want to make sure the summary I include isn't truncated with ... and is instead the right length for that particular platform.
I created POSSE Party because I had similar concerns. Truncation and spacing are highly customizable. You can add a posse:post sidecar element containing JSON that formats exact presentation for each platform exactly as you want it. The built-in truncation can be configured at the account level. And how you count characters, naturally, differs by platform, which the app handles pretty well.
If we had stuck with standard semantic web microformats, RSS/Atom syndication, FOAF-ish graphs, URIs for identity but also anonymous pubkey identities with reputation graphs - we could have built an entirely distributed social media graph that worked like email.
But alas, Facebook pushed forward too fast to counter.
There's still a chance, but the software needs to focus on simplicity and ease of use. Publishing blobs of signed content that can be added to anything - HTML pages, P2P protocols, embedded into emails and tweets - maybe we can hijack the current systems and have distributed identity and publishing take over.
I wish that were true but if ease of use is all that mattered, then micro.blog and other “Indieweb in a box” services would be as big as Bluesky, or maybe even at least as big as Mastodon.
The truth is that we’re social creatures and for social products, that means hanging out where other friends are already hanging out. It’s my personal thesis that no matter how much we lower the bar to participate in the indieweb, fediverse, or other non-corporate platforms, it’s going to be inherently niche.
It's fine to be small, but we can still work on lowering the bar further and promoting the good parts.
Then, if there is a viable alternative to big social media, my thesis is that there might come a day when a critical mass has been fed up and finds a viable alternative that's still beautiful but no longer small.
Promoting the good parts is very worthwhile and there’s a thriving scene. A bunch of interesting people talk over zoom and IRL regularly because of events.indieweb.org and we just had our 3rd annual weekend camp here in San Diego a few weeks ago.
I don’t know about the rest of big social media switching away, so I’m personally just focused on appreciating the community that’s been built up already instead of evangelizing. Maybe I’m wrong and something open will go viral, like the new Loops video app.
I agree! Do you know why anonymous pubkey identities with reputation graphs didn't stick, or any examples of it being used today? In my head that would solve one or two of the problems I see with the modern internet.
I know it’s gotten some pushback, but to be honest I’m fond of the more manual approach that you take on HN.
While I don’t follow nor am I necessarily interested in everything that you cover, I do appreciate the presence of having something like a local “correspondent” around when you do appear to provide trails of supplementary commentary. The lengths that I see you go through to do all of this tastefully and transparently do not go unnoticed.
I definitely won't be automating submission to places like HN.
I figure if you chose to follow me on Bluesky/Twitter/Mastodon/LinkedIn there's no ethical issue at all with me automating the process of having my new blog entries show up in my feeds there, as opposed to copying-and-pasting the links by hand.
No, no, perhaps you misunderstood me. I like how you link to your own writing in the discussions here. I don't suspect you to start automating that.
To tell you the truth I came to this actual submission to express my apathy toward the ‘POSSE’ concept but I saw you here and figured that I could somehow voice that feeling while simultaneously making mention of a sharing method that I do find worthwhile and more personable. And not an easy thing to pull off.
How much of your traffic comes from HN as opposed to the other platforms?
I am not so sure. You need to speak in the native voice of each community. A LinkedIn post vs Tweet vs E-Mail are different. You need to get value from the network directly without expecting a click thru. A lot of engagement + authority happens via the network itself
I think it's more accurate to see blogging as a distinct channel from other types of social media + content marketing
That’s a good take, and it's underrated. It’s what has kept me from completely automating everything, in favor of a semi-automated approach rather than the “spray and pray” approach of blasting everywhere.
Follow-up comments and engaging with others after posting is big too. People that “syndicate” without actually engaging on each platform are like some weird proselytizers that show up to a house party and hand out flyers to their own weird shindig without talking to anyone there.
I can confirm. I post across 5 platforms and each has its own regulars and its own vibe. The tone of my posts is slightly different on each.
The general idea for me is that I crosspost short messages about what I am currently working on, but the actual finished product is self-hosted. Deleting any of the accounts will not result in lost information.
I restarted blogging last year, going from a handful of blog posts to publishing consistently. All content gets published on my blog first. I've seen an ~8x increase in traffic. I was affected by zero-clicks from Google's AI Overview, but the bulk of my traffic now comes from RSS readers.
>the bulk of my traffic now comes from RSS readers.
I don't think this is correct unless you mean strictly the number of HTTP requests to your web server.
You were the 9th most popular blogger on HN in 2025.[0] Your post says you have about 500 readers via RSS. How can that represent more readers than people who read your posts through HN? I'd guess HN brought you about 1M visitors in 2025 based on the number of your front page posts.
You are right, my statement may be a bit misleading or incomplete. The ~500 readers are not just local RSS bots; they include aggregator RSS bots. For example, I see Feedly reporting ~200 subscribers, another newsreader reporting 50 subscribers, Feedbin, etc. Each of those only has between 1 and 3 IP addresses. So for each RSS bot, there is an arbitrary number of actual users reading. I can't track those accurately.
However, users can click on an RSS feed article and read it directly on my blog. These have a URL param that tells me they are coming from the feed. When an article isn't on HN's front page, the majority of traffic is coming from those feeds.
By the way, thank you for sharing this tool. Very insightful.
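To make the URL-param trick above concrete: a hedged sketch of what a tagged link in the feed might look like (the parameter name is an assumption, not the commenter's actual one):

```xml
<item>
  <title>Example post</title>
  <!-- The ?ref=feed marker shows up in server logs on click-through. -->
  <link>https://example.com/posts/example/?ref=feed</link>
</item>
```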
These are impressive metrics. Are you able to make a living off of your 10M views?
I'm planning to leave my job this year and focus on content. I've mostly been considering YouTube, but if blogging can work too, I might consider that as well.
Not even close to making a living! It does pay for my server though which costs $15 a month. YouTube gives you much more visibility. I'll try to compile the numbers from my single Carbon ad placement and the donations I receive from readers.
But I also don't think I have the process in place to do a blog, YouTube, and a podcast while holding a full-time job. Yes, the job is my source of income.
Yeah, I hear you. My understanding is that on YouTube you can make ~2k per 1M views with the default ads. I'm hoping that I can be funded by some combination of that and something like Patreon/membership/merch. But we will see; it's something I've wanted to do for years and I am getting too old to put it off any longer.
I'm a firm believer that data collected without a clear action associated with it is meaningless - and I couldn't think of an action I would take if my traffic went up or down on my personal blog - but tbh I mainly blog for myself, not really to build an audience, so our objectives might differ.
There are some actions you can take. For example, when my traffic plummeted, I saw through my logs that search engines were trying to access my search page with questionable queries. That's when I realized I had become a spam vector. I gave a better rundown through the link I shared.
Same reason why people have personal projects and share them on GitHub, it's fun to see people using / starring / interacting with your project / blog.
Just an FYI, the data used to reach those conclusions came from the server log (Apache2 in my case). So if you run your own server or VPS, you already have this information.
If you want to count every search engine bot, AI crawler, and vulnerability scanner as a user, then that works, but these days it's basically useless to count readers from raw web server logs.
Great to see this here. I’ve been using EchoFeed (https://echofeed.app) to syndicate articles from my blog to social media (it uses RSS as the feed source). I also recently learned about POSSEparty (https://posseparty.com) which has more options but is self hosted.
RSS most certainly isn’t dead either. I run pagecord.com (indie blogging app) and the majority of traffic is from a huge variety of feed readers.
I've used Twitter to publish my thoughts for years. In the beginning, I'd write multiple threads to get my points across.
Then, when Twitter started supporting longer tweets, I started publishing essays and it got the job done.
But at the end of each year, it was really hard to trace all my posts and write reviews about them. That's exactly what brought me to POSSE. I've been maintaining my blog[1] since early 2020 and it feels really good to know that I own my stuff. Plus, over the years, it has opened up so many doors for me.
Too bad many of these walled-garden platforms have now started to demote posts if they contain external URLs. I'm battling that by posting the links as a comment to the original post, which contains a cover photo of the blog.
[1]: https://rednafi.com
I'd like to have a POSSE setup for video with a landing page, a static image and transcript, a download button for very slowly downloading the video, metadata and links to instantly-available external copies so that I can channel as much of the server costs that video entails to the big platforms.
Has anybody written about adapting POSSE for videos?
POSSE Party (http://posseparty.com) supports syndicating YouTube Shorts and Instagram reels, but trying to syndicate longer form video just didn't make sense IMO
Q: While I agree strongly with the philosophy of this article, twice I have set up static site generators for blogs hanging underneath my top-level personal domain markwatson.com, and each time I gave up because I could only blog when sitting at my computer, not when I was using an iPad or iPhone (I limit my daily time at a computer to just a few writing and coding sprints; otherwise I literally put my laptop away - out of sight, out of mind).
Does anyone know of any mobile friendly static site generators?
I think I have about 3000 blog articles between Substack and Blogspot.
You could use Substack/Blogspot/Mastodon themselves as your "static site generator".
POSSE (the concept linked here) is overrepresented in relation to revealed preference. PESOS (publish elsewhere, syndicate on site) is more compatible with how most people (including nerds) actually use the Internet; for all the talk about static site generators and "owning" your own "digital garden" >9/10 people would fall somewhere on the embarrassing part of the curve from the "Blogging vs. Blog Setups" comic. <https://rakhim.org/honestly-undefined/19/>
If you migrated to a fediverse instance with longer post length limits, you could use that to actually blog/post while mobile, and meanwhile you have a script on your homepage that "lazily" syncs those posts to your static site—
When anyone visits your homepage, they see your site as it was when you last built it.
When you visit your own homepage, it automatically fetches your social media feed, patches the previous input to the SSG with the new content, and then uses the APIs of whatever you're using to host your site for rolling out the new posts.
If you set up the static site generator as a CI/CD action in your favorite git provider - this works with both hosted GitHub, GitLab, etc. and self-hosted Forgejo [1] - you have both version control for your blog as well as an automatic way of publishing.
Sure, the UX is not as great as with a dedicated interface like Substack, but building a Hugo site is really just editing markdown files anyway, and most mobile git-enabled editors should be able to do that.
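A hedged sketch of such a workflow for GitHub Actions and Hugo; the action names and versions are from memory, not from this thread, and should be checked against current docs:

```yaml
# Hypothetical workflow: rebuild and publish the site on every push.
name: publish
on:
  push:
    branches: [main]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: peaceiris/actions-hugo@v3
        with:
          hugo-version: 'latest'
      - run: hugo --minify
      - uses: peaceiris/actions-gh-pages@v4
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: ./public
```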
You can write your posts in Markdown, use Obsidian to sync them across devices, and render the pages in Quarto. This might not let you publish from mobile, but you can at least write them anywhere you want.
I take this approach with everything I post, though I only syndicate to Mastodon. I have an RSS and JSON feed for each of the content types (they all have different schema) on my site: posts, links, books, movies, concerts, status updates and a combined feed. I also maintain an ICS calendar subscription of upcoming album releases.
These items, in turn, can be optionally syndicated to Mastodon when published. For status updates, I have a field that supports Mastodon-specific text (for mentions and so forth).
I also expose an oembed endpoint that returns the appropriate data for each content type for platforms that support it.
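For readers unfamiliar with oEmbed: the endpoint returns a small JSON document describing the linked resource. A minimal sketch of a `link`-type response (field values are placeholders, not this commenter's actual endpoint):

```json
{
  "version": "1.0",
  "type": "link",
  "title": "Example post title",
  "author_name": "Example Author",
  "provider_name": "example.com",
  "provider_url": "https://example.com/"
}
```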
Everything I read is from RSS feeds I follow via freshRSS. Links are saved to linkding and are transformed into TTS "podcasts" that are sent to audiobookshelf.
EchoFeed is a lovely service to enable this regardless of what service you use to publish on your own site (so long as it supports RSS/Atom/JSON). I've used it to good effect for my blog in the past.
However, I suffer from a lack of high-quality news sources, no matter whether they support RSS. They no longer publish online these days. And, realistically, I am not interested in most posts from people I am interested in. So I just manually poll a few times a month in my browser.
A related idea that I'd like to see more people do. If you have 10-20 tweets on a subject, plug the holes and turn them into an essay on the real internet. My first step in writing https://news.ycombinator.com/item?id=46452763 was to copy a bunch of tweets into a doc.
Micro blogging is a great way to brainstorm and iterate on your thoughts over time, but eventually you have enough material to graduate from micro blogging to blogging, and more people should do it.
I started POSSE microblogging. My website has an “etc.” section for tweet-like posts. It relieved the pressure to create HN-worthy posts every time. It also gives me a place to share art and links.
This parallels the learning paradigm of diving into some topic, writing a blog post to solidify, practice, and demonstrate your knowledge, and finally promoting it via social media. Which parallels the origin story of sharing your scientific research. It's The Way of knowledge on the internet.
I've noted here before a course from UT Arlington about this on edX, "Data, Analytics, and Learning" (2014).
Nice to have another way of describing this pattern of writing and publishing, even if it does have a funny name POSSE.
Very similar to what I’m building with opal editor. The “site” is static markdown which lives and is stored in your browser: CSS, images, markdown, and HTML. You can keep it as is with markdown or compile to HTML. From there you can easily push to Vercel, GitHub, Cloudflare, or Netlify. Cheating the serverless bit of it by using CORS proxies.
I just started building my own website today with Django. I’m doing it because I just enjoy doing it. Most of my work is in data and ML infrastructure and it is just killing me. Working on the front end has opened my mind to possibility and given me new inspiration.
I love hn and was inspired by all the devs who have their own site. I was drowning in work, but put the Django architecture together on vacation, started putting things together today and it’s been a blast.
I don’t enjoy social media, and I was already thinking in POSSE terms intrinsically.
I appreciate this post and the author's perspective.
SSGs are good for static sites with no interactivity or feedback. If you want interactivity or feedback, someone (you or a 3rd party service provider) is going to have to run a server.
If you're running a server anyway, it seems trivial to serve content dynamically generated from markdown - all an SSG pipeline adds is more dependencies and stuff to break.
I know there's a fair few big nerd blogs powered by static sites, but when you really consider the full stack and frequency of work that's being done or the number of 3rd party external services they're having to depend on, they'd have been better by many metrics if the nerds had just written themselves a custom backend from the start.
I just wanted to learn how to create an enterprise grade web application. I read a book on Django last year and did a few tutorials and enjoyed it. I also deploy infra on GCP and it works well there. It costs about $60/month for baseline hosting with light traffic/storage. I will probably use it as an interface for some of my ML projects. I was also looking into Dart/Flutter, a much steeper learning curve for me personally.
This is pretty much how I began developing websites too. Except it was 2001 instead of 2026. And it was ASP (the classic ASP that predates ASP.NET) instead of Python. And I had a Windows 98 machine in my dorm room with Personal Web Server (PWS) running on it instead of GCP.
It could easily have been a static website, but I happened to stumble across PWS, which came bundled with a default ASP website. That is how I got started. I replaced the default index.asp with my own and began building from there. A nice bonus of this approach was that the default website included a server-side guestbook application that stored comments in an MS Access database. Reading through its source code taught me server-side scripting. I used that newfound knowledge to write my own server-side applications.
Of course, this was a long time ago. That website still exists but today most of it is just a collection of static HTML files generated by a Common Lisp program I wrote for myself. The only parts that are not static are the guestbook and comment forms, which are implemented in CL using Hunchentoot.
I remember ASP (application service provider, before cloud became synonymous with hosting), you are making me nostalgic. Back then I was in sales, I was selling real time inventory control, CRM and point of sale systems distributed over Citrix Metaframe in a secure datacenter. Businesses were just starting to get broadband connections. I would have to take customers to the datacenter to motivate them to let us host their data. Eight years later, google bought the building for $1.8b and eventually bought adjacent buildings as well.
We are talking about different ASPs. I am referring to Active Server Pages (ASP), the server-side scripting language supported by Personal Web Server (PWS) and Internet Information Services (IIS) on Windows. It is similar to PHP: Hypertext Preprocessor (PHP) and JavaServer Pages (JSP), but for the Windows world. I began developing websites with ASP. Over the years, I dabbled with CGI, PHP, JSP, Python, etc. before settling on Common Lisp as my preferred choice for server-side programming.
I agree. To be more clear, that $60 is an estimate for a small configuration and includes serverless infrastructure to process 500,000 requests per month, plus storage, including a 20 GB SQL database and 100 GB of object storage to serve video and images. More ideal for an application. You run the app in a container and only get charged for the requests; the SQL database is persistent, so that costs $20/month, and object storage with egress is about $10/month.
Let me describe my setup so that you can compare. I use a Contabo VPS for around 5 USD a month to host my Wagtail (Django-based) site. The DB also runs on the same infra, and since it's SQLite I can back it up externally.
I probably wouldn't be able to handle 0.5M requests, but I am nowhere near getting them. If I start approaching such numbers I'll consider an upgrade.
Check out Wagtail if you'd like to have even more batteries included for your site; it was a delight building my site with it.
Thank you for sharing your setup; I will certainly examine it and compare a bit later. I know my setup is a bit over the top, but it is the easiest to learn, since I live in GCP every day. I certainly don't expect the 0.5M traffic, but that is one of the lower tiers for Cloud Run, the serverless execution service. This is just a PoC to get my fingers dirty with the MVT pattern.
> Syndication can be done fully automatically by the server
At the risk of stating the obvious: this can get tricky; many popular social media platforms restrict automated posting. Policies around automation and/or API usage can change often and may not even be fully public, as some overlap with anti-spam measures.
Buffer documents a number of workflows and limitations in their FAQs.
E.g. for a non-professional Instagram account, the user gets a notification to manually share a post via the Instagram app.
> you can prepare your post in Buffer, receive a notification on your mobile device when it’s time to post, then tap the notification and copy your post over to the social network to finish posting.
Postiz docs show that users can create an app on Facebook and use that key and it will auto post.
I guess using POSSE for Instagram forces you to either create a personal app on Facebook, which is not easy, or make your Instagram account a business account.
Haven't been able to figure this out for Instagram - also the only social media that is still relevant for me.
(Thankfully?) never got into Twitter, where it seems to be easy.
I'd prefer to write markdown, publish to my static site, and cross-post to social media. I imagine I'd also want to get an overview of - or make an ad-hoc post from - one of several accounts on one of several platforms.
I came across Posse Party and Postiz, both of which are self-hosted. It doesn't seem like either is built for this use case.
Never stopped doing it. And my resolve strengthened when most blogs started doing summary feeds to force people to visit — I kept doing full text feeds as a matter of course, and if it wasn’t for the Twittergeddon, I would still be automatically posting to Twitter (now I do it to Mastodon - https://mastodon.social/@taoofmac)
This is pertinent. I’ll get hundreds of page views a day on a blog post and if that’s syndicated to X it’ll get 55 views, never to be seen again despite having 1200 followers.
Focus on publishing your own work. Syndicate if it’s effortless, otherwise don’t worry about it.
That's almost a job in itself because you have to constantly make sure not to get shadowbanned. This is probably only an option for people who already use "social media" sites in the first place.
Putting a link to your site in forum signatures was the way to go. Unfortunately, forums are 99% dead.
I made an account on Twitter/X specifically to promote my site at some point. I was shadowbanned by default and had to follow other people, like and share their posts, and comment on others' posts just to get my own posts to show up in people's feeds. When I checked again at some later point, I was shadowbanned again.
When I tried Reddit, I also noticed that I was shadowbanned by default and didn't even bother to do anything about it because I assumed it would turn out the same way. Like I said, you can use those sites to get the word out, but only if you're actively using them as a user to begin with.
The point of POSSE isn't just to blast every piece of crap you do everywhere with an automated system - those should be (shadow)banned
Think of it more like oldschool blog replies. Instead of replying with a 1000 word twitter message, post your answer on your blog and reply with a summary + link to your site instead.
But ALL social media sites downrank posts with links, that's why the "link in comments" shit is so common... They do not want you leaving their algorithmic feed to read stuff elsewhere.
I also thought this and didn’t get the point.
But the linked article was published in 2013, and maybe for users who are only used to social media nowadays, not personal blogs, it’s worth mentioning…
What blog systems (either self-hosted or easy to move) do folks recommend nowadays? I'm not interested in spending much time tinkering and updating, but I have enough sysadmin experience to host one myself. I was last using Blogger, though I'm trying to de-googlify my life slowly.
A single-binary static site generator would be my approach now. You can trust it to run in a few years.
I wrote my own SSG because I operate a website for a living and had specific needs. Prior to that I ran Craft CMS on the professional website and Wordpress on the personal one.
The benefit of SSGs is that the technical effort is tied to publishing. Once it’s online it stays online. You have both the human-readable source content and the static site. With traditional CMS there is a constant effort required to keep the website running. My dockerized Craft website wouldn’t start on the first few tries after a year offline.
SSGs are fantastic for building long-lasting websites with a low maintenance burden.
Definitely recommend any Static Site Generator like Jekyll, Hugo, Eleventy, Astro, rolling your own, etc. it’s easy to deploy the resulting bundle on various hosting services and set up builds on git pushes.
Can I shamelessly self-promote Pagecord? Free plan plus super-cheap premium plan with loads of features. Also source available so self-hostable (arguably cheaper to let me do it for you!). Export in full HTML or static-site friendly Markdown so you’re not trapped.
I’ve been an Astro user for the last two years and I can’t recommend it. RSS support isn’t native, it doesn’t support anything other than plain markdown, and all of the extra magic of MDX and their custom .astro templates is wasted without feed syndication.
My big project for sometime this year is to switch to Eleventy.
I think webmentions would be the answer there, though I don't think Bluesky currently supports it so you'd be looking at scraping rather than a push-based model.
You can just post a message on Bluesky and link to your blog post at the end, with some text saying "want to comment? Reply to this post."
Personally, I prefer using webmentions. I get them back to my site using a combination of services, so if someone talks about a post I made on Bluesky or Mastodon, I usually get a webmention back to the post.
I really want to implement this, but I haven't been able to figure out how to do it for Instagram (the only social media that is really relevant in my friend circle) and WhatsApp/Signal groups, other than doing it manually.
If anyone has tips, especially for Insta let me know...
Posse what? You can literally publish on your own site and syndicate elsewhere using anything whatsoever, including typing .html files into /var/www, if that's your thing.
Feeder on Android is my pick.
Thunderbird does RSS, and I already use it for email, so it's a nice all-in-one on Linux.
Both can use OPML files to import/export your feeds.
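For anyone who hasn't seen one, an OPML subscription list is just a small XML file; a minimal sketch (feed URL and title are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<opml version="2.0">
  <head>
    <title>My feeds</title>
  </head>
  <body>
    <outline type="rss" text="Example Blog"
             xmlUrl="https://example.com/feed.xml"/>
  </body>
</opml>
```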
The concept is sound, but the syndicate part is becoming increasingly hostile to maintain. I used to have scripts that auto-posted to twitter, facebook and reddit. Over the last two years, almost all of those broke due to API paywalls or aggressive bot detection.
I've found that "POSSE" is shifting more toward "Publish on Own Site, Manually Link Elsewhere."
Paradoxically, ActivityPub (mastodon/fediverse) is the only place where true automated syndication is still reliable. I think the future of POSSE isn't trying to hack together API keys for walled gardens, but treating your personal site as a fedi instance so the syndication is native.
> You can't just output a feed of posts and be done (I tried) - so even if you are a statically generated site you need a Server component ... My implementation uses Hugo to create my posts and feed data, Vercel Serverless functions to handle in bound messages, and Firebase Firestore to store the data.
It’s also a lot less wasteful and more respectful in my opinion. Scattering content everywhere in the hope others will see it doesn’t feel right to me.
It’s almost like HN is a great platform for the POSSE model!
Awesome share, thanks for the link. I'll send it to a family member who is looking to gain viewership with their writing - they usually post on Medium, I think.
This is my approach and I fully recommend it. My personal website is my canonical home address on the web. It has outlived a few platforms and many rounds of enshittification.
A few caveats:
- You will have different communities on each social network. Your personal website might be home to you, but to your users, it's not. You're just another creator on their platform of choice.
- Each community has its own vibe, and commands slightly different messaging. This is partly due to the format each platform allows. Each post will create parallel but different conversations.
- Dumping links is frowned upon. You should be a genuine participant in each community, even if you just repost the same stuff. Automation does not help much there.
- RSS and newsletters are the only audiences that you control, and they're worth growing. Everywhere else, people who explicitly want to follow you might never see your updates.
- You should own the domain you post to. This is your address on the internet, and it should stay yours.
- People do check your personal website. I was surprised to hear friends and acquaintances refer to things I post on my website.
I've found that managing the conversations across venues is way harder than publishing to them, and POSSE doesn't address this except for one line about backfeeds/"reverse syndication", which most mainstream services either don't support or actively sabotage. It easily takes more effort to engage across services than to post across them.
How do you fellow HN'ers separate your online identity from your corporate identity and day job?
I cannot rid myself of the suspicion that your average boss is going to have a prying eye on your online activities and may even use them against you one way or another, e.g. if you offer services or work on side projects that may in any way compete with your employer.
I only work for small companies that don’t have any business interest in areas where I want to maintain independent side projects.
When a startup I was working at made a successful exit and got acquired by a major corporation that did have business interests which overlapped with my side projects, I refused the bonus contract and froze my side project activity until leaving about a year later.
Don’t get cute. Avoid side projects that compete with your employer, and disclose unrelated side projects properly so that your employer is forced to acknowledge them. Do what it takes to avoid entanglement, making sacrifices if necessary.
IDK, your average boss is just a dude who has bills to pay and mouths to feed. They don't really care what happens as long as you're not doing something stupid, especially visibly and on their time.
My experience is that bosses read my blog, then when they or a fellow manager need to hire someone, have reached out to me asking me to apply. So it cuts both ways - maybe your shitty boss sees you blogging and sharing your experience, but a good boss will see that and go "I want this passionate and curious person to work for me".
I've been doing this for years with my site, and it's brought me a lot of joy that I can go back and search my site for various posts I've made over the last decade across all the platforms I use. I have a more high-friction setup, but that's because of my own terrible choices.
Receiving webmentions can be as simple as making a custom log in nginx and just having it save all POSTs to the webmention URL endpoint. I love it. And sending can be done with curl or whatever you want (i.e. HTML forms without JS).
Or, you can use any of the many community projects which handle all this backend stuff and provide it as a service.
Either extreme works. I love the indieweb set of protocols for this. Other things like ActivityPub require active interaction for the cryptographic handshake at a minimum and make simple solutions infeasible despite other benefits. Indieweb can be as complex or as simple as you want.
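To illustrate the no-JS sending path: per the Webmention spec, sending is just a form-encoded POST with `source` and `target`, so a plain HTML form is enough. A sketch with placeholder URLs:

```html
<!-- Hypothetical reply form: POSTs source and target to the endpoint. -->
<form action="https://example.com/webmention" method="post">
  <input type="url" name="source" placeholder="URL of your reply">
  <input type="hidden" name="target" value="https://example.com/some-post/">
  <button type="submit">Send Webmention</button>
</form>
```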
This post, like many recent ones, essentially wants the internet to go backwards to what it once was pre-LLMs [edit: and pre-concentration]. I'd like to suggest that you should follow through and go all the way back to pre-internet itself, and rediscover handwriting, in-person local meeting groups, non-digital relationships, and using your hands not on a keyboard. Today I (with difficulty) left my macbook closed all day until this evening (and this comment). Small steps.
I understand this attitude but when I look back at my rural youth I just hear you telling me that I should have had no one to talk to at all about many things.
POSSE can be applied to more than just social networks, it can be used to disrupt every marketplace!
In fact, I’m building open source SaaS for every vertical and leveraging that to build an interoperable, decentralized marketplace.
Social media is a marketplace as well. The good being sold is people’s content and the cost you pay is with your attention. The marketplace’s cut is ads and selling your data.
PS: I found out I was already subscribed to your feed.
Yeah, the idea was that since RSS is still considered niche by the broader audience, those who are looking for it will probably find it just fine.
Also, your Segal's Law link seems to have an encoding issue with the apostrophe.
Also, weird - it seems like I don't see the encoding issue on the Segal's Law post.
Here's an oldie but a goodie regarding RSS vs Atom [2]
[1]: https://ittavern.com/difference-between-rss-and-atom/
[2]: http://www.intertwingly.net/wiki/pie/Rss20AndAtom10Compared
You should just use Atom.
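And for anyone starting fresh, a minimal valid Atom feed really is small; a sketch with placeholder URLs, dates, and names:

```xml
<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Example Blog</title>
  <id>https://example.com/</id>
  <link href="https://example.com/"/>
  <updated>2026-01-01T00:00:00Z</updated>
  <author><name>Example Author</name></author>
  <entry>
    <title>Hello, world</title>
    <id>https://example.com/hello/</id>
    <link href="https://example.com/hello/"/>
    <updated>2026-01-01T00:00:00Z</updated>
    <summary>Full-text feeds are kinder, but a summary is valid too.</summary>
  </entry>
</feed>
```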
https://github.com/martinrotter/rssguard
I have never seen RSS clients or crawlers preload actual HTML pages. I've only seen them fetching the XML feed and present its contents to the users.
When I talk about visitors arriving at my website from RSS feeds, I am not counting requests from feed aggregators or readers identified by their 'User-Agent' strings. Those are just software tools fetching the XML feed. I'm not talking about them. What I am referring to are visits to HTML pages on my website where the 'Referer' header indicates that the client came from an RSS aggregator service or feed reader.
It is entirely possible that many more people read my posts directly in their feed readers without ever visiting my site, and I will never be aware of them, as it should be. For the subset of readers who do click through from their feed reader and land on my website, those visits are recorded in my web server logs. My conclusions are based on that data.
Some setups, like ttrss with the Mercury plugin, will do that to restore full articles to the feed, but it's either on-demand or manually enabled per feed. Personally I don't run it on many feeds other than a few of the more commercial platforms that heavily limit their feeds' default contents.
Presumably some of the more app-based RSS readers have such a feature, but I wouldn't know for certain.
https://github.com/rumca-js/Internet-feeds
I only occasionally look at the HTTP 'Referer' header in my web server logs and filter them, out of curiosity. That is where I find that a large portion of my daily traffic comes via RSS feeds. For example, if the 'Referer' header indicates that the client landed on my website from, say, <https://www.inoreader.com/>, then that is a good indication that the client found my new post via the RSS feed shown in their feed aggregator account (Inoreader in this example).
Also, if the logs show that a client IP address with the 'User-Agent' header set to something like 'Emacs Elfeed 3.4.2' fetches my '/feed.xml' and then the same client IP address later visits a new post on my website, that is a good indication that the client found my new post in their local feed reader (Elfeed in this example).
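For anyone who wants to try the same analysis, here is a minimal sketch, assuming the nginx/Apache combined log format (where the Referer and User-Agent are the last two quoted fields); the aggregator domains listed are just examples, not the ones from my logs:

```python
import re
from collections import Counter

# Combined Log Format lines end with: "<referer>" "<user-agent>"
LOG_TAIL = re.compile(r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"\s*$')

# Example aggregator domains to look for; extend to taste.
FEED_SOURCES = ("inoreader.com", "feedly.com", "newsblur.com")

def count_feed_referrals(log_path):
    """Count hits whose Referer points at a known feed aggregator."""
    hits = Counter()
    with open(log_path) as log:
        for line in log:
            match = LOG_TAIL.search(line)
            if not match:
                continue
            for source in FEED_SOURCES:
                if source in match.group("referer"):
                    hits[source] += 1
    return hits

if __name__ == "__main__":
    print(count_feed_referrals("/var/log/nginx/access.log"))
```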
Like:
- https://nownownow.com
- https://defaults.rknight.me
- https://aboutideasnow.com
- https://chrisburnell.github.io/interests-directory/
- https://bukmark.club/directory/
- https://uses.tech
I find browsing and discovering fun. So, after years of lurking I decided to make my own directory. It is called Top Four (https://topfour.net).
A /top4 page is a personal webpage where you can share your definitive ranked list of your top three favorites and an honorable mention in a specific topic, such as movies, albums, snacks, games, or anything else you feel strongly about. Or read the announcement: https://peterspath.net/blog/project-top-four/
- we trained the community around us to look to our website first for the most recent news and information
- we did not want a social media platform to be able to cut us off from our community (on purpose or by accident) by shuttering accounts or groups
- we did not want to require our users have accounts on any 3rd party platforms in order to access our postings
- but we still wanted to distribute our messaging across any platforms where large groups of our community members frequently engaged
Another aspect of our process that was specific to our situation and outside of POSSE: we only posted one topic/issue/announcement per blog post. We had a newsletter that would summarize each of these. Many organizations like ours would post summaries of many things in a single blog post, basically the same as the newsletter. However, this was cumbersome. For example, if someone in the community had a question, it was much clearer to link to a single post on our site that answered the question AND ONLY that question. It made for much better community engagement, better search engine indexing, cleaner content management, and just a better experience for everyone involved.
1000x yes to this! It can be really frustrating when a link takes me to FB, TW, IG, etc. - none of which I use.
Facebook removed that feature. The effect was that people had to create content within Facebook instead of outside it. This reoriented the flow of content creation so that it had to originate inside Facebook, removing the ability to use FB as a passive consumer of content created in a workflow where the creators chose the entire flow.
IMHO this is one of the biggest steps down ever in FB history. It was one of the biggest attacks on the open web, and I'm sad to say that it mostly worked, and the internet at large is worse as a result.
He was always like this and never intended to create something actually valuable.
People use it to keep in contact with relatives and friends, I follow work groups, my mother took COPD therapy through Facebook and chatted with relatives in other countries. I think Hacker News has been so radicalized against social media and "algorithms" that they forget most people's relationship with social media is entirely mundane.
I refer to the video metrics scandal. How many things, like video autoplay, has everyone felt obliged to copy because Zuckerberg (who seems to care about nobody) made FB into a fraudulent company?
I guess that's why Discord is also locked down as much as it is. They have community content that is inaccessible anywhere but Discord.
Snapchat features are blood money; they also result in fewer people using Snapchat.
Nobody says Zuck doesn't earn a lot of money, but a lot of it is likely fraudulent, and he's just not a very good person.
He's a POS, that's also why POSSE is good. ;)
Did he/didn't he steal? Dunno, though there's a fair few bits of evidence in the various lawsuits (Winklevoss, Greenspan)
But if you didn't know about any of that, you could make some inferences. Like his never-ending "ooh, shiny new thing, want" (and then lying to people along the way, tricking Indians into signing up for your internet.org thing).
I was willing to take his side on a few things because the political situation is genuinely unclear and the public has been misled but then right after he wrote the open letter to Jim Jordan about censorship coercion (which was a real problem, I want to see more tech companies talking about it) he does this https://news.ycombinator.com/item?id=42651178
It's kind of like he has raw bits of intelligence but doesn't quite know how to piece them together, and besides, "his" inventions (even FB) are put together (largely?!) by other people.
Yeah, that attempt to trick Indians into that thing (IIRC it was also called Free Basics or something like that, probably to sound attractive) became a big issue in India at the time, I remember, although I didn't delve deep into the matter. I think a group of leading Indian freedom activists took on FB in the media and petitioned the government, and it resulted in the whole scheme collapsing.
This is what I was referring to in my earlier comment: "he's just not a very good person".
Like, all engineers are saints and the other side are all sinners. What crap. Get real, guys. There are all kinds of multicolored and multidimensional people.
Having been on all 3 or 4 or 5 of these sides :) (dev, sysadm, manager, consultant, ...), I have seen that.
Grow up, folks, and enjoy life in all its richness.
A lot of people have been following indieweb POSSE principles for almost 15 years: publishing on their own site and syndicating elsewhere. I built my own platform for it that I used for 11 years, but you can use just about anything.
What's superb about the indieweb principles is that they're as simple as the web itself. It's worth digging into microformats, webmention, micropub, and the other lightweight standards the community has nurtured. It's all really good work that will become even more useful as more people turn away from centralized social media and AI-saturated services towards human websites. The indieweb is a slow burn but a really vibrant, growing, human community.
[0]: https://indieweb.org/PESOS
I really need to automate it though - hard on Twitter and LinkedIn but still pretty easy for Bluesky and Mastodon.
But alas, Facebook pushed forward too fast to counter.
There's still a chance, but the software needs to focus on simplicity and ease of use. If we publish blobs of signed content that can be added to anything - HTML pages, P2P protocols, embedded into emails and tweets - maybe we can hijack the current systems and have distributed identity and publishing take over.
The truth is that we’re social creatures, and for social products, that means hanging out where other friends are already hanging out. It’s my personal thesis that no matter how much we lower the bar to participate in the indieweb, fediverse, or other non-corporate platforms, it’s going to be inherently niche.
Which is fine. Small is beautiful.
Then, if there is a viable alternative to big social media, my thesis is that there might come a day when a critical mass has been fed up and finds a viable alternative that's still beautiful but no longer small.
I don’t know about the rest of big social media switching away, so I’m personally just focused on appreciating the community that’s been built up already instead of evangelizing. Maybe I’m wrong and something open will go viral, like the new Loops video app.
While I don’t follow nor am I necessarily interested in everything that you cover, I do appreciate having something like a local “correspondent” around when you do appear to provide trails of supplementary commentary. The lengths you go to in doing all of this tastefully and transparently do not go unnoticed.
I figure if you chose to follow me on Bluesky/Twitter/Mastodon/LinkedIn there's no ethical issue at all with me automating the process of having my new blog entries show up in my feeds there, as opposed to copying-and-pasting the links by hand.
To tell you the truth I came to this actual submission to express my apathy toward the ‘POSSE’ concept but I saw you here and figured that I could somehow voice that feeling while simultaneously making mention of a sharing method that I do find worthwhile and more personable. And not an easy thing to pull off.
How much of your traffic comes from HN as opposed to the other platforms?
I think it's more accurate to see blogging as a distinct channel from other types of social media + content marketing
Follow-up comments and engaging with others after posting is big too. People that “syndicate” without actually engaging on each platform are like some weird proselytizers that show up to a house party and hand out flyers to their own weird shindig without talking to anyone there.
The general idea for me is that I crosspost short messages about what I am currently working on, but the actual finished product is self-hosted. Deleting any of the accounts will not result in lost information.
I published a write up just this morning: https://idiallo.com/blog/what-its-like-blogging-in-2025
I don't think this is correct unless you mean strictly the number of HTTP requests to your web server.
You were the 9th most popular blogger on HN in 2025.[0] Your post says you have about 500 readers via RSS. How can that represent more readers than people who read your posts through HN? I'd guess HN brought you about 1M visitors in 2025 based on the number of your front page posts.
[0] https://refactoringenglish.com/tools/hn-popularity/domain/?d...
However, users can click on an RSS feed article and read it directly on my blog. Those links have a URL param that tells me they are coming from the feed. When an article isn't on HN's front page, the majority of traffic comes from those feeds.
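(For illustration, a minimal sketch of that kind of link tagging, assuming a hypothetical `ref` query parameter; the parameter name and helper are my own, not necessarily what's used here:)

```python
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def tag_feed_url(url, source="feed"):
    """Hypothetical helper: append ?ref=<source> to a post URL used in
    the feed so click-throughs are identifiable in server logs."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query["ref"] = source
    return urlunparse(parts._replace(query=urlencode(query)))

print(tag_feed_url("https://example.com/blog/some-post"))
# -> https://example.com/blog/some-post?ref=feed
```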
By the way, thank you for sharing this tool. Very insightful.
I'm planning to leave my job this year and focus on content. I've mostly been considering YouTube, but if blogging can work too, I might consider that as well.
But I also don't think I have the process in place to do Blog, YouTube, Podcast and hold a full time job. Yes the job is my source of income.
At some point you're getting sponsors to pay for it, and then it gets complicated.
I want to add analytics to my blog too, haven't had any on my sites for about a decade.
I'm a firm believer that data collected without a clear action associated with it is meaningless - and I couldn't think of an action I would take if my traffic goes up or down on my personal blog - but tbh I mainly blog for myself, not really to build an audience, so our objectives might differ.
RSS most certainly isn’t dead either. I run pagecord.com (indie blogging app) and the majority of traffic is from a huge variety of feed readers.
Then, when Twitter started supporting longer tweets, I started publishing essays and it got the job done.
But at the end of each year, it was really hard to trace all my posts and write reviews about them. That's exactly what brought me to POSSE. I've been maintaining my blog[1] since early 2020 and it feels really good to know that I own my stuff. Plus, over the years, it has opened up so many doors for me.
Too bad many of these walled-garden platforms have now started to demote posts if they contain external URLs. I'm battling that by posting the links as a comment to the original post, which contains a cover photo of the blog.
[1]: https://rednafi.com
Has anybody written about adapting POSSE for videos?
Does anyone know of any mobile friendly static site generators?
I think I have about 3000 blog articles between Substack and Blogspot.
POSSE (the concept linked here) is overrepresented in relation to revealed preference. PESOS (publish elsewhere, syndicate on site) is more compatible with how most people (including nerds) actually use the Internet; for all the talk about static site generators and "owning" your own "digital garden" >9/10 people would fall somewhere on the embarrassing part of the curve from the "Blogging vs. Blog Setups" comic. <https://rakhim.org/honestly-undefined/19/>
If you migrated to a fediverse instance with longer post length limits, you could use that to actually blog/post while mobile, and meanwhile you have a script on your homepage that "lazily" syncs those posts to your static site:
When anyone visits your homepage, they see your site as it was when you last built it.
When you visit your own homepage, it automatically fetches your social media feed, patches the previous input to the SSG with the new content, and then uses the APIs of whatever you're using to host your site for rolling out the new posts.
Sure, the UX is not as great as with a dedicated interface like Substack, but building a Hugo site is really just editing markdown files anyway; most mobile git-enabled editors should be able to do that.
[1]: https://home.futuretim.io/posts/hugo_build_and_post/
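A rough sketch of that lazy-sync idea, assuming Mastodon's public statuses API and a Hugo-style content directory; the instance URL, account id, and paths here are all hypothetical:

```python
import json
from pathlib import Path
from urllib.request import urlopen

INSTANCE = "https://example.social"  # hypothetical instance
ACCOUNT_ID = "12345"                 # your account's numeric id there
CONTENT_DIR = Path("content/notes")  # Hugo section for short posts

def sync_recent_posts(limit=20):
    """Fetch recent public statuses; write any new ones as content pages."""
    url = f"{INSTANCE}/api/v1/accounts/{ACCOUNT_ID}/statuses?limit={limit}"
    with urlopen(url) as resp:
        statuses = json.load(resp)
    CONTENT_DIR.mkdir(parents=True, exist_ok=True)
    for status in statuses:
        page = CONTENT_DIR / f"{status['id']}.html"
        if page.exists():
            continue  # already synced on a previous visit
        front_matter = f"---\ndate: {status['created_at']}\n---\n"
        page.write_text(front_matter + status["content"])
    # ...then kick off the SSG rebuild / hosting API deploy from here.

if __name__ == "__main__":
    sync_recent_posts()
```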
These items, in turn, can be optionally syndicated to Mastodon when published. For status updates, I have a field that supports Mastodon-specific text (for mentions and so forth).
I also expose an oembed endpoint that returns the appropriate data for each content type for platforms that support it.
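A bare-bones sketch of what such an endpoint can look like, using Flask (my choice here, not necessarily this commenter's stack); the lookup helper is a stub:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def lookup(url):
    # Stub: a real site would resolve the URL to its content item and
    # pick the right oEmbed type ("link", "photo", "video", "rich").
    return {"type": "link", "title": "Example post"}

@app.route("/oembed")
def oembed():
    """Minimal oEmbed provider response (see https://oembed.com/)."""
    item = lookup(request.args.get("url", ""))
    return jsonify({
        "version": "1.0",
        "provider_name": "example.com",
        "provider_url": "https://example.com/",
        **item,
    })
```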
Everything I read is from RSS feeds I follow via freshRSS. Links are saved to linkding and are transformed into TTS "podcasts" that are sent to audiobookshelf.
https://echofeed.app/
However, I suffer from a lack of high-quality news sources, whether or not they support RSS. They no longer publish online these days. And, realistically, I am not interested in most posts from the people I am interested in. So I just manually poll a few times a month in my browser.
Somewhat related, predictions for the future of the web by IWC contributors:
https://vhbelvadi.com/indieweb-carnival-round-up-dec-2025
Micro blogging is a great way to brainstorm and iterate on your thoughts over time, but eventually you have enough material to graduate from micro blogging to blogging, and more people should do it.
I've noted here before a course from UT Arlington about this on edX, "Data, Analytics, and Learning" (2014).
Nice to have another way of describing this pattern of writing and publishing, even if it does have a funny name POSSE.
https://news.ycombinator.com/item?id=46015121 https://www.pure.ed.ac.uk/ws/portalfiles/portal/19117279/CSC...
https://opaledx.com
https://github.com/rbbydotdev/opal
MIT-licensed and open source. No documentation yet, but it's coming very soon.
I love hn and was inspired by all the devs who have their own site. I was drowning in work, but put the Django architecture together on vacation, started putting things together today and it’s been a blast.
I don’t enjoy social media and had been thinking along POSSE lines intrinsically.
I appreciate this post and the author's perspective.
If you're running a server anyway, it seems trivial to serve content dynamically generated from markdown - all an SSG pipeline adds is more dependencies and stuff to break.
I know there's a fair few big nerd blogs powered by static sites, but when you really consider the full stack and frequency of work that's being done or the number of 3rd party external services they're having to depend on, they'd have been better by many metrics if the nerds had just written themselves a custom backend from the start.
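For illustration, a minimal sketch of the "just render markdown per request" approach, using Flask and the `markdown` package (both my assumptions; any small framework would do):

```python
from pathlib import Path

import markdown
from flask import Flask, abort

app = Flask(__name__)
POSTS = Path("posts")  # one .md file per post (hypothetical layout)

@app.route("/blog/<slug>")
def post(slug):
    source = POSTS / f"{slug}.md"
    if not source.is_file():
        abort(404)  # Flask's default <slug> converter excludes slashes
    return markdown.markdown(source.read_text())
```

Put a cache in front and something like this holds up fine at blog scale.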
It could easily have been a static website, but I happened to stumble across PWS, which came bundled with a default ASP website. That is how I got started. I replaced the default index.asp with my own and began building from there. A nice bonus of this approach was that the default website included a server-side guestbook application that stored comments in an MS Access database. Reading through its source code taught me server-side scripting. I used that newfound knowledge to write my own server-side applications.
Of course, this was a long time ago. That website still exists but today most of it is just a collection of static HTML files generated by a Common Lisp program I wrote for myself. The only parts that are not static are the guestbook and comment forms, which are implemented in CL using Hunchentoot.
I probably wouldn't be able to handle 0.5M requests, but I am nowhere near getting them. If I start approaching such numbers I'll consider an upgrade.
Check out Wagtail if you'd like to have even more batteries included for your site, it was a delight building my site with it:
https://blog.miloslavhomer.cz/hello-wagtail/
I'd still recommend starting with SQLite; it seems that by skipping a DB service you can save quite a few bucks.
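For reference, SQLite is already Django's default, so the settings.py stanza is just the stock one:

```python
# settings.py - the stock Django configuration; BASE_DIR is defined
# at the top of the generated settings file.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.sqlite3",
        "NAME": BASE_DIR / "db.sqlite3",
    }
}
```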
Ask HN: Is starting a personal blog still worth it in the age of AI?
https://news.ycombinator.com/item?id=46268055
A website to destroy all websites
https://news.ycombinator.com/item?id=46457784
At the risk of stating the obvious: this can get tricky, as many popular social media platforms restrict automated posting. Policies around automation and/or API usage change often and may not even be fully public, since some overlap with anti-spam measures.
0. https://buffer.com
1. https://github.com/gitroomhq/postiz-app
Buffer documents a number of workflows and limitations in their FAQs.
E.g. for a non-professional Instagram account, the user gets a notification to manually share a post via the Instagram app.
> you can prepare your post in Buffer, receive a notification on your mobile device when it’s time to post, then tap the notification and copy your post over to the social network to finish posting.
source: https://support.buffer.com/article/658-using-notification-pu...
I guess using POSSE for Instagram forces you to either create a personal app on Facebook, which is not easy, or make your Instagram account a business account.
I came across Posse Party and Postiz, both of which are self-hosted. It doesn't seem like either is built for this use case.
Which direction would you go in?
Focus on publishing your own work. Syndicate if it’s effortless, otherwise don’t worry about it.
Blogging lives! :)
[1] https://maknee.github.io/blog/
That's almost a job in itself because you have to constantly make sure not to get shadowbanned. This is probably only an option for people who already use "social media" sites in the first place. Putting a link to your site in forum signatures was the way to go. Unfortunately, forums are 99% dead.
When I tried Reddit I also noticed that I was shadowbanned by default and didn't even bother to do anything about it because I assumed it would turn out the same way. Like I said, you can use those sites to get the word out, but only if you're actively using them as a user to begin with.
Think of it more like oldschool blog replies. Instead of replying with a 1000 word twitter message, post your answer on your blog and reply with a summary + link to your site instead.
But ALL social media sites downrank posts with links, that's why the "link in comments" shit is so common... They do not want you leaving their algorithmic feed to read stuff elsewhere.
They wouldn't even need to host them anywhere, just have the same text on a device they own and control.
So fucking much is lost when a FB group just suddenly disappears along with all of the time people have spent writing on them.
I wrote my own SSG because I operate a website for a living and had specific needs. Prior to that I ran Craft CMS on the professional website and Wordpress on the personal one.
The benefit of SSGs is that the technical effort is tied to publishing. Once it’s online it stays online. You have both the human-readable source content and the static site. With traditional CMS there is a constant effort required to keep the website running. My dockerized Craft website wouldn’t start on the first few tries after a year offline.
SSGs are fantastic for building long-lasting websites with a low maintenance burden.
https://github.com/lylo/pagecord
My big project for sometime this year is to switch to Eleventy.
https://github.com/searlsco/posse_party/blob/main/LICENSE.tx...
I guess it could just be done as a multi-phase post, as janky as that is.
Personally, I prefer using webmentions. I got them working on my site using a combination of services, so if someone talks about a post I made on Bluesky or Mastodon, I usually get a webmention back to the post.
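The sending side of the protocol is small enough to sketch in a few lines (assuming `requests` and `beautifulsoup4`; the receiving side is usually easier to delegate to a service such as webmention.io):

```python
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def send_webmention(source, target):
    """Discover the target's Webmention endpoint and notify it
    (per https://www.w3.org/TR/webmention/)."""
    resp = requests.get(target, timeout=10)
    # The endpoint may be advertised in the Link header...
    endpoint = resp.links.get("webmention", {}).get("url")
    if not endpoint:
        # ...or in a <link>/<a rel="webmention"> element in the HTML.
        tag = BeautifulSoup(resp.text, "html.parser").find(
            ["link", "a"], rel="webmention")
        endpoint = tag and tag.get("href")
    if not endpoint:
        raise ValueError("no Webmention endpoint advertised")
    return requests.post(urljoin(resp.url, endpoint),
                         data={"source": source, "target": target})
```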
https://news.ycombinator.com/item?id=46482285
But this is no longer available.
You have to copy and paste the article into Medium manually unfortunately.
I've found that "POSSE" is shifting more toward "Publish on Own Site, Manually Link Elsewhere."
Paradoxically, ActivityPub (mastodon/fediverse) is the only place where true automated syndication is still reliable. I think the future of POSSE isn't trying to hack together API keys for walled gardens, but treating your personal site as a fedi instance so the syndication is native.
For example, this is the RSS feed for Simon Willison's Bluesky account: https://bsky.app/profile/simonwillison.net/rss
Awesome share, thanks for the link. Will send it to a family member who is looking to gain viewership with their writing; they usually post on Medium, I think.
A few caveats:
- You will have different communities on each social network. Your personal website might be home to you, but to your users, it's not. You're just another creator on their platform of choice.
- Each community has its own vibe, and commands slightly different messaging. This is partly due to the format each platform allows. Each post will create parallel but different conversations.
- Dumping links is frowned upon. You should be a genuine participant in each community, even if you just repost the same stuff. Automation does not help much there.
- RSS and newsletters are the only audiences that you control, and they're worth growing. Everywhere else, people who explicitly want to follow you might never see your updates.
- You should own the domain you post to. This is your address on the internet, and it should stay yours
- People do check your personal website. I was surprised to hear friends and acquaintances refer to things I post on my website.
I cannot rid myself of the suspicion that your average boss is going to keep a prying eye on your online activities and may even use them against you one way or another, e.g. if you offer services or work on side projects that may in any way compete w/ your employer.
Anyone got experience to share in that regard?
Thinking about this famous precedent: https://news.ycombinator.com/item?id=27424195#27425041
When a startup I was working at made a successful exit and got acquired by a major corporation that did have business interests overlapping with my side projects, I refused the bonus contract and froze my side project activity until leaving about a year later.
Don’t get cute. Avoid side projects that compete with your employer, and disclose unrelated side projects properly so that your employer is forced to acknowledge them. Do what it takes to avoid entanglement, making sacrifices if necessary.
Or, you can use any of the many community projects which handle all this backend stuff and provide it as a service.
Either extreme works. I love the indieweb set of protocols for this. Other things like ActivityPub require active interaction for the cryptographic handshake at a minimum and make simple solutions infeasible despite other benefits. Indieweb can be as complex or as simple as you want.
But it's not an either/or proposition.
Also, keep in mind that POSSE and this site predate LLMs by quite a bit.
Some people are just using it to post more garbage onto the platforms they were already using.
Let's not throw out the baby with the bathwater
In fact, I’m building open source SaaS for every vertical and leveraging that to build an interoperable, decentralized marketplace.
Social media is a marketplace as well. The good being sold is people’s content, and the price you pay is your attention. The marketplace’s cut is ads and selling your data.