The Man in Seat 61 is one of these sites, except it's still continually updated and is probably the single best location for train travel information, anywhere.
Well, I've always wanted to do the VIA Rail Canadian trip from Toronto to Vancouver via Jasper. Now the page on that one has convinced me I want to do the most expensive option, too. That "Prestige" class... not too shabby.
Virtually anyone who is interested in bicycles has at some point landed on Sheldon Brown's site. Bless whoever maintains it and Rest in Peace, Sheldon.
The title has now been editorialized (fairly enough), but I posted this as an encouragement for all the great "classic" websites. I love seeing the other ones people are posting here.
I'm working right now with a static site generator (Pelican) for my own website and have been loving it.
I was thinking about this last night, and was going to make an Ask HN about it. How do we build a site to help explore these parts of the web?
I know it would get gamed, really, but what if there were an injected "related-sites" link list as an extension that drew from a curated directory? That was how you used to explore: through blog and site "friends of" sections that sent you down an endless rabbit hole.
I feel like it can't be feed-oriented, because that's more of a consumption mindset than an exploration mindset, and I feel like part of the fun of the old web was that constant active exploration.
I don't want to turn this into a conversation about SEO, but suffice it to say that commerce dominates search results for so many terms that it can be hard to find anything community-oriented, or niche information. So search engines and feed sites are not quite right for exploration.
Yes, this existed and was popular for some time in the early 2000s. StumbleUpon was a tool you could use to explore random websites. Some were related by topic. It was easy to spend several hours exploring new or rarely viewed websites.
Reddit.com got started as the same kind of service, even though today it has clearly outgrown that use case altogether and become something like a new CompuServe BBS of sorts.
Other than that, it's a good question. It's been observed by many that Google tends to forget old content. Alternative search engines such as DuckDuckGo, in my experience, tend to lean more towards indie sites.
> I know it would get gamed, really, but what if there were an injected "related-sites" link list as an extension that drew from a curated directory?
A federated curated directory, perhaps with an emergent shared vocabulary for tags/categories/etc. (similar to the Wikipedia-centered naming guidelines for wiki pages) might work. The DMOZ model wouldn't scale in the modern web, with its rampant contentiousness around politics etc., and even the original DMOZ broke down because of corruption, with websites having to pay category "editors" for listings.
Does anyone remember the Yahoo home page from the mid-90s? It would link on the front page to categories of the most interesting pages on the internet. Like, all dozens of them. It was great.
There was even a top-level directory page of unusual devices connected to the internet. Like there was one single fishtank, one coffee maker, and one internet-connected vending machine... and everyone thought these things were marvelous and delightful.
Hmm, I don't want to be rude, but there is nothing classic about that website. Actually it's quite modern; maybe I'm too old to consider a site from 2008 classic. :)
Sometimes it would be very good to have an option to explain why a site was posted. In this particular case, adding a completely new title made a lot of sense. Merely posting a site on waterfalls doesn't really fall into the Hacker News world, but the reason, namely the old-web simplicity of it, does.
That sort of comment doesn't get lost unless the thread becomes huge, in which case it's probably less relevant.
Allowing commentary by the submitter to be pinned to the top would break HN's rule that submitting an article confers no special rights to interpret it for the reader. That's a bedrock rule here. It has served super well over the years.
Classic, in this context, describes how the site works, navigates, and looks. It's not necessarily classic by age.
I do like the idea of neo-classicist websites, though. The "brutalist" websites of a few years ago were simply ugly, and I wish to avoid the ~2005-era pastel-on-pastel trend emerging again. (I'd love a neo-gothic website trend, dark by default, though.)
By classic I meant to imply largely HTML+CSS and no glut of ads, pop-ups, tracking, and script-based features that push load times past 10 seconds. I propose that this type of site is what we should be aiming for more in our web experience.
Of course I get your point; my all-time favourite design is Craigslist's, so I'm in the same boat. I would love to see the internet as plain HTML with minimal styles. :)
Yet the world is full of designers and developers who want to achieve something more than a plain website. To the ordinary Joe it doesn't matter at all whether Craigslist or Facebook is ugly; they just use them. But creators like to think that users admire good design and user experience. :)
Unfortunately the Hamilton waterfalls have been so overrun by GTAers (probably due to Instagram) that the city has cut back access, including banning people from Albion Falls.
Social media has led to a lot of natural beauty being damaged, because word got out to the kind of people who don't respect trails, who graffiti, take rocks and plants, and litter. It's a shame. There's a beautiful little gorge near my hometown that was amazing until a few years ago, when people trashed it.
I love these sites. They contain an extensive amount of information, all in a catalogue that reflects how the creator thinks. My personal all time favourite is http://www.mir.com.my/rb/photography/
There's no black magic in that at all: serve a static HTML page with small images on modern hardware. Suddenly it becomes visible that our connections are not 128 Kbps ADSL anymore, and that the bottleneck is actually JavaScript processing, not the network.
I remember that the last time there was a discussion about this style of website, someone posted a link to a static site that showed what kind of amazing things you could do with a <map> tag[0], but I forgot to bookmark it.
I don't get it: is "classic" bad? My page has been online since 1995 and is still the best way for me to share my work. There surely are many "classic"-style sites with unique information. I'd encourage everybody who has something rare to share to do so in a "classic" way.
I think I wrote the manual about my software in the same manner :) https://www.photopea.com/learn/ It also loads very fast and the HTML code is readable :)
How? By writing new html pages when they take a picture of a new waterfall.
Why? Well, as the FAQ on the site says, because they like waterfalls.
More generally, I'm not sure why it strikes you as odd that this site is still being updated. The old(er) way of building sites hasn't gone anywhere, and this site seems perfectly functional. Why not continue to update it?
Probably by hand writing HTML. And there's nothing wrong with that at all. It's not even hard. As for the why: the site has a purpose. It's a catalogue of waterfall visits. Of course it's updated if a new waterfall was visited.
By editing the HTML and uploading it again! :) I still do this for lots of older sites I keep going: FTP in (well, SFTP ideally), make a couple of changes in the HTML, and you're done. Easy: no compiling or bundling, just edit, save, and re-upload (some FTP clients will do the save and re-upload in one step). On some hosts you just SSH in and edit with vi and save. Ain't broke, don't fix it, etc.
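For anyone curious, that edit-and-reupload loop can be sketched in a few lines of Python's standard library. This is just an illustration, not anyone's actual deploy script: the host, credentials, file names, and the "Last updated" edit are all made up, and since real SFTP isn't in the stdlib (that would need a third-party library like paramiko), the upload half uses FTPS via `ftplib` instead:

```python
# A sketch of the hand-edit-and-reupload workflow described above.
# All names (host, user, paths, the edit itself) are placeholders.
from ftplib import FTP_TLS
from pathlib import Path


def tweak_page(path: Path) -> str:
    """The 'edit' step: make a small in-place change to a local HTML file."""
    html = path.read_text(encoding="utf-8")
    html = html.replace("Last updated: 2023", "Last updated: 2024")
    path.write_text(html, encoding="utf-8")
    return html


def upload_page(host: str, user: str, password: str,
                local: Path, remote: str) -> None:
    """The 'deploy' step: re-upload the edited file over FTPS."""
    with FTP_TLS(host) as ftp:
        ftp.login(user, password)
        ftp.prot_p()  # encrypt the data channel, not just the login
        with local.open("rb") as f:
            ftp.storbinary(f"STOR {remote}", f)
```

No build step, no bundler; the "deployment pipeline" is one file write and one `STOR` command.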
As for the "why", I guess why does anyone update a website?
https://www.seat61.com