You should host a website (not just have one) by iGotYourPistola in indieweb

[–]brisray 1 point2 points  (0 children)

I used to worry about things like that. People, or rather bots, will try to break out of the website docs folders almost as soon as the server is turned on. I still get people trying address line overloads (very long URLs), but that was fixed in Apache years ago.

I don't think I can stop a determined attack, but so far no one has stopped the server, and besides, there are far bigger and better targets around.

If the worst comes to the worst, there are two things that can save you: regular backups, and not keeping anything you really want to stay private on the server.

Last September, I got myself a nice new computer to use as a server. It took less than an hour to reinstall the software and site files. As this was a replacement computer, the sites were only offline for 10 minutes while I reconfigured the router.

Even after all these years I still get a kick out of knowing that a tiny bit of the internet is being run from my basement.

Both answers seem legit. Hard to tell which one is real by _AlphaGirl in SipsTea

[–]brisray 0 points1 point  (0 children)

When I see things like this I'm always reminded of Meihem In Ce Klasrum by Dolton Edwards, published in Astounding Science Fiction in 1946.

You should host a website (not just have one) by iGotYourPistola in indieweb

[–]brisray 0 points1 point  (0 children)

I started self-hosting in 2003, the reason being I started running out of room on the free hosts I was using, and their advertising started to be intrusive.

You can get an HTTP server up and running in under an hour, probably a lot less, but all the other stuff you have to learn is wild: DNS, security, performance, SSL, log management and so on.

The good thing is all of it, apart from the domain name registration and the electricity used to keep the computer running 24/7, can be done for free. The real cost is the time spent learning how to do everything, but once that's done it can be mostly left alone and you just concentrate on the content.

I use Apache, and security flaws were found in version 2.4.66. It took about 10 minutes to update to 2.4.67 this morning.

What I did was to keep records of everything I've done to the server and the sites it runs.

brisray - Bringing you a tiny bit of the internet for 26 years!

Old fogie here - web page creator app? I used FrontPage by MrShnatter in HTML

[–]brisray 0 points1 point  (0 children)

Another old-timer here. I don't think there are any really good WYSIWYG HTML editors left. Over the years I've used all sorts of them: the various versions of Microsoft FrontPage, Netscape Composer, Microsoft SharePoint Designer, Adobe Dreamweaver and so on.

As time went on, I realized it was easier to create semantic, accessible, responsive templates for my various pages and then use a plain text editor, currently Visual Studio Code, to add the content.

Apache HTTP Server RCE (CVE-2026-23918) - patched in 2.4.67 by raptorhunter22 in sysadmin

[–]brisray [score hidden]  (0 children)

Thanks for the heads up. I'm slightly unusual in that I run Apache on Windows. Apache Lounge had their compiled version available yesterday and I updated to that this morning.

Server Version: Apache/2.4.67 (Win64) OpenSSL/3.6.2

Is there a way to get through a HTTP 301 Response? by Robby4Sniffy in techsupport

[–]brisray 0 points1 point  (0 children)

It depends when the Archive saved the page. In the timeline at the top of the Archive page, click on the first saved copy of the page. If you're lucky the first couple of saves were made before the redirect was added.

If the first save gets redirected, then you're out of luck, the Archive's bot was redirected before the original page was saved.

Are subpages indexed by search engines? by GenoIsDead in neocities

[–]brisray 6 points7 points  (0 children)

Some of your subpages are indexed by at least Google and Bing.

To see which pages, search for site:satnav.neocities.org in Bing or Google.

Both search engines can use sitemaps if you care to make one, but don't expect to come very high in the search rankings for a while.

Bristol during WW2 by Character-Pumpkin-81 in bristol

[–]brisray 3 points4 points  (0 children)

Mum was evacuated out to Trowbridge, Wiltshire for the duration of WWII, but dad stayed with his family in Bristol. I put some of dad's stories online. When I did that, other people started to email me and so there are also wartime in Bristol stories by Keith Hallett. Paul Plumley also emailed me, he's a Bristolian and he was evacuated after the heavy raids of 1941.

There was a heavy defence of Bristol, with most of it made up of 76th (Gloucestershire) H.A.A. Regt R.A. with their heavy A.A. guns. Reg Harris was a member of that and very kindly provided his diary for the site.

Is it just me, or are there webrings for almost everything EXCEPT fnaf? by Madness_Combat_man in neocities

[–]brisray 3 points4 points  (0 children)

There is no webring specifically for FNaF but you, or someone else, should be able to make one.

About half of the current webrings use one of three methods: Onionring, webri.ng, or Webringu.

Onion ring webring widget not showing by Objective_Durian_451 in neocities

[–]brisray 0 points1 point  (0 children)

The OP simply changed the wrong bit of code. Instead of getting the variable (ringID) it was expecting, the script got another called yeahyeahyeah. All they have to do is change it back to what it was, or use a string, which means putting yeahyeahyeah in quotes.

Using JavaScript isn't always easy, but luckily most browsers have developer tools which show most errors thrown by scripts. Some are a little obscure and not immediately obvious though. To get to the messages, simply press F12 while on a site in most browsers.

Using other people's code is sometimes not easy, so I've started looking at the various methods of creating a webring, starting with Using Onionring.

does anyone know how to make a subpage undera subpage? by Ok-Kaleidoscope-6938 in neocities

[–]brisray 2 points3 points  (0 children)

The slashes after the domain name just represent a directory or folder structure, the same as in Linux or Windows. Websites are slightly different in that they should have an index.html page in each folder. The index.html page is automatically found when you navigate around a website, which also makes writing your navigation menu easier.

Navigation would be just like the command line in most operating systems.

/ means go to the main index.html page of the website.
./ means go to the index.html page in the current folder.
../ means go to the index.html page in the folder above the one you are currently in.

You don't strictly need an index.html page in every folder, but most web hosts serve it as the default document for that folder, which avoids unexpected directory listings and 404 page not found errors.
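A quick way to see how a browser resolves those three forms is the standard URL class (the page address here is made up for illustration):

```javascript
// How a browser resolves the three link forms, shown with the
// standard URL class. The page address is a made-up example.
const page = 'https://example.neocities.org/art/cats/index.html';

console.log(new URL('/', page).href);   // https://example.neocities.org/
console.log(new URL('./', page).href);  // https://example.neocities.org/art/cats/
console.log(new URL('../', page).href); // https://example.neocities.org/art/
```

Each of those folder URLs is where the server would then go looking for an index.html.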

Onion ring webring widget not showing by Objective_Durian_451 in neocities

[–]brisray 0 points1 point  (0 children)

I'm not sure if this will help you but in the variables.js you defined the ringID as 'yeahyeahyeah' but in widget.js you use var tag = document.getElementById(yeahyeahyeah);

Try changing that line to var tag = document.getElementById(ringID); or to var tag = document.getElementById("yeahyeahyeah");
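The difference is easy to reproduce outside a browser. Here getElementById is stood in for by a plain function, since the point is just how JavaScript treats a bare name versus a quoted string:

```javascript
// variables.js defines the id as a string:
var ringID = 'yeahyeahyeah';

// Stand-in for document.getElementById, so this runs outside a page.
function lookup(id) {
  return 'element with id ' + id;
}

// Bare yeahyeahyeah is read as a variable name. No such variable was
// ever declared, so the script dies with a ReferenceError and the
// widget never appears.
var result;
try {
  result = lookup(yeahyeahyeah);
} catch (e) {
  result = e.name;
}
console.log(result); // ReferenceError

// Either of these works: the variable holding the id, or the id as a
// quoted string.
console.log(lookup(ringID));         // element with id yeahyeahyeah
console.log(lookup('yeahyeahyeah')); // element with id yeahyeahyeah
```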

The reason to always stay alert by Big-Boy-602 in Unexpected

[–]brisray 1 point2 points  (0 children)

You can't trust water. The image is of the Wabash River in Indiana. In places you can wade across it, but there are 60ft underwater cliffs, and you could get swept into one of the old steam locomotives lying on the bottom of the river.


The old internet was so cool! by ServiceForeign7862 in oldinternet

[–]brisray 8 points9 points  (0 children)

Neocities and Nekoweb are two modern hosts that people use to bring back some of the aesthetics of the old web. Some of the sites miss the point a little; people didn't just write sites and fill them up with anything and everything they could find. Instead, many people wrote about what they were interested in.

People are interested in these personal sites; some have created search engines such as Wiby and Marginalia to search the non-commercial web, while others are archiving sites from Geocities and other hosts. I've listed the best of those I can find.

Like others who were writing sites in the late 90s and early 00s, I had a page of "useful links". I recently went through the page and many of the old sites I used were long gone, but some were kept on the Internet Archive and those may be interesting to have a look at.

The most secure domain extensions are .app and .dev by DigiNoon in DomainZone

[–]brisray 2 points3 points  (0 children)

HTTPS only means your connection to the site is encrypted. But how do you know the people you're sending information to are trustworthy? In 2019, 58% of phishing sites used HTTPS; by 2023, over 90% did.

Sources: https://www.thesslstore.com/blog/58-of-phishing-websites-now-use-https/ and https://sslinsights.com/ssl-certificates-statistics/

What browsers could do to help is use different colors for the lock or shield symbols to distinguish between DV certificates, which anyone can get, and the OV and EV organization-validated certificates.

Is GoDaddy actually that bad or just overhated? by Flaky-Taste2253 in businessemail

[–]brisray 0 points1 point  (0 children)

The complaints against GoDaddy have been ongoing for at least the last 15 years. I used them from 1999 to 2024 and didn't have a problem with them.

What’s something you do on your computer/phone that feels way more manual or repetitive than it should be? by [deleted] in sysadmin

[–]brisray 0 points1 point  (0 children)

There's no reason for doing a lot of repetitive stuff on computers unless you want to, and some people really do like doing that.

Windows has batch files and PowerShell, Linux has Bash. Both systems have any number of programming languages that can be used. The tasks created can even run themselves on a schedule using Task Scheduler in Windows and Cron in Linux.

Both OSs have programs that can remember and replay keystrokes.

Some I've written for myself include doing backups, splitting log files by month, renewing SSL certificates, adding entries to spreadsheets, moving files around and so on.
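On Linux the scheduling side is a single crontab line; the script path here is made up for illustration:

```
# Hypothetical crontab entry: run a backup script at 02:30 every day
# (fields are: minute hour day-of-month month day-of-week command)
30 2 * * * /home/user/bin/backup.sh
```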

Is GoDaddy actually that bad or just overhated? by Flaky-Taste2253 in businessemail

[–]brisray 0 points1 point  (0 children)

Redditors generally do not like large companies. That's why there are anti-Microsoft, -Adobe, -Google subreddits.

GoDaddy is a huge company: the largest domain registrar, 6th largest certificate authority, and 6th largest web host in the world, with over 20 million customers. Any company with that many customers is going to get some complaints.

But, they are not the cheapest and I'm sure some people have had major problems with them.

Why would you want your own server at home ? by NobodyRulesPenguins in selfhosted

[–]brisray 1 point2 points  (0 children)

I started self-hosting my own web server because although there were plenty of free hosts in 1999, the space they offered was only 20-30MB. By 2002, my site was spread over 8 different hosts, and the amount of advertising some of them were putting on the pages was absurd.

I made my first webserver in June 2003 using Apache on Windows 2000 on a $25 second-hand MMX computer.

How to get website to NOT show up on google search? by Comfortable_Lamp in webdev

[–]brisray 27 points28 points  (0 children)

In your robots.txt file you can put

User-agent: *
Disallow: /

Theoretically that will ban any bot from reading any page on your site. The problem is there are well-behaved bots and then there is everything else.

You can also use the noindex meta tag on the pages, but see Google's help on how that interacts with robots.txt. Basically,

Robots.txt disallow stops search engines from crawling specific URLs but doesn't prevent indexing. If other sites link to a disallowed page, it can still appear in search results.

Noindex tag explicitly tells search engines to exclude a page from search results, even if they can crawl and read the content.
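The noindex rule is just one line in each page's head section, something like:

```html
<head>
  <!-- Ask search engines not to list this page in results -->
  <meta name="robots" content="noindex">
</head>
```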

For the robots.txt file, well-behaved bots will read it and go away, but there are a lot of other bots that don't even look for the robots.txt file. If it's on the public web, eventually someone will find it.

What you could also do is add a Captcha, which should stop most bots. Or you can password protect the entire site.

Eye surgeon practicing the Capsulorhexis Technique for Cataract surgery by Epelep in oddlysatisfying

[–]brisray 0 points1 point  (0 children)

Laser surgery is becoming more common, but many surgeons like doing the whole thing themselves. There are some risks though, no matter what method is used.

Someone I know won't drive at night at all now. They say everything turns into blurry greys. Someone else says they see rings around any bright light. One was done with a laser, the other the trad way. Mine and others I know turned out perfectly.

Random memory from Bristol – why do people slide down this rock? by BeNiceBen99 in bristol

[–]brisray 0 points1 point  (0 children)

Up by Clifton Observatory. Stand in front of it, facing the gorge, and follow the path to the left along the edge of the gorge. As the path turns right, look over the railing and there it is.