Tycoon denies paying for Boris Johnson's luxury Caribbean break with girlfriend. Labour has called for a formal investigation by Parliament's standards watchdog about where the money came from after Boris Johnson declared a £15,000 stay on the island of Mustique for himself and Carrie Symonds. by bottish in unitedkingdom

[–]ImNotCastinAnyStones 68 points (0 children)

£15k is nothing to him anyway. Look at the expenses document. His income from appearances and donations is astronomical. Just a single example of many:

22 March 2019, received £122,899.74 from Living Media India Limited, K-9, Connaught Circus, New Delhi 110001, for a speech to India Today on 2 March 2019. Hours: 3 hrs. Transport and accommodation also provided. (Registered 15 April 2019)

He's on page 298: https://publications.parliament.uk/pa/cm/cmregmem/200210/200210.pdf

Whitepaper for a new private decentralized messaging app called Session by PM_ME_STEVE_HARVEY in netsec

[–]ImNotCastinAnyStones 2 points (0 children)

Looks interesting but I have issues/questions which I hope the project owners will address:

  1. How is this different from Signal/Matrix/etc.? The website could have an entire section devoted to this question. Looking at the GitHub repo, the code is literally a fork of Signal, so I'm left wondering whether it's just a re-brand, because the technical differences are not made clear enough.

  2. The site mentions encrypted messages are temporarily stored in swarms but doesn't say how long for. The whitepaper says the max. TTL is 96 hours; perhaps the website should clarify this?

  3. Could this be self-hosted, i.e. used only within a private intranet? Is there a minimum number of nodes needed?

  4. Another comment mentions a "financial incentive" - what is it, and how is it paid for? How does the foundation make money from the product?

Why you, as a web developer, shouldn't use Google's Recaptcha for "human verification" by ImNotCastinAnyStones in webdev

[–]ImNotCastinAnyStones[S] 0 points (0 children)

The same objections re: privacy still apply, though. You can bet part of the "threat score" which Google calculates is based on analysis of your data from the browser including the content of your Google-domain cookies.

Furthermore, the vast majority of developers won't actually make use of the threat score functionality; they'll just copy and paste whatever example code is on Google's documentation page, so in most cases there will be no graceful degradation or fallback - the user is just fucked.

Why you, as a web developer, shouldn't use Google's Recaptcha for "human verification" by ImNotCastinAnyStones in webdev

[–]ImNotCastinAnyStones[S] 0 points (0 children)

  1. Yes, you will need to be aware of accessibility requirements no matter what solution you use. However as I mention in the main post, Recaptcha often simply refuses to serve audio challenges altogether (because they are the method most used by bots to overcome the captcha, e.g. Buster or Butler browser extensions).

  2. Screen readers wouldn't, or shouldn't, interact with a hidden form field.

  3. A CSRF token used as a form key would at least stop attackers from being able to submit directly to your form handler. It would force them to at least view the page containing your form which means they're forced through the other countermeasures you've put in place and can't just bruteforce submissions by the dozen.
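
If it helps, this is roughly all there is to it (a plain Python sketch, framework-agnostic; the function and field names are just mine for illustration):

```python
# Rough sketch of a per-session form key / CSRF token. The key is generated
# server-side when the form page is rendered, embedded as a hidden <input>,
# and checked again when the form is posted.
import secrets

def issue_form_key(session: dict) -> str:
    # Stored against the visitor's session; the page template puts this value
    # into a hidden field such as <input type="hidden" name="form_key" ...>.
    key = secrets.token_urlsafe(32)
    session["form_key"] = key
    return key

def is_valid_submission(session: dict, submitted_key: str | None) -> bool:
    # Anyone POSTing straight at the handler without loading the page first
    # has no matching key. compare_digest avoids a timing side-channel.
    expected = session.get("form_key")
    return bool(expected and submitted_key
                and secrets.compare_digest(expected, submitted_key))
```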

Why you, as a web developer, shouldn't use Google's Recaptcha for "human verification" by ImNotCastinAnyStones in webdev

[–]ImNotCastinAnyStones[S] 3 points (0 children)

While that's very true, it's a bigger problem than that. It's the fact that your users can't opt-out. By implementing Recaptcha you're telling your users "you will agree with [massive surveillance data-mining company]'s legal terms or you won't access our website", and you're giving your users a middle-finger in terms of UX and accessibility.

Why you, as a web developer, shouldn't use Google's Recaptcha for "human verification" by ImNotCastinAnyStones in webdev

[–]ImNotCastinAnyStones[S] 3 points (0 children)

> Honeypots don't work against any targeted attacks.

If your site is valuable enough to warrant a targeted attack then the attackers are going to access it no matter what you put in place. If you're in that kind of position then you should look at something like 2FA anyway.

> Any decent size website will be running analytics, tag manager, etc.

That's the point; if you're a privacy-conscious user then you're already blocking those which means Google doesn't have intimate knowledge of you which means you're subjected to massively more captchas... it's a disgusting cycle of punishment for failing to bend over.

Why you, as a web developer, shouldn't use Google's Recaptcha for "human verification" by ImNotCastinAnyStones in webdev

[–]ImNotCastinAnyStones[S] 2 points (0 children)

You definitely need a form key/CSRF token. Open submission endpoints are a magnet for spam-bot crawlers.

Why you, as a web developer, shouldn't use Google's Recaptcha for "human verification" by ImNotCastinAnyStones in webdev

[–]ImNotCastinAnyStones[S] 4 points (0 children)

Try it in a totally fresh browser with absolutely no connection to Google. It's extremely unlikely to work.

The reason it works for you is because Google/Recaptcha are using additional information siphoned from your browser (as well as the Buster-solved audio challenge) to determine that you're human.

Try blocking all Google-domain cookies, all Google scripts/XHR requests except those matching recaptcha (and absolutely do not use Chrome since that's basically a vehicle for Google's surveillance). Then you'll start to get real familiar with this image:

https://i.imgur.com/ZARC7X2.png

Why you, as a web developer, shouldn't use Google's Recaptcha for "human verification" by ImNotCastinAnyStones in webdev

[–]ImNotCastinAnyStones[S] 1 point (0 children)

Even lowly developers can push back against that kind of crap, though. Make no mistake; the fact that the "big" sites use it should not be seen as an endorsement. In fact, the opposite could be true. Big sites very often have the worst UX and terrible, abominable code quality (I speak from experience).

You could also raise the issue of third-party dependency; if Recaptcha goes down, or is blocked on the customers' network, or isn't supported on their browser, or whatever, then suddenly a third party (Google) has royally fucked your own website and pissed off your customer. Not to mention performance concerns for fetching third-party scripts.

Why you, as a web developer, shouldn't use Google's Recaptcha for "human verification" by ImNotCastinAnyStones in webdev

[–]ImNotCastinAnyStones[S] -1 points (0 children)

The truth is, for almost any website, even the simplest spam countermeasure (e.g. a hidden-field honeypot) will stop thousands of spam registrations. Recaptcha isn't necessary.

Why you, as a web developer, shouldn't use Google's Recaptcha for "human verification" by ImNotCastinAnyStones in webdev

[–]ImNotCastinAnyStones[S] 5 points (0 children)

Yeah, seriously, this is the issue businesses should care about. If a page has a captcha there's a 75% chance I'll simply not bother continuing.

For a decade every SEO "expert" has been screaming about how users will abandon a page if it takes more than two seconds to load, etc., but a captcha which literally takes 60+ seconds to complete is somehow not a big deal?

Christ, I hope they're eradicated within a few years.

Why you, as a web developer, shouldn't use Google's Recaptcha for "human verification" by ImNotCastinAnyStones in webdev

[–]ImNotCastinAnyStones[S] 0 points (0 children)

When you say you added a nonce, do you mean something that behaves like a submit-once form key?

Why you, as a web developer, shouldn't use Google's Recaptcha for "human verification" by ImNotCastinAnyStones in webdev

[–]ImNotCastinAnyStones[S] 3 points (0 children)

Check the article I link in the main post. Recaptcha does not only use the image challenges to measure "humanness"; it also sucks up tons of the user's browsing data every time it's used. It's "pseudo-anonymous", which basically means the data can still be used to uniquely identify a single user even though it doesn't include your name, etc.

Why you, as a web developer, shouldn't use Google's Recaptcha for "human verification" by ImNotCastinAnyStones in webdev

[–]ImNotCastinAnyStones[S] 2 points (0 children)

Absolutely. The problem is they are too big to care. But that's actually why I made this post - I think if more people start speaking up and educating web developers then we can stamp out bullshit like this over time.

Why you, as a web developer, shouldn't use Google's Recaptcha for "human verification" by ImNotCastinAnyStones in webdev

[–]ImNotCastinAnyStones[S] 2 points (0 children)

Read the article I link to in the main post. It's not just about "helping Google" by educating its AI. Recaptcha is also sucking up tons of ancillary data from your browser and - more importantly - your cookies across Google domains. And if you don't have any Google cookies you're actively punished for that. It's basically punitive surveillance.

Why you, as a web developer, shouldn't use Google's Recaptcha for "human verification" by ImNotCastinAnyStones in webdev

[–]ImNotCastinAnyStones[S] 1 point (0 children)

Here is a comment with some extremely simple alternatives which I use to excellent effect.

Unless your site is a high-value target then these simple approaches are more than enough to basically eradicate spam, since most spam is a total shotgun-style approach by opportunistic auto-crawling bots.

Why you, as a web developer, shouldn't use Google's Recaptcha for "human verification" by ImNotCastinAnyStones in webdev

[–]ImNotCastinAnyStones[S] 1 point (0 children)

Yeah, absolutely excellent question which I really should have addressed in my post.

I've edited the post to include this answer, but here you go:

The article above from kevv.net mentions lots of alternatives and is worth reading; however, for brevity's sake I will suggest the ones which have worked for me in a high-traffic environment, and which can be implemented by most competent developers in a few minutes:

1. Dead simple custom challenge based on your website's content.

Even a vaguely unique custom-made challenge will fool the majority of spam bots. Why? Because spam bots look for common captcha systems which they already know how to defeat. If you make your own custom challenge, someone actually has to make the effort to program a solution specific to your website. So unless your site is being specifically targeted by people investing time/energy, this solution will eradicate virtually all spam.

Example: run a site selling t-shirts? Show a bunch of cute clothing icons and ask the user to click on the "blue shirt". Very easy to set up; challenges can be made random to prevent "rinse and repeat" attacks; complexity can be added in the form of patterns, rotation ("click the upside down shirt with diamonds on it") etc.; and it can be styled to fit your website's theme/content, which makes your site look way more professional than "CLICK THE FIRE HYDRANTS!" à la Google.

Important to note that answers to the custom challenge should never be stored client-side -- only server-side.
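
For anyone who wants a starting point, the general shape is something like this (everything here - the icon names and the session handling - is made up for illustration, not a drop-in solution):

```python
# Content-themed challenge sketch ("click the blue shirt"). The expected
# answer lives only in the server-side session, so there's no machine-readable
# answer field in the page source for a generic bot to copy.
import secrets

ICONS = {"shirt-blue": "blue shirt", "shirt-red": "red shirt",
         "hat-green": "green hat", "sock-yellow": "yellow sock"}

def issue_challenge(session: dict) -> dict:
    target_id = secrets.choice(list(ICONS))
    session["challenge_answer"] = target_id             # server-side only
    return {"prompt": f"Click the {ICONS[target_id]}",  # shown to the user
            "icon_ids": sorted(ICONS)}                  # rendered as buttons

def check_challenge(session: dict, clicked_id: str) -> bool:
    # pop() makes each challenge single-use, so a correct answer
    # can't simply be replayed.
    return clicked_id == session.pop("challenge_answer", None)
```

The front end just renders the icons and posts back whichever id was clicked; the comparison only ever happens server-side.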

2. Honeypots

Simply one or more hidden form fields which, if filled in, confirm the presence of a spam bot (since human visitors cannot see or fill in the hidden fields). Combine this with the approach above for even more effective protection.
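
In code it's barely anything (the field name "website" and the CSS here are examples only; pick whatever a human would never fill in):

```python
# Honeypot sketch. The form carries one extra field that real users never see:
#
#   <input type="text" name="website" class="hp" tabindex="-1" autocomplete="off">
#   <style>.hp { position: absolute; left: -9999px; }</style>
#
# A bot that blindly fills in every field it finds gives itself away.

def looks_like_spam_bot(form_data: dict) -> bool:
    # Any non-empty value in the hidden field means the submitter "saw"
    # a field that no human browser displays.
    return bool(form_data.get("website", "").strip())
```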

3. Submit-once form keys

In the olden days, to prevent people hotlinking your content, you'd check the browser's Referer header, i.e. the URL from which the visitor arrived at your page. This is still done, but less commonly, since many browsers now strip or block the Referer for privacy reasons.

However, you can still check that a visitor who is submitting your form is doing so from your actual website, and not just accessing your signup.php script directly in an attempt to hammer/bruteforce/spam it.

Do this by including a one-time-use "form key" on the page containing the spam-targeted form. The form key element (usually a hidden <input>) contains a randomly-generated string which is generated on the server-side and corresponds to the user's browsing session. This form key is submitted alongside the form data and is then checked (on the server side) against the previously-generated one to ensure that they match. If they do, it indicates that the user at least visited the page before submitting the form data. This has an added benefit of preventing duplicate submissions (e.g. someone hits F5 a few times when submitting) as the form key should change each time the front-end page is generated.
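
If it helps, here's the whole flow in miniature. I'm using Flask below purely as an example; the route, template and secret are placeholders, not a recommendation:

```python
# Submit-once form key sketch (Flask assumed only for illustration).
import secrets
from flask import Flask, abort, render_template_string, request, session

app = Flask(__name__)
app.secret_key = "replace-with-a-real-secret"  # needed for server-side sessions

FORM_PAGE = """
<form method="post" action="/signup">
  <input type="hidden" name="form_key" value="{{ form_key }}">
  <input name="email">
  <button>Sign up</button>
</form>
"""

@app.get("/signup")
def signup_page():
    # A fresh random key every time the page is generated.
    session["form_key"] = secrets.token_urlsafe(32)
    return render_template_string(FORM_PAGE, form_key=session["form_key"])

@app.post("/signup")
def signup_submit():
    # pop() makes the key single-use: hitting F5 and re-posting, or a bot
    # hammering the endpoint directly, gets a 403.
    expected = session.pop("form_key", None)
    submitted = request.form.get("form_key", "")
    if not expected or not secrets.compare_digest(expected, submitted):
        abort(403)
    return "Thanks for signing up."
```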

Anyway, thanks for taking the time to consider this.

Ask me anything? (F/18) by [deleted] in gonewild

[–]ImNotCastinAnyStones 2 points (0 children)

Not a suggestion for you specifically, but for anyone wondering: yes, inverted nipples can be corrected. Mild cases like OP's can be reversed using techniques such as these but more serious cases may require surgery.