Dear fedi, I am thinking of building some LLM scraper bot traps into my website.
One of the ideas is links near the bottom of each blogpost or page, hidden with CSS (so that no human would click them), that when clicked immediately put the client IP address on a naughty list.
I want to understand better how CSS-hidden links work for #Blind visitors and others using screen readers or other assistive technologies.
The last thing I'd want is to inconvenience any human! 
1/3
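For context, a minimal sketch of such a trap link (the URL `/trap/` is a made-up example). One relevant detail for the accessibility question: `display: none` and `visibility: hidden` also remove an element from the accessibility tree, so screen readers generally won't announce it, whereas off-screen tricks like `left: -9999px` keep it exposed to assistive technologies. `aria-hidden` and `tabindex="-1"` are extra insurance either way:

```html
<!-- Sketch of a CSS-hidden trap link; "/trap/" is a made-up example URL. -->
<!-- display:none removes the link from the accessibility tree, so screen
     readers generally skip it; off-screen positioning (left:-9999px) would
     NOT. aria-hidden and tabindex=-1 additionally keep it out of screen
     reader output and keyboard tab order. -->
<style>
  .bot-trap { display: none; }
</style>
<a class="bot-trap" href="/trap/" aria-hidden="true" tabindex="-1" rel="nofollow">
  Do not click this link: it will blacklist your IP address.
</a>
```

Note that visitors browsing with CSS disabled (or in HTML-only browsers) will still see the link, so a human-readable warning as the link text is a sensible belt-and-braces measure.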
I am obviously doing research about this myself, so no, I am not just throwing this out into the void and expecting free advice.
But, if you do have advice or suggestions or notes or input here, I would love to hear it. 
Especially if you use assistive technologies, or have experience/expertise in that area.
If you are building your own LLM scraper or are otherwise an AI bro, and have Opinions about the Open Web, please feel free to go suck a lemon instead.
2/3
@rysiek@mstdn.social For your information, some people also have CSS disabled in their browser, or use HTML-only browsers that do not support CSS, so they would also be at risk of being caught by the trap.
@Chishiki611
I quite often switch off CSS myself, to read static text and images, because so many sites are hard to read at all with all the dynamic gimmicks: content invisible unless some action happens, everything "dynamically" rearranged.
What I strongly suggest is a warning text that humans can easily understand, like "do not click the following link, it will blacklist you".
It would be AMAZING if someone could build a CSS modifier to make pages static.
> It would be AMAZING if someone could build a CSS modifier to make pages static.
Not sure I follow? Static in the sense of animations and transitions and such?
@rysiek@mstdn.social @Laberpferd@sueden.social That is correct IIRC.
@Chishiki611 @Laberpferd hmmm, I *think* this could be done with a few lines of CSS.
Something like:
* {
  backdrop-filter: none !important;
  transition: none !important;
  animation: none !important;
}
(added the backdrop filter thing because blur is one of my pet peeves)
This could be made into a bookmarklet I'm sure.
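A rough sketch of what that bookmarklet could look like: the same CSS rules as above, injected into the current page via a `<style>` element. Treat it as a starting point under the assumption that killing transitions/animations is the goal, not a polished tool:

```javascript
// Build the "make it static" CSS as a string (same rules as the snippet above).
function makeStaticCss() {
  return [
    "* {",
    "  backdrop-filter: none !important;",
    "  transition: none !important;",
    "  animation: none !important;",
    "}",
  ].join("\n");
}

// In a browser, a bookmarklet would inject it into the current page.
// As a bookmark URL it would be the one-liner form, roughly:
// javascript:(function(){var s=document.createElement('style');
//   s.textContent='*{backdrop-filter:none!important;transition:none!important;animation:none!important}';
//   document.head.appendChild(s);})();
```

The `!important` flags are doing the heavy lifting here: they let the injected rules win over the page's own stylesheets regardless of specificity.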
@Chishiki611 @rysiek
I'll try to explain this in more detail later.
Basically I mean websites where most sub-parts (hidden or blurred pictures, menus, further links) are hidden by default, and when you move the mouse over trigger spots or click buttons, the page gets rearranged to show some of them.
Often navigation and finding subpages are not obvious at all.
That's where I almost universally switch off CSS, so that I can see a raw page with all the text and pictures as a static document.
@Laberpferd ah I see, so it's less about animations and transitions, and more about making all the elements available/visible immediately and at the same time.
Yeah, that's… not trivial to do.
@rysiek
For myself, I have far fewer issues with transitions and animations, which are only a local pollution in an otherwise unchanged layout.
What primarily exhausts me is when the layout itself changes and I have to search and guess where elements might be right now (if visible at all).