That’s a whole lot of link rot about to happen.
I have probably saved hundreds of links from such a fate in my org. People there use them for everything, even in media where links are clickable (i.e. not going out to print, where someone would have to type them in).
Thankfully, I’m in a position to un-shorten them before they get published. lol
It might be interesting for a search engine, or anyone else who has crawled a massive list of links visible online, to generate the unshortened forms now, before Google shuts down the service.
https://wiki.archiveteam.org/index.php/URLTeam
Thanks. It says that there are already browser plugins that use their database, so looks like there’s already a way on both the scraper and user ends to programmatically avoid link rot here.
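For anyone curious what the scraper-side fix could look like: a minimal sketch of batch un-shortening, assuming you have a short-URL-to-target mapping like the one the URLTeam project distributes (the mapping and URLs below are made up for illustration, not the project's actual data format or API):

```python
import re

# Hypothetical resolved-link table, as one might build from an
# URLTeam/ArchiveTeam database dump (entries here are invented).
RESOLVED = {
    "https://goo.gl/abc123": "https://example.org/long/article",
}

# Match goo.gl short links in a body of text.
SHORTENER_RE = re.compile(r"https?://goo\.gl/\w+")

def unshorten_text(text: str, resolved: dict) -> str:
    """Replace known short links with their long forms; leave unknown ones as-is."""
    return SHORTENER_RE.sub(lambda m: resolved.get(m.group(0), m.group(0)), text)

doc = "See https://goo.gl/abc123 for details."
print(unshorten_text(doc, RESOLVED))
```

Running the rewrite before publication, as described above, means readers never depend on the shortener staying alive.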
The Jedis are going to feel this one