There may actually be some utility here. LLM agents refuse to traverse the links. Tested with gemini-3-pro, gpt-5.2, and opus 4.5.
edit: gpt-oss 20B & 120B both eagerly visit it.
I wish this came a day earlier.
There is a current "show your personal site" post on top of HN [1] with 1500+ comments. I wonder how many of those sites are or will be hammered by AI bots in the next few days to steal/scrape content.
If this can be used as a temporary guard against AI bots, that would have been a good opportunity to test it out.
1. https://news.ycombinator.com/item?id=46618714
AI bots (or clients claiming to be one) appear quite fast on new sites; at least, that's what I saw recently in a few places. They probably monitor Certificate Transparency logs - you won't hide by avoiding linking. Unless you are OK with staying in the shadow of naked HTTP.
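For the curious, that kind of monitoring is straightforward: CT logs are public, and services like crt.sh expose them as JSON. A minimal sketch of pulling hostnames out of such entries - the field name `name_value` mirrors crt.sh's output as I understand it, so treat the JSON shape as an assumption:

```python
import json

def hostnames_from_ct_entries(entries):
    """Collect unique hostnames from crt.sh-style JSON entries.

    Each entry's 'name_value' field may hold several newline-separated
    names, including wildcards like *.example.com.
    """
    names = set()
    for entry in entries:
        for name in entry.get("name_value", "").splitlines():
            names.add(name.strip().lower())
    return sorted(names)

# Sample shaped like a crt.sh ?q=%.example.com&output=json response.
sample = json.loads("""[
  {"name_value": "blog.example.com\\nexample.com"},
  {"name_value": "*.example.com"}
]""")

print(hostnames_from_ct_entries(sample))
```

A scraper polling the log sees every newly issued name almost immediately, which is why an unlinked subdomain with its own certificate is not actually hidden.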
Get a wildcard cert and use it behind a reverse proxy.
Okay, but then what? Host your sites on something other than 'www' or '*', exclude them from search engines, and never link to them? Then, for the few people who do resolve these subdomains, you just have to hope they don't use a DNS server owned by a company with an AI product (like Google, Microsoft, or Amazon)?
I really don't know how you're supposed to shield your content from AI without also shielding it from humanity.
I posted my site on the thread.
My site is hosted on Cloudflare, and I trust its protection way more than a flavor-of-the-month method. This probably won't be patched anytime soon, but I'd rather have some people click my link than have everyone avoid it along with the AI because it looks fishy :)
FYI, Cloudflare protection doesn't mean much nowadays if someone is even slightly determined to scrape the site.
Unless you mean DDoS protection; that one helps for sure.
Yeah, I meant using it as an experiment to test with two different links (or domains), not as a solution to evade bot traffic.
Still, I think it would be interesting to know if anybody noticed a visible spike in bot traffic (especially AI) after sharing their site info in that thread.
I didn't: no traffic before sharing, none since.
I've been considering how feasible it would be to build a modern form of the Low Orbit Ion Cannon denial-of-service tool by having various LLMs hammer sites until they break. I'm sure anything important already has Cloudflare-style DDoS mitigation, so maybe it's not as effective. Still, I think it's only a matter of time before someone figures it out.
There have been several amplification attacks using various protocols for DDoS too...
Glad I’m not the only one who felt icky seeing that post.
I agree. My tinfoil-hat signal told me this was the perfect way to ask people for bespoke, hand-crafted content - which of course AI will love to slurp up to keep feeding the bear.
I think that something specifically intended for this, like Anubis, is a much better option.
Anubis flatly refuses me access to several websites when I'm accessing them with normal Chromium with JS enabled and whatnot, from a mainstream, typical OS, just with aggressive anti-tracking settings.
Not sure if that's the intended use case. At least Cloudflare politely asks for a CAPTCHA.
What do you mean "refuses"? The worst it should do is serve up a high-difficulty proof of work. Unless it gained new capabilities recently?
Are you sure the block isn't due to the authors of those websites using some other tool in addition?
I thought that Anubis is solely proof of work, so I'm very curious as to what's going on here.
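For reference, an Anubis-style challenge is just client-side proof of work: the server hands out a nonce and a difficulty, and the browser searches for a counter whose hash clears it. A minimal sketch of the mechanism - this mirrors the idea only, not Anubis's actual wire format or parameters, which I haven't verified:

```python
import hashlib

def solve(challenge: str, difficulty_bits: int) -> int:
    """Find a counter so that sha256(challenge + counter) has at least
    `difficulty_bits` leading zero bits. The page loads only after the
    client has burned this CPU."""
    target = 1 << (256 - difficulty_bits)
    counter = 0
    while True:
        digest = hashlib.sha256(f"{challenge}{counter}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return counter
        counter += 1

def verify(challenge: str, difficulty_bits: int, counter: int) -> bool:
    """Server-side check: one hash, regardless of how hard solving was."""
    digest = hashlib.sha256(f"{challenge}{counter}".encode()).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

nonce = solve("example-nonce", 12)  # ~4096 hashes on average
assert verify("example-nonce", 12, nonce)
```

A determined scraper can solve this too; it just pays a CPU cost per page. That's why an outright refusal, rather than a slow challenge, would be surprising behavior for pure proof of work.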
Of course, the downside is that people might not even see your site at all because they’re afraid to click on that suspicious link.
The site should add a reverse lookup: provide the poison and the antidote.
Bitly does that; just add '+' to the Bitly URL (probably other shorteners, too).
How is AI viewing content any different from Google? I don’t even use Google anymore because it’s so filled with SEO trash as to be useless for many things.
Try hosting a cgit server on a 1u server in your bedroom and you'll see why.
LLM-led scraping might not, as it requires an LLM to make a choice to kick it off, but crawling for the purpose of training data is unlikely to be affected.
Sounds like a useful signal for people building custom agents or models. Being able to control whether automated systems follow a link via metadata is an interesting lever, especially given how inconsistent current model heuristics are.
I think it’s perfectly reasonable to make something useless for fun, it’s an interesting idea.
But what I’d like to understand is why there are so many of the same thing. I know I’ve seen this exact idea multiple times on HN. It’s funny the first time, but once it’s done once and the novelty is gone (which is almost immediately), what’s the point of another and another and another?
I think it's just someone learning something new most of the time.
I have home-made URL shorteners in Go, Rust, Java, Python, PHP, Elixir, TypeScript, etc. Why? Because I'm trying out the language, and this kind of project touches on many things: web, databases, custom logic, and which design patterns I can apply, using as much of the language as I can to build the thing.
Right. But the question is why redo the exact same joke? Why not come up with another twist (like the URL lengthener) or do no twist but be useful?
I’m not criticising the author or anyone who came before. I’m trying to understand the impetus behind redoing a joke that isn’t yours. You don’t learn anything new by redoing the exact same gag that you wouldn’t learn by being even slightly original or making the project truly useful.
Ideas are a dime a dozen. You could make e.g. a Fonzie URL shortener (different lengths of “ayyyyy”), or an interstellar one (each is the name of a space object), or a binary one (all ones and zeroes)… Each of those would take about the same effort and teach you the same, but they’re also different enough they would make some people remember them, maybe even look at the author and their other projects, instead of just “oh, another one of these, close”.
If you're learning, it's better to recreate something exactly as it is, so that you have something against which to verify your output. Plus, not everyone is an idea person, and I'd wager that most devs are implementation people, not idea people.
I’d argue that if you’re learning and are so inexperienced that you need to recreate something exactly, you should instead recreate something real and useful, of which there are more examples, rather than a joke.
Plus, I don’t think I’ve seen another of these which is exactly like this (just extremely close in concept), so the argument doesn’t hold.
A joke isn’t the best example, because there are jokes that never change but whose delivery is a sign of mastery. The Aristocrats is like Bach’s cello suites for comedians.
The Aristocrats is a special case where the setup is the joke instead of the punchline. The point is the inventiveness of the journey. If it was told with the same setup every time, it wouldn’t be funny.
I actually forgot that this had been done before until you mentioned it.
Giving the author the benefit of the doubt, they may not have seen it before, or were bored and just wanted to make a toy.
And it seems like many on HN are in a similar enough boat to me to have upvoted it to trending, so at least some people found it entertaining; it fulfilled its purpose, I suppose.
It's a good question though, and I don't think anyone really knows the answer.
I’ve been browsing this site for a decade plus and this idea was new to me. Maybe the author is in the same boat.
Edit: I see references to ShadyURL in the comments, and I have heard of that, but probably wouldn’t have thought of it.
Fair. I’d think they would look for prior work beforehand, but that’s perfectly valid.
https://xkcd.com/1053/
Again, this was not a criticism, but a genuine question.
A fun project doesn't need to be original, IMO.
URL Shortener is still one of the most popular System Design questions, building this project is a great way to have some experience / understanding of it, for example.
> A fun project doesn't need to be original, IMO.
I agree. But a URL shortener with a twist isn’t just fun, it’s funny. The joke—as opposed to the usefulness—is what’s interesting about it. But when the same joke is overdone, it’s no longer funny.
> building this project is a great way to have some experience / understanding of it
https://news.ycombinator.com/item?id=46632329
One reason is that not all these websites manage to make equally "creepy" links, even though the basic idea is the same. I remember one version which was a lot more alarming than the current example, with links containing a mix of suspicious content hinting at viruses, phishing, piracy/warez sites, pornography (XXX cams), and Bitcoin scams. I don't remember that website, but the current case seems rather weak by comparison.
That makes it even more confusing. If you’re making something creepy, I can see the argument for “whatever exists isn’t creepy enough, I’ll do it better” but not the reverse.
It's possible the current website is older, or that the creator doesn't know about better alternatives. (Also, they do produce rather short links, unlike some of the others, which don't pass as "URL shorteners". Though not sure whether that's relevant.)
Related: A URL shortener not shortening the URL but makes it look very dodgy (434 points, 2023, 100 comments) https://news.ycombinator.com/item?id=34609461
That's less a URL shortener and more a URL dodgifier.
To be fair, the one in the OP also did not shorten any of the links I gave it.
The key point here is "not shortening"
My favorite link of all time:
https://jpmorgan.c1ic.link/logger_zcGFC2_bank_xss.docm
Definitely not meta
I got one where the called script ended in ".pl" and I had a flashback to the 90s. My trousers grew into JNCOs, Limp Bizkit started playing out of nowhere and I got a massive urge to tell Slashdot that Alan Thicke had died.
With Firefox on Android it simply says
Deceptive site issue
This web page at [...] has been reported as a deceptive site and has been blocked based on your security preferences.
What's going on? I can't find any setting to disable this.
NextDNS is blocking it too (https://google.c1ic.link/lottery_qrdLCz_account_verification). The reason is that Google Safe Browsing considers that site as unsafe.
TBF, it ought to trigger even the simplest heuristics, so it wouldn't surprise me if it was automatically categorized that way.
Imagine using this as your personal website lol
email too
BRILLIANT! Even Chrome says nope/DANGEROUS to a creepified link to mail.google.com
What's up with the creepy ads on this website? It seems like they are actually sketchy ads and not just fake ads for comedic effect. One shows some scammy nonsense about your device being infected and the other links to a real VPN app.
That's just the ambient creepiness of the internet. It's a creepy place!
This is probably the result of a context-based ad network serving sketchy ads because of the suspicious URL content.
I would also like to have something like this, but for "vintage" links - something that looks like it was from the late 90s.
I use them in tests, just for fun: https://github.com/ClickHouse/ClickHouse/blob/master/tests/q...
There was a "shadyurl". The site itself seems to be long gone, but this'll give you some context: https://www.mikelacher.com/work/shady-url/
There's an example shadyurl link in here: https://news.ycombinator.com/item?id=14628529
Funnily enough the domains appear to have been bought up and are now genuinely shady.
IIRC, shadyurl was the original version of this. Doesn't seem to be around anymore, though.
ShadyURL had a whole bunch of different, incredibly shady domains that were used at random. It was beautiful.
I can't tell if the website works as advertised because I don't want to open the generated links
https://jpmorgan.c1ic.link/G4JQKX_money_request.dll
https://jpmorgan.web-safe.link/flash_7KzCZd_money_request
I love this version and I hope you do too.
well played sir
I'm not sure what the use case for this is, but I've been using it as an inefficient messaging service with my girlfriend, e.g.:
https://c1ic.link/campaign_WxjLdF_login_page_2.bat
You seem to be able to encode arbitrary text, so long as it follows [A-Za-z0-9]+\.[A-Za-z0-9]+
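The generated slugs quoted in this thread all share the same shape (alarming keyword, short random token, suspicious extension), which is easy to imitate. A toy generator, with entirely made-up word lists since the site's real vocabulary is unknown:

```python
import re
import secrets
import string

# Hypothetical keyword/extension pools; the real site's lists are unknown.
KEYWORDS = ["install", "login_page", "account_verification", "free_vacation_offer"]
EXTENSIONS = ["zip", "bat", "vbs", "docm", "dll"]

def creepy_slug() -> str:
    """Build a slug shaped like 'install_Jy7NpK_private_video.zip':
    keyword, 6-char random token, scary extension."""
    token = "".join(secrets.choice(string.ascii_letters + string.digits)
                    for _ in range(6))
    return f"{secrets.choice(KEYWORDS)}_{token}.{secrets.choice(EXTENSIONS)}"

# Every slug matches the loose pattern noted above (plus underscores).
assert re.fullmatch(r"[A-Za-z0-9_]+\.[A-Za-z0-9]+", creepy_slug())
```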
Please don’t use 3rd-party relays for your URLs. It’s bad enough to have your own server, domain, etc. as single points of failure and bottlenecks without adding a 3rd party into the mix, one who may track users (themselves, or whoever takes over their domain later), randomly redirect your users to a malicious site, or just fail.
I know people have fond memories of long ago, when they thought surely some big company’s URL shortener would never be taken down, and learned otherwise when it later was.
This! I've run into very frustrating examples of legit sites doing that, for no defensible reason at all.
For example, the healthcare.gov emails. For links to that domain, they would still transform them with lnks.gd, even though:
1) The emails would be very long and flashy, so they're clearly not economizing on space.
2) The "shortened" URL was usually longer!
3) That domain doesn't let you go straight to the root and check where the transformed URL is going.
It's training users to do the very things that expose them to scammers!
This had to be done:
https://wellsfargo.c1ic.link/TODO_obfuscate_url_8wyS7G_hot_s...
Thought this might be it... I clicked it anyway, haha. I will need to update the Rick-roll URL on my NFC implant with this new link!
It would've been top-notch if it sometimes actually used Outlook/O365 or a similar vendor's "safelinks" redirector.
Fantastic! I miss the original ShadyURL.
https://news.ycombinator.com/item?id=31386108
I wouldn't call it a shortener, since most of the links it creates are longer than the originals.
What would be a good name here? A URL redirector?
It's an asymptotic link shortener
Same here: I took a six-character URL, and it turned into one at least ten characters long.
URL lengthener. :D
Huh, https://looooooooooooooooooooooooooooooooooooooooooooooooooo... comes to mind.
Thanks!
link obfuscator
The other day in a Facebook Messenger group chat I tried to link to https://motherfuckingwebsite.com/ as a joke, but Messenger kept blocking it. It's quite overzealous with its blocking.
For funsies I shortened https://creepylink.com
And got: https://c1ic.link/account_kPvfG7_download_now.bat
I also tried that and got https://twitter.web-safe.link/BUuLrg_document.zip
Squared:
https://c1ic.link/ad_k9OFWW_redeem_gift.bat
This is legit! If you disable your adblock you even get a suspicious ad
This is great. It created a link to my personal site that Firefox blocked me from going to.
Saw this on relaunched Digg and figured HN would appreciate it.
Now that's a name I've not heard in a long time
I don't appreciate how AI generated this website looks.
It seems appropriate that, for a website whose purpose is to make links which raise your suspicions, the visual design itself also raises your suspicions.
Just looks like every other generic framework oriented site.
Which bit are you getting an AI smell from?
gradient background, card, button
Have you looked at a website in the last 10 years?
Perhaps, but nearly every tutorial in all the modern frameworks demonstrates this exact style.
Digg is back?
Edit: looks like you need an invite code.
Bummer
This is fun. Is it not checking for previously submitted URLs though? I can seemingly re-submit the exact same URL and get a new link every time. I would expect this to fill the database unnecessarily but I have no idea how the backend works.
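One way a backend could avoid storing duplicates is content-addressed slugs: derive the short path from a hash of the normalized target, so resubmitting the same URL yields the same link. A sketch only - the site's actual backend is unknown:

```python
import hashlib

def slug_for(url: str, length: int = 8) -> str:
    """Map a URL to a stable slug: normalize lightly, then take a
    prefix of its SHA-256 hex digest. The same input URL always gets
    the same short link, so re-submissions add no new rows."""
    normalized = url.strip().rstrip("/").lower()
    return hashlib.sha256(normalized.encode()).hexdigest()[:length]

assert slug_for("https://example.com/") == slug_for("https://EXAMPLE.com")
assert slug_for("https://example.com") != slug_for("https://example.org")
```

The trade-off is that the joke slug can no longer vary per submission, which may be exactly why the site doesn't do it.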
Am I missing something, or would these essentially be implemented via DNS records? It's not clear to me that keeping the links in a database would be necessary at all (unless the DNS records are what you mean by "database")
DNS is only for resolving the host part; the path does not pass through a DNS query.
In example.com/blah, the /blah part is interpreted by the host itself.
And apart from that I would indeed consider DNS records a database.
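To make the split concrete, Python's urllib shows which piece of a URL a resolver ever sees:

```python
from urllib.parse import urlsplit

parts = urlsplit("https://example.com/blah?x=1")
# Only the hostname is sent to DNS; the path and query travel inside
# the HTTP request to whatever address the name resolved to.
print(parts.hostname)  # example.com
print(parts.path)      # /blah
```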
Yeah but have fun explaining yourself to the police when the author abandons the project and an actual scammer ends up buying up all those domains.
Just wondering: so you bought c1ic.link and web-safe.link? That's very cool.
I like how the old-school HN comment section does not care about creepy links at all. Or links, for that matter.
Is this suspicious: https://microsoft.c1ic.link/0B7jqd_invoice.vbs ?
I think Microsoft have their own version of this
Msn.com
Office.com
Sharepoint.com
Hotmail.com
Etc, plus all the subdomains they insert before them. It makes it very easy to create phishing emails that look plausible.
microsoftonline.com is one of my favorites. Like how can you look any more scammy :D
Firefox is freaking out on some of these. It's hilarious.
I am using these creepy links to share content with office people.
Please take my upvote. :)
Haha, it's fun. Just thinking: is there some place where creepy links would be better?
I've been at a company that internally sends out fake links that log the user and link to an educational page on internet safety.
I honestly don't mind too much, since it's a once-a-year thing (Hacktober), and honestly, companies should be trying to catch out employees who click any and all links.
We used to have fun hammering millions of requests to such URLs from a VPS when they would send such emails to role mailboxes.
Eventually we got asked to please make it stop. I asked them to please stop sending fake phishing emails to robots.
why do creepy links look creepy?...
/instagram.c1ic.link/mCLIIp_free_vacation_offer.zip
Use case? Besides humor and phishing tests
Fun!
I can just say thanks
This is the best article on Wikipedia!
https://c1ic.link/bzSBpN_login_page_2
Edit: Chrome on Android warned me not to visit the site!
For humour I shortened "https://www.facebook.com/"
And got https://twitter.web-safe.link/root_4h3ku0_account_verificati...
I added google.com and it spit out https://twitterDOTc1icDOTlink/install_Jy7NpK_private_videoDOTzip
Interesting that it spit out a .zip URL. I was not expecting that, so I changed all the “.” to “DOT” so I don’t get punished for posting a spammy link, despite this literally being a website to make links as spammy and creepy as possible.
punished by whom?
lol, I'm not clicking a .vbs link
It is hilarious, and I'm not clicking any link lol.
lol
Please don't make any more URL shorteners, they are just a bad idea.
https://wiki.archiveteam.org/index.php/URLTeam
I always end up making my own, they're so simple to write.
It saves using one of the "free" ones which looks like it's free, but you're actually on a free trial; then you can't access your links after that trial expires.
Way to miss the point of the project!