On 22 Mar 2026, at 14:13, Beni Keller via swinog <swinog@lists.swinog.ch> wrote:
[..]
$ curl -6 https://sbb.ch -v
- Host sbb.ch:443 was resolved.
- IPv6: 2a00:4bc0:ffff:9::c296:f58e
- IPv4: (none)
- Trying [2a00:4bc0:ffff:9::c296:f58e]:443...
- connect to 2a00:4bc0:ffff:9::c296:f58e port 443 from
2001:8e0:1426:1:47d6:12fd:bc19:5704 port 55992 failed: Connection timed out
Note that it says "timed out" rather than "refused" or "unreachable": the name resolved and nothing actively rejected the connection, the packets simply vanish somewhere along the path.
Do you have an odd MTU maybe, or some other MTU-related issue?
They have a fun WAF thing in front, so even a mere 'wget' gets rejected.
But I recall that in the past they also had MTU issues, which is more likely your problem.
At the moment, and in my usage over the last few years, it has just worked over IPv6.
Definitely run Wireshark in the background, then open the website in a normal browser and check what packets you see; that might just give you the hint that some packets go out while the replies never come back.
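To illustrate the MTU failure mode suspected above (a sketch with made-up numbers; the 1480-byte hop is purely hypothetical): the TCP handshake packets are tiny and fit through any link, but full-size TLS records sized for a 1500-byte MTU will not fit through a smaller hop, and if the ICMPv6 "Packet Too Big" messages are filtered, those packets are silently lost.

```python
# Rough arithmetic behind the PMTUD failure mode: small handshake
# packets fit everywhere, full-size TLS records do not.
IPV6_HEADER = 40  # bytes, fixed-size IPv6 header
TCP_HEADER = 20   # bytes, TCP header without options

def max_payload(mtu: int) -> int:
    """Largest TCP payload that fits in one packet at this MTU."""
    return mtu - IPV6_HEADER - TCP_HEADER

link_mtu = 1500    # what the client believes its path supports
tunnel_mtu = 1480  # hypothetical smaller hop (e.g. a tunnel)

print(max_payload(link_mtu))    # 1440
print(max_payload(tunnel_mtu))  # 1420

# A segment sized for MTU 1500 (1440 bytes of payload) exceeds what
# the 1480-byte hop can carry; with ICMPv6 "Packet Too Big" filtered,
# the sender never learns this and the segment simply disappears.
print(max_payload(link_mtu) > max_payload(tunnel_mtu))  # True
```

This is exactly the pattern a packet capture would show: the SYN/ACK exchange succeeds, then the connection stalls as soon as a full-size packet is sent.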
Greets, Jeroen
% wget https://www.sbb.ch/robots.txt
--2026-03-23 09:08:10--  https://www.sbb.ch/robots.txt
Resolving www.sbb.ch (www.sbb.ch)... 2600:9000:20a5:6600:2:5597:5ac0:93a1, 2600:9000:20a5:8800:2:5597:5ac0:93a1, 2600:9000:20a5:4800:2:5597:5ac0:93a1, ...
Connecting to www.sbb.ch (www.sbb.ch)|2600:9000:20a5:6600:2:5597:5ac0:93a1|:443... connected.
HTTP request sent, awaiting response... 403 Forbidden
2026-03-23 09:08:10 ERROR 403: Forbidden.
Sending a 403 on robots.txt, presumably to block robots, just has the adverse effect that robots get no robots.txt guidance at all, and.... guess what: with no guidance they will proceed with their fun....
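That "no guidance means go ahead" behaviour can be demonstrated with Python's stdlib robots.txt parser (a sketch; "AnyBot" and the URL path are made up): with no usable rules, every fetch is considered allowed.

```python
from urllib.robotparser import RobotFileParser

# Sketch: what a crawler concludes when robots.txt yields no
# usable rules, as when the file is unreadable or empty.
rp = RobotFileParser()
rp.parse([])  # no guidance at all

# With no rules parsed, everything is fair game.
print(rp.can_fetch("AnyBot", "https://www.sbb.ch/anything"))  # True
```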
% wget -U "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/146.0.0.0 Safari/537.36" https://www.sbb.ch/robots.txt
--2026-03-23 09:07:55--  https://www.sbb.ch/robots.txt
Resolving www.sbb.ch (www.sbb.ch)... 2600:9000:20a5:6600:2:5597:5ac0:93a1, 2600:9000:20a5:8800:2:5597:5ac0:93a1, 2600:9000:20a5:4800:2:5597:5ac0:93a1, ...
Connecting to www.sbb.ch (www.sbb.ch)|2600:9000:20a5:6600:2:5597:5ac0:93a1|:443... connected.
HTTP request sent, awaiting response... 200 OK
Cookie coming from www.sbb.ch attempted to set domain to sbb.ch
Length: 2984 (2.9K) [text/plain]
Saving to: ‘robots.txt’
robots.txt 100%[===========================>] 2.91K --.-KB/s in 0s
2026-03-23 09:07:55 (41.8 MB/s) - ‘robots.txt’ saved [2984/2984]
btw:

User-agent: Mediapartners-Google
Disallow: # auf allen Seiten erscheint Werbung ("advertising appears on all pages")
Really.... never seen that advertising, thanks to a simple user-side User-Agent "WAF" called an ad blocker ;)
User-agent: *
Disallow: # alles darf indexiert werden ("everything may be indexed")
Actually, `Disallow:` without a path is the standard way of allowing everything: per the robots exclusion convention (now RFC 9309), an empty Disallow value means nothing is disallowed. So that rule itself is valid, which is why SBB shows up in search engines just fine.
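For what it's worth, Python's stdlib parser agrees on the empty-Disallow semantics (a sketch; "SomeBot" is a made-up crawler name):

```python
from urllib.robotparser import RobotFileParser

# Sketch: how a standards-following parser reads a
# "User-agent: *" group with an empty Disallow value.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow:",  # empty path: nothing is disallowed
])

# The empty Disallow rule matches everything as "allowed".
print(rp.can_fetch("SomeBot", "https://www.sbb.ch/en/"))  # True
```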
But it is a bit odd to block things based on User-Agent at the WAF level and then not have any explicit rules in robots.txt.... oh well ;)