Technically, wicar.org itself is safe and should not be blocked by Google, while a block on malware.wicar.org matters much less, since that subdomain really does host the dangerous content.
The purpose of WICAR is to let you safely test network defences. For example, a corporation might have firewall IPS signatures, then an AV-scanning HTTP proxy, then a content filter, then desktop AV. Users can visit the WICAR website to see which of their security controls “see” the threat and choose to block it. If they all fail, the test payload may actually reach a desktop and execute. Given that possibility, it is desirable for WICAR to serve safe payloads which merely execute calc.exe, rather than having organisations test against a public blacklist of actually infected websites, where e.g. Cryptolocker ransomware might infect the organisation if newly deployed security technology doesn’t work as expected.
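As a rough illustration of that workflow, here is a small Python sketch (not an official WICAR tool) that probes test URLs and guesses where in the chain a request was stopped. The URL paths are assumptions for demonstration, not a definitive list of WICAR payloads:

```python
# Hedged sketch: probe malware test pages and report a rough verdict on
# where the defence chain intervened. URL paths below are illustrative
# assumptions, not guaranteed WICAR paths.
import urllib.request
import urllib.error

TEST_URLS = [
    "https://malware.wicar.org/data/eicar.com",            # assumed path
    "https://malware.wicar.org/data/java_jre17_exec.html", # assumed path
]

def classify(status, error=None):
    """Map an HTTP result to a rough verdict on the defence chain."""
    if error is not None:
        return "blocked (connection refused/reset - likely IPS or inline proxy)"
    if status == 200:
        return "payload delivered - check whether desktop AV caught it"
    if status in (403, 451):
        return "blocked by content filter / proxy policy"
    return f"unexpected status {status}"

def probe(url):
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return classify(resp.status)
    except urllib.error.HTTPError as e:
        return classify(e.code)
    except OSError as e:
        return classify(None, error=e)

# Requires network access, so left commented out:
# for url in TEST_URLS:
#     print(url, "->", probe(url))
```

A 200 only means the payload reached the endpoint; the interesting follow-up is whether desktop AV then quarantines it on write or execution.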
However, the malware host must continue to serve the fake payloads, and SSL via Let’s Encrypt was added so that visitors can also test their SSL-inspection anti-malware defences. SSL/TLS is often abused to deliver browser attacks that sneak past IDS/IPS/AV appliances which cannot strip SSL for analysis, and (thanks to Let’s Encrypt!) HTTPS is becoming the norm.
Many thanks for the responses - first post, but obviously a great community!
@stevenzhu I agree with all points. Happy to use a commercial CA too - I’m assuming this eventuality, and I’m sure they won’t care (though it is good that LE does some basic checks!). But the lack of any bypass evidently has some unusual edge cases such as this one. I am also happy to let the certificate expire - but my concern is that an organisation might have a strict browser policy where users cannot bypass an expired cert, yet still want to test their anti-malware technology. Sysadmins telling management “we are immune” merely because of an expired cert doesn’t validate that in-line appliances / signatures / heuristics are working.
I did follow up with Google and await a human response, but I’m doubtful. Safe Browsing marks them as “Send visitors to harmful websites”, which is amusing: the parent domain is okay yet flagged, while the subdomain gets the same verdict and is the one actually hosting the questionable content. Seems like a failure of automated classification.
An alternative solution might be to register an additional domain, so that the clean and unclean sites live on distinct domains and the clean one is not flagged by the wildcard inheritance which appears to have occurred here.
The other question is whether EICAR itself should be flagged, as WICAR has been. EICAR is well known globally, but shouldn’t working AV and Googlebot still list it as a problem? I suspect it is manually exempted by vendors.
FWIW, wicar.org being blocked is a good thing™. But it has taken years for Google / Chrome / AV to catch up. As a newly registered URL which hadn’t yet been classified, it was surprising/worrying how many desktop AV solutions either didn’t notice or only picked up the HTML files when written to disk… which, for large corporations protecting thousands of users, is the last thing you want to see after tens of millions have been invested in cutting-edge packet inspection and in-line appliances. It demonstrated that HTTP(S) defence is primarily URL or domain blacklisting alone rather than malicious content inspection, and won’t protect, in a real-world scenario, against a skilled adversary who registers fresh domains to achieve compromise.
Very interesting case @aushack! Thanks for bringing it to our attention. Presently there is no mechanism in Boulder (the Let’s Encrypt CA software) to exempt domains from the Google Safebrowsing check. I think the bar to justify building this into Boulder will be pretty high. We’re generally nervous about features that aren’t broadly applicable (this is a pretty narrow corner case) or that change decisions in the issuance pipeline.
@aushack Do you think there is value in having wicar.org considered a true positive by Google Safe Browsing, or is the challenge simply finding someone to get it removed? It seems to me that it’s perhaps better to work on getting GSB to consider this a benign website and let subsequent systems make a determination based on the content of the site. If you agree we can try to work together to escalate this with a human being inside of Google. Out of curiosity, when did you submit your follow-up with Google? Has it been without a reply for more than a week?
I’ll discuss this case with the other Boulder engineers this afternoon & see what other folks think.
The consensus here was that this should be addressed by GSB. If that doesn’t work out I think your best path forward will be acquiring a certificate from a CA that has the staffing/process in place to handle exceptional circumstances with manual overrides. I know that’s an unfortunate outcome so I’m hopeful that Google Safe Browsing would consider your request to be whitelisted.
I originally disputed the GSB block via the automated form, with details, and the appeal was rejected 24-48 hours later. In that case I suspect they maybe didn’t read / understand it, so maintained the block. I’m sure the GSB code just checks whether the URLs still return 200 with an unchanged hash and defaults to maintaining the block.
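To make that suspicion concrete, here is a speculative sketch of the re-review logic I’m guessing at - the function names are mine, not Google’s, and this is purely illustrative of “still 200 + unchanged hash = keep the block”:

```python
# Speculative sketch of an automated re-review: if the flagged URL still
# returns 200 and its content hash matches the hash recorded when it was
# flagged, default to maintaining the block. Not Google's actual code.
import hashlib

def should_maintain_block(status_code: int, body: bytes, flagged_sha256: str) -> bool:
    """Keep the block if the page is still up and byte-identical."""
    if status_code != 200:
        return False  # page gone; candidate for lifting the block
    return hashlib.sha256(body).hexdigest() == flagged_sha256

flagged = hashlib.sha256(b"<html>test payload</html>").hexdigest()
# Same content still served -> block maintained:
assert should_maintain_block(200, b"<html>test payload</html>", flagged)
# Content changed or page removed -> candidate for re-review:
assert not should_maintain_block(200, b"<html>cleaned</html>", flagged)
assert not should_maintain_block(404, b"", flagged)
```

Under that model, a site like WICAR that deliberately keeps serving identical test payloads will never clear an automated re-check, which would explain why only a human review can help here.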
I sent an email to Google as well, but no response yet. If you know someone that is willing to take a few minutes to actually consider the situation, that would be really great.
Bypassing HTTPS wasn’t necessary for Malwarebytes to trigger. One should also know that many “SSL inspection” products, even when not used to deliberately spy on end users, often have shortcomings of their own, such as protocol downgrades, failure to properly verify certificates, etc.
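One cheap way to spot such a box re-signing your traffic is to compare the issuer of the certificate you actually receive against the issuer you expect. A minimal Python sketch, assuming the Let’s Encrypt “R3” intermediate as the expected issuer (an illustrative assumption; the real expected issuer depends on the site):

```python
# Sketch: detect SSL-inspection interception by checking whether the
# presented leaf certificate was issued by the CA we expected, using the
# RDN-tuple format returned by ssl.SSLSocket.getpeercert().
import ssl
import socket

def issuer_common_name(cert: dict) -> str:
    """Extract the issuer CN from a getpeercert()-style dict."""
    for rdn in cert.get("issuer", ()):
        for key, value in rdn:
            if key == "commonName":
                return value
    return ""

def looks_intercepted(cert: dict, expected_issuer_cn: str) -> bool:
    """True if the chain was not signed by the CA we expected."""
    return issuer_common_name(cert) != expected_issuer_cn

# Live check (requires network access), left commented out:
# ctx = ssl.create_default_context()
# with socket.create_connection(("wicar.org", 443)) as sock:
#     with ctx.wrap_socket(sock, server_hostname="wicar.org") as tls:
#         print(looks_intercepted(tls.getpeercert(), "R3"))
```

A corporate inspection proxy would typically present its own internal CA as the issuer here, which is exactly the signal this kind of check surfaces - though a proxy that fails to verify upstream certificates at all is a worse problem that this check cannot see.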