> It has a number of URLs. I suppose it's possible at least one of the URLs is no longer legitimate.
> "These restrictions were put in place because this content violates Google Drive's Phishing policy."
> There is no indication of why the file was flagged.
Isn't your answer in the warning (albeit lacking some specificity)? If you have a bunch of URLs in there and admit not knowing about their legitimacy, it seems reasonable that at least one of them matches a database of known phishing URLs. Isn't it also reasonable to expect (and in many cases want) a company hosting publicly-shared files to notify users when content matches some heuristics for unsafe content, especially given the prevalence of phishing attempts?
That said, there may still be a case to argue that the review process, heuristic, lack of transparency, and other implementation details are flawed. But I don't have a problem with them posting a warning message on content that looks suspicious.
Complete removal, on the other hand (as in the case of OP) is another story, especially given Google's laughable process (or lack thereof) for appealing such cases.
You're right - a URL is either technically valid or it isn't, but in this context we're clearly talking about whether the links point to phishing sites.
As someone who operates a service that allows people to create and share lists of links on a public site, I can tell you from experience that it's not as innocuous as you might think. Scammers routinely use a trusted domain to host a link to their malicious final destination, since the initial, trusted domain is often the one given the most scrutiny (e.g., in an email). It sounds silly to more technical users who understand how the web works, but unfortunately it's effective and extremely popular in phishing campaigns.
To your point, yes - a list of phishing URLs would be useful in plenty of legitimate cases (research, blocklists, reporting), but it's difficult for automated tools to distinguish those from the far more common malicious ones, so they err on the side of caution. As mentioned, the human review / appeal process surely has room for improvement.
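For what it's worth, the first-pass heuristic being described is usually nothing fancier than matching each extracted URL's host against a database of known-bad domains. Here's a minimal sketch of that idea; the `BLOCKLIST` set and domains are hypothetical stand-ins for the large, frequently updated databases real services maintain:

```python
from urllib.parse import urlparse

# Hypothetical set of known-phishing domains. Real systems query large,
# continuously updated databases rather than a hardcoded set.
BLOCKLIST = {"evil.example", "phish.example"}

def flag_suspicious(urls):
    """Return the subset of URLs whose host is a known-bad domain
    or a subdomain of one."""
    flagged = []
    for url in urls:
        host = urlparse(url).hostname or ""
        if any(host == bad or host.endswith("." + bad) for bad in BLOCKLIST):
            flagged.append(url)
    return flagged
```

A single match in a shared document is typically enough to trip the warning, which is why a long list of unvetted URLs is so likely to get flagged even when the document itself is benign.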