Posted by: David Harley | February 28, 2013

The drawbacks of fuzzy filtering

Robert Slade drew my attention to an article, "Apple's Deleting iCloud Emails That Contain The Phrase 'Barely Legal Teens'". We can probably all see how that phrase might suggest pornography, even child-abusive pornography, though as one comment to the article noted, 'barely legal' does seem to suggest legal, not illegal. The issue came to light as a result of the constant silent trashing of a legitimate script attached to email sent by a screenwriter to a director. And indeed it appears that "Apple reserves the right to remove any content at any time that it feels is objectionable, without telling you that they're going to delete it."
(See the iCloud Terms of Service.)

But is that what happened here? Jordan Merrick isn't so sure: "Apple's filtering iCloud emails? Probably not." I'm not in a position to test that service, so I'm reserving judgement. However, it seems to me that some of the protests about Apple's 'censorship', if that's what's really happening, miss the point. The silent trashing of legitimate mail is unpleasant, but it's been with us a long time. An article I wrote for Virus Bulletin back in 2006 looks at the furore caused when Verizon appeared to be rejecting mail by IP block, which "resulted in the loss of all mail from large portions of Europe and Asia." And I make an oblique reference to the now-defunct rfc-ignorant.org, which at one time blacklisted domains such as *.nhs.uk (the UK's National Health Service) and the country-level TLD .de (Germany!) for perceived breaches of RFC compliance.

Much of the time, though, the problem isn’t simple arrogance, or censorship, but automation. The history of spam and malware filtering is littered with filtering criteria that resulted in unanticipated side-effects:

  • Word files detected as malware because they contained the text of the EICAR test file.
  • Messages discarded because they contained the same text in the subject field as some malicious messages.
  • Messages discarded because they contained inoffensive words that themselves contained a potentially offensive substring (Scunthorpe Syndrome; see the sketch after this list).
  • A security product that quarantined all emails containing the letter ‘p’.
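By way of illustration, the Scunthorpe problem is easy to reproduce. Here's a minimal sketch in Python; the blocklist, function name and sample messages are purely illustrative, not any real product's logic, and the point is simply that raw substring matching condemns innocent text along with the genuinely objectionable:

    # Illustrative only: a naive keyword filter of the kind that causes
    # Scunthorpe-style false positives. Phrases are checked as raw substrings.
    BLOCKLIST = ["sex", "barely legal teens"]

    def is_flagged(text: str) -> bool:
        """Flag a message if any blocklisted phrase appears anywhere in it,
        even inside an entirely innocent word."""
        lowered = text.lower()
        return any(phrase in lowered for phrase in BLOCKLIST)

    samples = [
        "Lunch on Friday?",                             # delivered
        "Election results from Essex and Sussex",       # false positive: 'sex' is a substring
        "Script attached: Barely Legal Teens, scene 4",  # legitimate script, filtered anyway
    ]
    for message in samples:
        print(message, "->", "filtered" if is_flagged(message) else "delivered")

The fix isn't obvious, either: tokenising on word boundaries spares Essex, but not a legitimate script whose title really does contain the offending phrase, which is exactly why automated filtering keeps producing these side-effects.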

Not only message filters but many other layers of security depend on automated processing and heuristics that don't catch everything malicious and don't spare everything that isn't. That applies to anti-malware at least as accurately as it does to anti-spam measures, which is hardly surprising, since the two are overlapping technologies in many modern products. Sadly, the 100% perfect solution exists only in marketing hype.

Still, this is one scenario that desktop AV can't be blamed for. According to InfoWorld, Steven G., the unfortunate scriptwriter, "being a Mac user … doesn't use antimalware software (of course)." Sorry about that: for a moment it slipped my mind that there is no Mac malware, never was, never could be… Sigh…

David Harley CITP FBCS CISSP
Mac Virus/Small Blue-Green World/Anti-Malware Testing
ESET Senior Research Fellow


Responses

  1. Reblogged this on The Real Nirv and commented:
    Apple is filtering emails that you send, but this is nothing new: it's in their iCloud Terms of Service. They have given themselves the option of blocking material that you send by stopping it from ever reaching the destination, so your recipient will not even see your email in a junk mailbox; it's simply stopped.

    You can test this by creating an email to and from yourself; try it. The addresses in both the To and From fields have to be on an @me.com, @mac.com or @icloud.com domain. If you include something like "barely legal teens" in the subject field or in the body of the message, it will take a while to go out, then it appears to go out but never arrives.
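    For what it's worth, that round trip can be scripted. The sketch below is only an illustration: it assumes Python's standard smtplib and imaplib, that iCloud's outgoing and incoming hosts are smtp.mail.me.com and imap.mail.me.com, and that you substitute your own address and an app-specific password; none of those details come from Apple.

        import imaplib
        import smtplib
        import time
        from email.message import EmailMessage

        ADDRESS = "you@icloud.com"           # placeholder @icloud.com/@me.com/@mac.com address
        PASSWORD = "app-specific-password"   # iCloud normally requires an app-specific password

        # Send a message to yourself containing the suspect phrase.
        msg = EmailMessage()
        msg["From"] = msg["To"] = ADDRESS
        msg["Subject"] = "Filter test"
        msg.set_content("barely legal teens")
        with smtplib.SMTP("smtp.mail.me.com", 587) as smtp:   # assumed iCloud SMTP host
            smtp.starttls()
            smtp.login(ADDRESS, PASSWORD)
            smtp.send_message(msg)

        # Give it a couple of minutes, then look for the copy in the inbox.
        # Silent discarding would mean it never arrives at all.
        time.sleep(120)
        imap = imaplib.IMAP4_SSL("imap.mail.me.com")          # assumed iCloud IMAP host
        imap.login(ADDRESS, PASSWORD)
        imap.select("INBOX")
        _, hits = imap.search(None, 'SUBJECT "Filter test"')
        print("delivered" if hits[0] else "not delivered (dropped, or still in transit)")
        imap.logout()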

    You can read Apple’s iCloud terms of service here: http://www.apple.com/legal/icloud/en/terms.html

    You will notice that, in the section titled:

    Your Conduct

    You agree that you will NOT use the Service to:

    subsection f in particular appears to explain why such an email cannot go out:

    f. post, send, transmit or otherwise make available any unsolicited or unauthorized email messages, advertising, promotional materials, junk mail, spam, or chain letters, including, without limitation, bulk commercial advertising and informational announcements;

    • Those points are made either in the article or in the links included. My point isn't really about censorship: not being American, I don't think my rights are infringed if Internet traffic is filtered within reason. 🙂 Rather, it's that there's too much traffic to filter without automation, and automated processes make mistakes that a human observer wouldn't.

      • I wanted to highlight a specific aspect of Apple's approach: that their service is offered subject to the conduct rules outlined in their service agreement. I agree that automating a process like email filtering can put at risk content that doesn't actually offend the spirit of Apple's service agreement. That being said, and I most certainly see your point on human observation, it does seem to be about as balanced as it can be. Granted, we are not in a position to evaluate their filtering workflow, but considering how many users they have on iCloud, it would be a daunting task and prohibitively expensive to hand this over to human-based analysis.

        Personally, I was unaware of Apple's approach until I read about it a few days ago, and I am pleasantly surprised that Apple simply blocks such material from being delivered to me. They could have taken the approach of sending it anyway and letting it land in my junk folder as spam, giving me the opportunity to review it, but that would run the risk of their users, who may be Windows or Mac based, finding malware attached.

        Not being American either, I personally prefer to view this as part of a good security policy rather than censorship per se. Unfortunately, from a monetary position, Apple can't employ people for the task of tagging email, nor does the industry at large have the capacity to do so. In hindsight, Apple could allow customers to do one of a few things: allow emails from specific senders to pass through without filtering, and/or allow emails in general to arrive in the junk mailbox if they would otherwise trip the filter.

        Ultimately, though, I would conclude by simply saying that Apple is hosting the service and it's free. As the host of a service, it creates a community, and within the bounds of that community it lays down rules that it deems appropriate within the law and within general internet etiquette. Where the law is concerned, Apple has no choice but to keep the bounds of the service within it. Apple, as a public company, is also bound to position the service to reflect its code of conduct and ethical demeanour.

        I sincerely appreciate the dialogue. It was interesting to revisit this topic and reflect on what it means to the user. I am sure I have left something to be desired. Please leave me any further comments; I'll be happy to digest them… take care 🙂

  2. Sorry for the typos above; too rushed, I suppose.

