TRANSCRIPT
VB 2005, Dublin, Ireland
Best Practices for Evaluating Anti-spam Solutions
Nathan Turajski, Jamz Yaneza
Best Practices to Evaluate Anti-spam Solutions
Benchmarking Validation
• Methodologies
  – Accurate
  – Comprehensive
  – Fair
• Filtering Techniques
  – Pattern matching, Heuristics, IP blocking, Whitelist/Blacklist, Challenge/Response, Community …
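Several of these techniques fit in a few lines of code. A minimal sketch, assuming hypothetical rule sets (the domain names and patterns below are invented for illustration, not from any product), combining whitelist/blacklist lookups with pattern matching:

```python
import re

# Hypothetical rule sets for illustration only; production filters ship
# large, frequently updated signature and sender-reputation databases.
WHITELIST = {"partner.example"}   # sender domains always accepted
BLACKLIST = {"spammer.example"}   # sender domains always rejected
PATTERNS = [re.compile(p, re.IGNORECASE)
            for p in (r"free\s+offer", r"act\s+now")]

def classify(sender_domain: str, body: str) -> str:
    """Whitelist first, then blacklist, then body pattern matching."""
    if sender_domain in WHITELIST:
        return "ham"
    if sender_domain in BLACKLIST:
        return "spam"
    # Pattern matching: any signature hit marks the message as spam.
    if any(p.search(body) for p in PATTERNS):
        return "spam"
    return "ham"
```

The ordering matters: a whitelist hit short-circuits every other check, which is also how such filters provide the exception capability discussed later for graymail.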
Anti-spam Solutions
• Current Solutions
  – Software
  – Appliance
  – Services
  – Legislation
• Methods
  – Catch rate (effectiveness)
  – Error rate (accuracy)
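Both methods map directly onto simple corpus statistics. A minimal sketch, assuming a labeled test corpus where each message is tagged "spam" or "ham" (the function is ours, not a vendor API):

```python
def evaluate(predictions, labels):
    """Return (catch_rate, error_rate) over a labeled test corpus.

    catch_rate (effectiveness): fraction of spam correctly flagged.
    error_rate (accuracy): fraction of legitimate mail wrongly flagged
    as spam, i.e. the false-positive rate.
    """
    spam_total = sum(1 for l in labels if l == "spam")
    ham_total = len(labels) - spam_total
    caught = sum(1 for p, l in zip(predictions, labels)
                 if l == "spam" and p == "spam")
    false_positives = sum(1 for p, l in zip(predictions, labels)
                          if l == "ham" and p == "spam")
    return caught / spam_total, false_positives / ham_total
```

Reporting the two numbers separately matters: a filter can trivially reach a 100% catch rate by flagging everything, which only the error rate exposes.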
Content Defined
• Spam
  – UCE, commercial bulk mail
  – Consumers: well defined
  – Enterprise: borderline
• Non-spam
  – Appropriate, predictable, traceable
• Graymail
  – Inappropriate to environment
  – Requires exception capability
Factors for Evaluating Solutions
• Primary
  – Effectiveness
  – Accuracy
  – Resiliency
• Secondary
  – Administration
  – Integration
Testing Failures
• Confused spam-type classification
• Non-real-world test environment
• Short-term testing cycle
• Fixed regional origins
• Fixed language type
• Non-representative industry
• etc.
Spam Trends
• Estimates vary, but the total share was usually agreed to have passed 40% by the beginning of 2002
• Email was 50% spam by January 2003
• 65% of all email was spam by 2004
• Almost 80% of all email is currently either unwanted advertising or virus-ridden
Evaluation Guidelines
• Valid vs. illegitimate mail
  – sampling over time period
30% Monthly Spam Growth (2005)

[Chart: total spam mails received per month]

  Month | Total spam received
  ------+--------------------
  March |   163,425
  April |   202,867
  May   |   183,269
  June  |   232,194
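Month-over-month growth from sampled totals like these is a quick sanity check to run during an evaluation. A minimal sketch using the chart's figures (the helper function is ours, for illustration):

```python
def monthly_growth(totals):
    """Month-over-month growth rate between consecutive entries.

    Relies on dict insertion order, so `totals` must list months
    chronologically.
    """
    months = list(totals)
    return {cur: (totals[cur] - totals[prev]) / totals[prev]
            for prev, cur in zip(months, months[1:])}

# Monthly totals from the chart above.
totals = {"March": 163_425, "April": 202_867,
          "May": 183_269, "June": 232_194}
growth = monthly_growth(totals)
```

Note that real spam volume is noisy month to month (May actually dips below April here), which is one reason short-term testing cycles were flagged earlier as a testing failure.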
Evaluation Guidelines
English vs. Non-English New Spam Mails Received

[Chart: monthly English/Non-English split of new spam — March 34%/66%, April 49%/51%, May 38%/62%, June 54%/46%]

• Predominant language
Evaluation Guidelines
What Country does Spam like the Most?

[Chart: spam by country of origin]

  China 21.78%, United States 21.42%, Republic of Korea 10.34%,
  Brazil 3.94%, Japan 3.83%, France 2.97%, Spain 2.77%, Taiwan 2.21%,
  Israel 2.05%, Germany 1.84%

Spam Countries

[Chart: Trend Micro spam map snapshot]

  Republic of Korea 31%, China 26%, United States 20%, France 5%,
  Brazil 4%, Switzerland 4%, Japan 3%, Spain 3%, Germany 2%, Poland 2%

  http://www.trendmicro.com/spam-map/default.asp

• Point of origin
  – broad, mixed sampling
Evaluation Guidelines
Spam Categories

[Chart: spam category breakdown across Adult, Bad Samples, Commercial, Financial, Health, Others, and Non-English]

• Industry definitions
  – overlap of needs vs. excess
Chinese Language (traditional)

• Summary:
  – 38% commercial offers, 23% work related, 22% financial, 7% sex related

Traditional Chinese (snapshot)

  Commercial 38%, Work 23%, Financial 22%, Sexual 7%, Health 4%,
  Education 4%, Other 2%, Spiritual 0.3%
Chinese Language (simplified)

• Summary:
  – 69% commercial offers, 17% financial, 7% sex related, 4% education

Simplified Chinese (snapshot)

  Commercial 69%, Financial 17%, Sexual 5%, Education 4%, Health 2%,
  Other 2%, Work 1%, Spiritual 0.04%
German Language

• Summary:
  – 15% sex related, 12% commercial, 71% mixed offers

German (snapshot)

  Other 71%, Sexual 15%, Commercial 12%, Financial 1%, Health 1%
Evaluation Guidelines
• Timeliness
  – update frequency
  – distribution strain on network/system
  – correction efficiency
Evaluation Guidelines
• Summary
  – Efficiency and accuracy depend on spam classification and audience
  – Testing samples must be valid and fixed
  – Overall results used for evaluation
  – False positives: graymail vs. legitimate mail
  – Unmodified message delivery
Other Considerations
• Product configuration and tuning
  – Out-of-the-box state
  – Vendor-recommended tuning
  – Tolerance rating based on target audience
  – Long-term testing timeframe
Other Considerations
• Filter technique testing
  – Signature matching
    • Focus: catch efficiency and update timeliness
  – Heuristic rules
    • Focus: false-positive rate and mitigation tools
  – Hybrid techniques
    • Focus: accuracy and update timeliness
  – IP filtering
    • Focus: delivery efficiency and mitigation tools
Other Considerations
• Performance
  – Deployment time
  – Management reporting tools
  – Update overhead
  – Message latency
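Message latency in particular is straightforward to measure in an evaluation harness. A minimal sketch, assuming `filter_fn` stands in for whatever classifier is under test (the helper is ours, not a product API):

```python
import time

def measure_latency(filter_fn, messages):
    """Run each message through the filter and return
    (average, worst-case) per-message latency in milliseconds."""
    samples = []
    for msg in messages:
        start = time.perf_counter()
        filter_fn(msg)  # verdict is discarded; only timing matters here
        samples.append((time.perf_counter() - start) * 1000.0)
    return sum(samples) / len(samples), max(samples)
```

Reporting worst-case as well as average latency matters for mail gateways, since a single slow message can back up the delivery queue.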
SUMMARY
• Comprehensive evaluation includes
  – scalability and resiliency
  – long-term performance
  – customer-specific goals
  – exception handling
  – minimal administration
Questions?
Mass-mailing malware spam
• Summary:
  – 2003: due to Mimail, Blaster, and Sobig
  – 2004: due to Bagle, Mydoom, Netsky, and Sasser
Malware Tracking Center

[Chart: number of reported unique bulk-mailing malware infections per month, January–December, for 2002, 2003, and 2004; y-axis 0 to 2,000,000]