Offensive (Web, etc) Testing Framework: My Gift for the Community - BerlinSides 2011
DESCRIPTION
Introduction to the Offensive (Web, etc) Testing Framework.
Demos: http://www.youtube.com/playlist?list=PL1E7A97C1BCCDEEBB&feature=plcp
Download as PDF if fonts look funny.

TRANSCRIPT
Offensive (Web, etc) Testing Framework
My gift for the community
Abraham Aranguren (@7a_)
[email protected]
http://7-a.org
Berlin Sides, December 29th 2011
Agenda
• About me
• Lessons from:
  – OSCP
  – Experience
  – Chess Players
• OWTF vs Traditional + Demos
• Conclusion
• Q&A
About me
• Spanish dude
• Degree + Diploma in Computer Science
• Uni: Security research + honour mark
• IT: Since 2000 (netadmin / developer)
• Comeback to (offensive) security in 2007
• OSCP, CISSP, GWEB, CEH, MCSE, Etc.
• Web App Sec and Dev/Architect
• OWTF, GIAC, BeEF
What is OSCP?
• Certification run by Offensive Security*
*Offensive Security maintains the Backtrack distro
100% practical exam:
• 24 hour hacking challenge
• Few pass the 1st time
• Experienced pen testers have failed this
http://www.offensive-security.com/information-security-certifications/
Lessons from OSCP
Background: Nessus, etc. were forbidden; scripts were OK.
Approach to get a 100% score:
• Understand + script everything
• Make scripts reliable (!babysitting)
• Make scripts staged (results in < 10 mins)
• Scripts find vulns in background
• Scripts present information efficiently
The test taker is now:
• Fresh to analyse info + exploit vulns
• Using more time to think
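The "staged scripts" approach above can be sketched in a few lines of Python: launch every recon command in the background at once, then collect whatever has finished within a deadline, so the tester gets a first batch of results while slower tools keep running. The commands are illustrative placeholders, not the actual OSCP scripts.

```python
import subprocess

def run_staged(commands, timeout=600):
    """Start all commands at once; return {cmd: output} for those done in time."""
    # Launch everything in the background first (stage 1: fire and forget)
    procs = {cmd: subprocess.Popen(cmd, shell=True,
                                   stdout=subprocess.PIPE,
                                   stderr=subprocess.STDOUT)
             for cmd in commands}
    results = {}
    for cmd, proc in procs.items():
        try:
            out, _ = proc.communicate(timeout=timeout)
            results[cmd] = out.decode(errors="replace")
        except subprocess.TimeoutExpired:
            results[cmd] = None  # still running: collect it in a later stage
    return results
```

Called as, say, `run_staged(["whois example.com", "nikto -h example.com"])`, the fast tools report back within minutes and nothing needs babysitting.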
Lessons from OSCP cont.
Others spent valuable energy running (a lot of) tools by hand … I had this in < 10 minutes via scripts!
Lessons from OSCP cont.
Newer results merged via script with exploitation notes, etc.
Lessons from Experience
Pen testers vs bad guys:
• Pen testers have time/scope constraints → bad guys don't
• Pen testers have to write a report → bad guys don't
Complexity is increasing
More complexity = more time needed to test properly
Customers are rarely willing to:
“Pay for enough / reasonable testing time“
A call for efficiency:
• We must find vulns faster
• We must be more efficient
• … or bad guys will find the vulns, not us
Lessons from Experience cont.
Ways to beat time constraints:
• Test ahead of time (i.e. silent testing)
• Automate as much as possible (i.e. scripting)
• Efficient testing (i.e. scripting/analysis)
• Efficient reporting (i.e. templates/scripting)
Learning from Chess Players
Image Credit: http://www.robotikka.com / Terra
Chess Complexity
Image Credit: http://chessok.com
Efficient Chess Analysis
Chess players have time constraints, like pen testers.
From Alexander Kotov - "Think like a Grandmaster":
1) Draw up a list of candidate moves
2) Analyse each variation once and only once
3) Having gone through steps 1 and 2, make a move
Pen tester translation:
1) Draw up a list of candidate paths of attack
2) Analyse tool output once and only once
3) After 1) and 2), exploit the best path of attack
Ever analysed X in depth, only to see “super-Y” later?
Chess Openings
Image Credit: http://chessok.com
Chess Player approach
Chess players:
• Memorise openings
• Memorise endings
• Memorise entire lines of attack/defence
• Try hard to analyse games efficiently
Pen tester translation:
• Chess players precompute all they can
• Chess players analyse info only once
Garry Kasparov vs Nigel Short
World Championship Match 1993
“Kasparov was evidently disoriented as he used 1 hour 29 minutes to Short's 11 minutes (!) for the entire game.“
→ Short (the weaker player) was 8 times faster
“In just 9 days after facing it for the first time … Kasparov and his team had found the best reply (11.Ne2) and even succeeded in completely bamboozling Short with 12.Be5: 'This move was a surprise for me. I spent 45 minutes on my reply. I could not fathom out the complications …' - Short“
http://www.chessgames.com/perl/chessgame?gid=1070677
http://www.chessgames.com/perl/chessgame?gid=1070681
Can we be more efficient?
Can tools, knowledge and human analysis be coordinated like an army?
Image Credit: http://pakistancriminalrecords.com
OWTF Process Demos (1+2)
Image Credit: http://www.amamavas.com
OWTF vs Traditional: Disclaimer
Existing tools:
• Are great at what they do
• Solve difficult problems
• Their authors are typically very smart people!
• Made OWTF possible
Not all limitations covered next apply to all tools
Define once + Automate

Traditional:
• Too many tools to run manually
• Figure out how to call the tool each time
• Figure out how to overcome poor defaults (i.e. UA); poor defaults sometimes hard-coded in the code!

OWTF:
• All tools are run for you automatically
• Define how to call each tool only once
• Useful defaults + easy to run
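A minimal sketch of the "define once" idea, assuming a simple template table: each tool's command line (with sane defaults such as a browser-like User-Agent) is written down once, and the framework expands it for every target. Tool names and flags here are illustrative, not OWTF's real configuration format.

```python
# One place where every tool invocation is defined (hypothetical entries)
TOOL_TEMPLATES = {
    "whois": "whois {target}",
    "nikto": "nikto -h {target} -useragent '{user_agent}'",
}
# Useful defaults, instead of each tool's (sometimes poor) built-in ones
DEFAULTS = {"user_agent": "Mozilla/5.0 (compatible; pentest)"}

def build_commands(target, templates=TOOL_TEMPLATES, defaults=DEFAULTS):
    """Expand every template for this target; no per-run retyping needed."""
    return [tpl.format(target=target, **defaults) for tpl in templates.values()]
```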
Demo 3: Define + Automate
Comprehensive

Traditional:
• Remember tests to run
• Remember tools/websites to perform each test
• Remember best order to run tools / use sites

OWTF:
• Tests are run automatically
• Use of best known tools + websites
• Calls tools/sites in the best known order
• Implements tests not found in other tools
Demo 4: Comprehensive
Staged Report + Vuln Stats

Traditional:
• No report until end of scan → waste of time
• Report vulnerabilities 1 by 1 → waste of time
• Cannot analyse + exploit concurrently

OWTF:
• You have a partial report in < 5 seconds
• Refresh report = new results are highlighted
• Reports vuln stats, which you can drill into
• Fresh to analyse + exploit concurrently
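The staged report boils down to appending each finding to the report file the moment it arrives, so a valid partial report exists seconds into the scan instead of only at the end. This is a sketch under assumed file and entry formats, not OWTF's actual report writer.

```python
import html
import pathlib

def append_finding(report_path, plugin, finding):
    """Add one finding to the running HTML report and return its current text."""
    entry = "<li><b>{}</b>: {}</li>\n".format(html.escape(plugin),
                                              html.escape(finding))
    with open(report_path, "a", encoding="utf-8") as f:
        f.write(entry)  # report grows incrementally; refresh to see new items
    return pathlib.Path(report_path).read_text(encoding="utf-8")
```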
Demo 5: Staged Report
Dynamic Report: flags, notes, etc.

Traditional:
• Report is static + poor interaction
• Cannot flag / rate / ignore findings
• Cannot take notes / filter findings with your criteria

OWTF:
• Report is dynamic + interactive
• Can flag / rate / ignore findings
• Can take notes / filter findings with your criteria
• Pen tester can import / export reviews
Demo 6: Import / Export Review
Reliable + Partial results if crashed

Traditional:
• Requires babysitting (i.e. did it crash/stop?)
• Lose all results + no report if crashed
• Poor exception handling = crashes happen

OWTF:
• Limited babysitting required (i.e. often none)
• Tries hard not to crash + saves results if crashed
• Tool or plugin crashed? → save data + continue
• Robust exception handling (I think ☺)
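The "crashed? save data + continue" behaviour is essentially a try/except around every plugin: one plugin blowing up never takes down the run, and the error itself is kept as data. The plugin interface below is a stand-in, not OWTF's real plugin API.

```python
def run_plugins(plugins):
    """Run every (name, callable) plugin; a crash in one never stops the rest."""
    results, errors = {}, {}
    for name, func in plugins:
        try:
            results[name] = func()
        except Exception as exc:
            errors[name] = repr(exc)  # robust, not silent: the error is saved
    return results, errors
```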
Demo 7: Exception Handling
Cancel + Move on support

Traditional:
• Stuck / crashed command → no report
• Stuck / crashed plugin → no report
• Stuck / crashed tool → no report

OWTF:
• Stuck? → Control+C saves data + moves on
• Crashed? → Moves on (“finished”) + saves data
• You can Control+C commands, plugins and owtf itself
• On Control+C: choose next cmd / plugin / exit
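Control+C support can be sketched the same way: a KeyboardInterrupt during one command saves what was gathered and moves on to the next command instead of killing the whole session. The runner below is a stand-in for the real command executor.

```python
def run_with_cancel(commands, runner):
    """Run commands in order; Control+C skips the current one, data is kept."""
    results = {}
    for cmd in commands:
        try:
            results[cmd] = runner(cmd)
        except KeyboardInterrupt:
            results[cmd] = "(cancelled by user; partial data saved)"
    return results
```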
Demo 8: Cancel + Move on Support
Aligned to Standards

Traditional:
• Not OWASP Testing Guide aligned
• Not PTES aligned
• Narrow standard coverage

OWTF:
• OWASP Testing Guide aligned
• PTES alignment/coverage planned
• Extensive standard coverage
Demo 4: OWASP Testing Guide Aligned
Simulation + Silent testing support

Traditional:
• No “Simulation mode” → run and see (!)
• Cannot start test without permission (usually)
• No passive, semi-passive, active test separation

OWTF:
• Supports “Simulation mode” → 1st see, then run
• Can test without permission: silent testing support
• Passive, semi-passive, active test separation
• Test ahead of time = more efficiency
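"Simulation mode" is, at its core, a dry-run switch: with it on, the framework only reports what it would run and nothing touches the target, so the full plan can be reviewed first. A sketch with an assumed interface:

```python
def execute(commands, simulate=True):
    """Return the actions taken; in simulation mode no command is launched."""
    log = []
    for cmd in commands:
        if simulate:
            log.append("SIMULATION (not run): " + cmd)
        else:
            log.append("RUN: " + cmd)  # a real version would shell out here
    return log
```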
Demo 9: Simulation + Silent Testing Support
Language agnostic, easy to extend

Traditional:
• Language dependent (ruby, python, perl, etc.)
• Cannot contribute in your language (usually)
• Difficult to extend / share info

OWTF:
• Language agnostic: if the shell can run it = WIN
• Contribute in your language (best if CLI-callable)
• Easy to extend / share info
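"If the shell can run it = WIN" means a plugin can be any CLI-callable program: the framework just shells out and captures stdout, so contributions in ruby, perl, bash, etc. all look the same from the outside. A sketch, not OWTF's actual plugin loader:

```python
import subprocess

def run_cli_plugin(command):
    """Run an arbitrary shell command and return its output as the finding."""
    proc = subprocess.run(command, shell=True, capture_output=True, text=True)
    return proc.stdout.strip()
```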
Easy setup and greppable DB

Traditional:
• Hard to set up: libraries, gems, DB installs, etc.
• DB in obscure format
• Cannot custom-search DB

OWTF:
• Easy to set up: copy dir + run
• DB in plain text, links provided to everything
• DB is easy to grep for custom searches
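A greppable DB can be as simple as plain-text records, one per line, in ordinary files: custom searches are then just substring matches (or literally `grep`). The record layout here is illustrative.

```python
import pathlib

def save_record(db_path, record):
    """Append one plain-text record; the whole DB stays grep-friendly."""
    with open(db_path, "a", encoding="utf-8") as f:
        f.write(record + "\n")

def grep_db(db_path, pattern):
    """Custom search = substring match over plain-text lines."""
    text = pathlib.Path(db_path).read_text(encoding="utf-8")
    return [line for line in text.splitlines() if pattern in line]
```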
Demo 10: Greppable DB
Chess-like analysis support

Traditional:
• Cannot pre-compute / define tests (self/other)
• Cannot mark “best candidate moves”
• Cannot analyse each option only once + no notes

OWTF:
• Tests are pre-computed / defined (self + other)
• Mark “best candidate moves” via flags
• Mark as analysed via strike-through
• Filter your analysis with your priorities + notes
Demo 11: Chess-like Analysis Support
What about Tactical Fuzzing? (i.e. Burp, ZAP, etc.)

Traditional:
• Some tools do not support outbound proxies (!)
• Can only pass their own info to the tactical fuzzer
• Messy proxying when multiple tools are used

OWTF:
• Can scrape results from all tools run
• Can pass scraped results to the tactical fuzzer
• Proxy OK when multiple tools used under the hood
• Proxy OK even if the tool called has no proxy support
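One common way to proxy tools that lack a proxy flag of their own is to launch them with `http_proxy`/`https_proxy` set in their environment, which many HTTP tools honour. OWTF's actual mechanism may differ, so treat this as an assumption-laden sketch (the proxy address is a placeholder).

```python
import os
import subprocess

def run_through_proxy(command, proxy="http://127.0.0.1:8080"):
    """Run a tool with proxy env vars set, so its traffic can reach the fuzzer."""
    env = dict(os.environ, http_proxy=proxy, https_proxy=proxy)
    proc = subprocess.run(command, shell=True, env=env,
                          capture_output=True, text=True)
    return proc.stdout
```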
Demo 12: Outbound Proxy
Google Hacking without API keys

Traditional:
• Some GH tools require API keys to work
• Others require you to break CAPTCHAs (!)

OWTF:
• No API keys required
• No CAPTCHA breaking required
• Use of tuneable blanket searches instead
• “Open all in tabs” for ease of use ☺
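The "blanket searches" trick needs no API and no scraping: build ordinary Google search URLs for a set of dorks and hand them to the tester to open in browser tabs. The dork below is a generic illustration, not OWTF's search list.

```python
from urllib.parse import quote_plus

def blanket_search_urls(domain, dorks):
    """One clickable search URL per dork: no API key, no CAPTCHA breaking."""
    return ["https://www.google.com/search?q=" + quote_plus(d.format(domain=domain))
            for d in dorks]
```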
Demo 13: Google Hacking without API Keys
OWTF > Running tools

Traditional:
• Focused on small problems
• Missing a lot from the OWASP Testing Guide
• Must find X number of tools to bridge the gap

OWTF:
• Calls the “best tool for the job” when possible
• Implements many tests on its own too!
• Links for test sites / “Suggestions”
• Custom template support planned for reporting
Demo 14: OWTF tests without external tools
Demo 15: Aux Plugin Intro (Phishing)
Demo 16: DoS
OWTF Considerations/Limitations
• Relies on existing great tools != replacement
• Developed on python 2.6.5
• CLI Linux-only (dev on Backtrack 5 R1)
• GUI multiplatform (web page)
• Lots of bugs (but stable! ☺)
• Lots of features in my todo list! ☺
• Not a “script kiddie tool” + not a silver bullet
• Does not try to rate severity / replace humans:
• Focus is to provide data efficiently for the pen tester
OWTF Target User base
Who is this for?
OWTF: Not for Nessus Monkeys
Image Credit: Steve Lord, BSides London 2011
OWTF: Import/Export Reviews
Jaded Cynic compatible
Image Credit: Steve Lord, BSides London 2011
OWTF Goal: Bring you closer to this
Image Credit: Steve Lord, BSides London 2011
OWTF – I need your help
Licence?
• 3-clause BSD (metasploit)
• GPL v3 / v2, Apache
• Other?
Hosting service?
• github (metasploit, BeEF, whatweb, …)
• googlecode
• sourceforge
• Other?
OWTF - I need your help
Tool authors: Can owtf run your tool better?
Pen testers / Python wizards:
• What is missing? (tools, resources, approach, …)
• What could be done better?
Web designers:
• Make the report look better / easier to use

JavaScript gurus:
• More ideas to improve the interactive report

Regexp and Selenium gurus:
• Suggest better regexps and/or approaches
Conclusion
OWTF aims to make pen testing:
• Aligned with OWASP Testing Guide + PTES
• More efficient
• More comprehensive
• More creative and fun (minimise un-creative work)

This way pen testers will have time to:
• Focus on sharing information (tools, techniques, …)
• Think out of the box for real (!babysit, !stupid work)
• Chain vulnerabilities like attackers do
• Really show impact so that risk is understood
Special thanks to
For getting me started:
Justin Searle: “Python Basics for Web App Pentesters” – OWASP AppSec EU 2011
For showing what I was missing in my process:
Jason Haddix: “The Web Application Hacking Toolchain” – BruCon 2011
For “do what you love” inspiration:
Haroon Meer: “You and Your Research” – BruCon 2011
Special thanks to
• OWASP Testing Guide + PTES contributors
• Andrés Riancho
• Marcus Niemietz
• Mario Heiderich
• Michele Orru
• Sandro Gauci
Q&A
Abraham Aranguren
http://7-a.org
Project info:
Website: http://owtf.org/
Twitter: @owtfp