[Domain Analyzer Security Tool] Domain analyzer security tool
Author: Admin | Date: 2014-07-28 (Mon) 13:39 | Views: 5143

Reference URLs
http://domainanalyzer.sourceforge.net/
http://rajhackingarticles.blogspot.com.ar/2012/09/domain-analyzer-security-tool.html
http://www.dnspython.org/
http://dl.fedoraproject.org/pub/epel/6/x86_64/repoview/GeoIP.html


OS: CentOS 6.4 (el6.x86_64)
# tar zxvf domain_analyzer_v0.8.tar.gz
# cd domain-analyzer

Crawler
# ./crawler.py -u www.aaa.com

Usage: ./crawler.py <options>
Options:
  -u, --url                            URL to start crawling.
  -m, --max-amount-to-crawl            Max deep to crawl. Using breadth first algorithm
  -w, --write-to-file                  Save summary of crawling to a text file. Output directory is created automatically.
  -s, --subdomains                     Also scan subdomains matching with url domain.
  -r, --follow-redirect                Do not follow redirect. By default follow redirection at main URL.
  -f, --fetch-files                    Download there every file detected in 'Files' directory. Overwrite existing content.
  -F, --file-extension                 Download files specified by comma separated extensions. This option also activates 'fetch-files' option. 'Ex.: -F pdf,xls,doc'
  -d, --docs-files                     Download docs files:xls,pdf,doc,docx,txt,odt,gnumeric,csv, etc. This option also activates 'fetch-files' option.
  -E, --exclude-extensions             Do not download files that matches with this extensions. Options '-f','-F' or '-d' needed.
  -h, --help                           Show this help message and exit.
  -V, --version                        Output version information and exit.
  -v, --verbose                        Be verbose
  -D, --debug                          Debug.
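
Based on the options above, a combined crawl might look like the following. This invocation is my own illustration, not from the original post; www.example.com, the crawl limit, and the extension lists are placeholders:

# ./crawler.py -u www.example.com -m 20 -w -s -F pdf,doc,xls -E exe

This starts at www.example.com, writes a summary to a file, includes matching subdomains, downloads only pdf/doc/xls files, and skips exe files.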


Domain Analyzer
Run on CentOS, the script first fails with the following error message (the suggested apt-get command only applies to Debian/Ubuntu):

You need to install python-dnspython. apt-get install python-dnspython

On CentOS, install the dependencies manually instead:

# wget http://dl.fedoraproject.org/pub/epel/6/x86_64/GeoIP-1.5.1-5.el6.x86_64.rpm
# rpm -Uhv GeoIP-1.5.1-5.el6.x86_64.rpm
# wget http://www.dnspython.org/kits/1.11.1/dnspython-1.11.1.tar.gz
# tar zxvf dnspython-1.11.1.tar.gz
# cd dnspython-1.11.1
# python setup.py install
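
Before running the tool again, a quick sanity check (my own suggestion, not from the tool's docs) confirms both dependencies are usable; geoiplookup ships with the GeoIP rpm, and dns.version reports the installed dnspython release:

# geoiplookup 8.8.8.8
# python -c "import dns.version; print dns.version.version"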
# cd ..
# ./domain_analyzer.py -d naver.com


Usage: ./domain_analyzer.py -d <domain> <options>
Options:
  -h, --help                            Show this help message and exit.
  -V, --version                         Output version information and exit.
  -D, --debug                           Debug.
  -d, --domain                          Domain to analyze.
  -L <list>, --common-hosts-list <list> Relative path to txt file containing common hostnames. One name per line.
  -j, --not-common-hosts-names          Do not check common host names. Quicker but you will lose hosts.
  -t, --not-zone-transfer               Do not attempt to transfer the zone.
  -n, --not-net-block                   Do not attempt to -sL each IP netblock.
  -o, --store-output                    Store everything in a directory named as the domain. Nmap output files and the summary are stored inside.
  -a, --not-scan-or-active              Do not use nmap to scan ports nor to search for active hosts.
  -p, --not-store-nmap                  Do not store any nmap output files in the directory <output-directory>/nmap.
  -e, --zenmap                          Move xml nmap files to a directory and open zenmap with the topology of the whole group. Your user should have access to the DISPLAY variable.
  -g, --not-goog-mail                   Do not use goog-mail.py (embebed) to look for emails for each domain
  -s, --not-subdomains                  Do not analyze sub-domains recursively. You will lose subdomain internal information.
  -f, --create-pdf                      Create a pdf file with all the information.
  -l, --world-domination                Scan every gov,mil,org and net domains of every country on the world. Interesting if you don't use -s
  -r, --robin-hood                      Send the pdf report to every email found using domains the MX servers found. Good girl.
  -w, --not-webcrawl                    Do not web crawl every web site (in every port) we found looking for public web mis-configurations (Directory listing, etc.).
  -m, --max-amount-to-crawl             If you crawl, do it up to this amount of links for each web site. Defaults to 50.
  -F, --download-files                  If you crawl, download every file to disk.
  -c, --not-countrys                    Do not resolve the country name for every IP and hostname.
  -C, --not-colors                      Do not use colored output.
  -q, --not-spf                         Do not check SPF records.
  -k, --random-domains                  Find this amount of domains from google and analyze them. For base domain use -d
  -v, --ignore-host-pattern             When using nmap to find active hosts and to port scan, ignore hosts which names match this pattern. Separete them with commas.
  -x, --nmap-scantype                   Nmap parameters to port scan. Defaults to: '-O --reason --webxml --traceroute -sS -sV -sC -PN -n -v -F' .
  -b, --robtex-domains                  If we found a DNS server with zone transfer activated, search other UNrelated domains using that DNS server with robtex and analyze them too.
  -B, --all-robtex                      Like -b, but also if no Zone Transfer was found. Useful to analyze all the domains in one corporative DNS server. Includes also -b.
Press CTRL-C at any time to stop only the current step.
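
Putting a few of these options together, a run that stores everything to an output directory, skips the web crawl, skips common-hostname guessing, and produces a PDF report could look like this. The combination and the example.com domain are my own illustration, not from the original post:

# ./domain_analyzer.py -d example.com -o -w -j -f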

