Gentoo Forums
Help me install Arachni

 
Gentoo Forums Forum Index » Unsupported Software
illuminated
n00b


Joined: 28 Dec 2010
Posts: 48

PostPosted: Thu Nov 24, 2011 4:57 pm    Post subject: Help me install Arachni Reply with quote

Hello there. I want to install Arachni (https://github.com/Zapotek/arachni) but I'm not really sure how to do it. It needs Ruby 1.9.2, but since that's hard-masked I'm not sure whether I should compile it myself or not. Another option is RVM, but I don't really know how to use it. I've tried the Arachni CDE package but it doesn't work.
Should I just try to compile Ruby 1.9, or is there another way I don't know about?

Thank you
krinn
Advocate


Joined: 02 May 2003
Posts: 4151

PostPosted: Thu Nov 24, 2011 8:05 pm    Post subject: Reply with quote

First thing I would do if I were you is check WHY Ruby 1.9 is masked; 99% chance you'll then have your answer on whether or not you should try using it.
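For the record, checking a mask reason looks roughly like this (a sketch; /usr/portage is the classic tree location and may differ on your system):

```shell
# Ask Portage itself: masked packages print the mask reason in pretend mode
emerge --pretend --verbose dev-lang/ruby

# Or search the profile mask file for the entry and the comment above it
grep -B 5 'dev-lang/ruby' /usr/portage/profiles/package.mask
```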
windex
n00b


Joined: 09 Dec 2012
Posts: 70

PostPosted: Fri Jan 18, 2013 5:10 pm    Post subject: Reply with quote

I'll see what I can do to help. First I'm emerging sqlite3 and sqlite-ruby to satisfy the first requirements on my test system. Will keep you posted.

Edit: adding the ssl USE flag as well as emerging:
dev-libs/libxml2 dev-libs/libxslt dev-libs/libyaml net-misc/curl

Still no word on zlib1g-dev
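Roughly the commands involved (a sketch; whether you set USE in make.conf or package.use depends on your layout, and the dev-lang/ruby atom for the ssl flag is an assumption):

```shell
# Enable the ssl USE flag via package.use (one of several ways to set USE)
echo 'dev-lang/ruby ssl' >> /etc/portage/package.use

# Pull in the libraries Arachni's gems build against
emerge --ask dev-libs/libxml2 dev-libs/libxslt dev-libs/libyaml net-misc/curl
```

(On Gentoo the zlib headers come from sys-libs/zlib; zlib1g-dev is the Debian/Ubuntu package name.)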
windex
n00b


Joined: 09 Dec 2012
Posts: 70

PostPosted: Sun Jan 27, 2013 10:21 pm    Post subject: Reply with quote

windex wrote:
Still no word on zlib1g-dev

Still searching. Feeling courageous and went ahead with gem install arachni. Once that is finished I'll see if there are any error messages, etc. that can give me insight into what else I need to install. I've built arachni up from source before so I should be able to get this rolling. Cheers.
windex
n00b


Joined: 09 Dec 2012
Posts: 70

PostPosted: Sun Jan 27, 2013 11:12 pm    Post subject: Reply with quote

windex wrote:

Still searching. Feeling courageous and went ahead with gem install arachni. Once that is finished I'll see if there are any error messages, etc. that can give me insight into what else I need to install. I've built arachni up from source before so I should be able to get this rolling. Cheers.

Gem complained about my Ruby version, so I used eselect to choose the most recent one. Ran gem install again... and it worked.
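The eselect step looks something like this (the ruby19 target name is an example; check the list output for the targets on your box):

```shell
# Show the installed Ruby interpreters and which one is active
eselect ruby list

# Switch the system default to the 1.9 interpreter
eselect ruby set ruby19

# Retry the gem installation against the new default
gem install arachni
```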

Code:

#arachni --help
Arachni - Web Application Security Scanner Framework v0.4.1.3
   Author: Tasos "Zapotek" Laskos <tasos.laskos@gmail.com>

           (With the support of the community and the Arachni Team.)

   Website:       http://arachni-scanner.com
   Documentation: http://arachni-scanner.com/wiki


  Usage:  arachni [options] url

  Supported options:


    General ----------------------

    -h
    --help                      Output this.

    -v                          Be verbose.

    --debug                     Show what is happening internally.
                                  (You should give it a shot sometime ;) )

    --only-positives            Echo positive results *only*.

    --http-req-limit=<integer>  Concurrent HTTP requests limit.
                                  (Default: 20)
                                  (Be careful not to kill your server.)
                                  (*NOTE*: If your scan seems unresponsive try lowering the limit.)

    --http-timeout=<integer>    HTTP request timeout in milliseconds.

    --cookie-jar=<filepath>     Netscape HTTP cookie file, use curl to create it.

    --cookie-string='<name>=<value>; <name2>=<value2>'

                                Cookies, as a string, to be sent to the web application.

    --user-agent=<string>       Specify user agent.

    --custom-header='<name>=<value>'

                                Specify custom headers to be included in the HTTP requests.
                                (Can be used multiple times.)

    --authed-by=<string>        Who authorized the scan, include name and e-mail address.
                                  (It'll make it easier on the sys-admins during log reviews.)
                                  (Will be appended to the user-agent string.)

    --login-check-url=<url>     A URL used to verify that the scanner is still logged in to the web application.
                                  (Requires 'login-check-pattern'.)

    --login-check-pattern=<regexp>

                                A pattern used against the body of the 'login-check-url' to verify that the scanner is still logged in to the web application.
                                  (Requires 'login-check-url'.)

    Profiles -----------------------

    --save-profile=<filepath>   Save the current run profile/options to <filepath>.

    --load-profile=<filepath>   Load a run profile from <filepath>.
                                  (Can be used multiple times.)
                                  (You can complement it with more options, except for:
                                      * --modules
                                      * --redundant)

    --show-profile              Will output the running profile as CLI arguments.


    Crawler -----------------------

    -e <regexp>
    --exclude=<regexp>          Exclude urls matching <regexp>.
                                  (Can be used multiple times.)

    -i <regexp>
    --include=<regexp>          Include *only* urls matching <regex>.
                                  (Can be used multiple times.)

    --redundant=<regexp>:<limit>

                                Limit crawl on redundant pages like galleries or catalogs.
                                  (URLs matching <regexp> will be crawled <limit> amount of times.)
                                  (Can be used multiple times.)

    --auto-redundant=<limit>    Only follow <limit> amount of URLs with identical query parameter names.
                                  (Default: inf)
                                  (Will default to 10 if no value has been specified.)

    -f
    --follow-subdomains         Follow links to subdomains.
                                  (Default: off)

    --depth=<integer>           Directory depth limit.
                                  (Default: inf)
                                  (How deep Arachni should go into the site structure.)

    --link-count=<integer>      How many links to follow.
                                  (Default: inf)

    --redirect-limit=<integer>  How many redirects to follow.
                                  (Default: 20)

    --extend-paths=<filepath>   Add the paths in <file> to the ones discovered by the crawler.
                                  (Can be used multiple times.)

    --restrict-paths=<filepath> Use the paths in <file> instead of crawling.
                                  (Can be used multiple times.)


    Auditor ------------------------

    -g
    --audit-links               Audit links.

    -p
    --audit-forms               Audit forms.

    -c
    --audit-cookies             Audit cookies.

    --exclude-cookie=<name>     Cookie to exclude from the audit by name.
                                  (Can be used multiple times.)

    --exclude-vector=<name>     Input vector (parameter) not to audit by name.
                                  (Can be used multiple times.)

    --audit-headers             Audit HTTP headers.
                                  (*NOTE*: Header audits use brute force.
                                   Almost all valid HTTP request headers will be audited
                                   even if there's no indication that the web app uses them.)
                                  (*WARNING*: Enabling this option will result in increased requests,
                                   maybe by an order of magnitude.)

    Coverage -----------------------

    --audit-cookies-extensively Submit all links and forms of the page along with the cookie permutations.
                                  (*WARNING*: This will severely increase the scan-time.)

    --fuzz-methods              Audit links, forms and cookies using both GET and POST requests.
                                  (*WARNING*: This will severely increase the scan-time.)

    --exclude-binaries          Exclude non text-based pages from the audit.
                                  (Binary content can confuse recon modules that perform pattern matching.)

    Modules ------------------------

    --lsmod=<regexp>            List available modules based on the provided regular expression.
                                  (If no regexp is provided all modules will be listed.)
                                  (Can be used multiple times.)


    -m <modname,modname..>
    --modules=<modname,modname..>

                                Comma separated list of modules to load.
                                  (Modules are referenced by their filename without the '.rb' extension, use '--lsmod' to list all.
                                   Use '*' as a module name to deploy all modules or as a wildcard, like so:
                                      xss*   to load all xss modules
                                      sqli*  to load all sql injection modules
                                      etc.

                                   You can exclude modules by prefixing their name with a minus sign:
                                      --modules=*,-backup_files,-xss
                                   The above will load all modules except for the 'backup_files' and 'xss' modules.

                                   Or mix and match:
                                      -xss*   to unload all xss modules.)


    Reports ------------------------

    --lsrep=<regexp>            List available reports based on the provided regular expression.
                                  (If no regexp is provided all reports will be listed.)
                                  (Can be used multiple times.)

    --repload=<filepath>        Load audit results from an '.afr' report file.
                                    (Allows you to create new reports from finished scans.)

    --report='<report>:<optname>=<val>,<optname2>=<val2>,...'

                                <report>: the name of the report as displayed by '--lsrep'
                                  (Reports are referenced by their filename without the '.rb' extension, use '--lsrep' to list all.)
                                  (Default: stdout)
                                  (Can be used multiple times.)


    Plugins ------------------------

    --lsplug=<regexp>           List available plugins based on the provided regular expression.
                                  (If no regexp is provided all plugins will be listed.)
                                  (Can be used multiple times.)

    --plugin='<plugin>:<optname>=<val>,<optname2>=<val2>,...'

                                <plugin>: the name of the plugin as displayed by '--lsplug'
                                  (Plugins are referenced by their filename without the '.rb' extension, use '--lsplug' to list all.)
                                  (Can be used multiple times.)


    Proxy --------------------------

    --proxy=<server:port>       Proxy address to use.

    --proxy-auth=<user:passwd>  Proxy authentication credentials.

    --proxy-type=<type>         Proxy type; can be http, http_1_0, socks4, socks5, socks4a
                                  (Default: http)



If that doesn't work, try patching up your build by installing from source ( http://arachni-scanner.com/latest#source ). And if that doesn't work, message me on this forum or post a reply. Cheers and good luck! This is an amazing tool and I've used it for some serious auditing.