
April, 25th 2013


Dear Readers,
Web application security is a branch of Information Security that deals specifically with the security of websites, web applications and web services. We would like to show you the technical side of this area of expertise. We are sure that after reading this issue you will improve your skills and learn about many security methods.
The issue consists of 7 articles:

How to Estimate Development Security Maturity
Paros Proxy
uPrint.. iHack..
Web App Security - Basic Understanding
XSS - A powerful way of exploiting Web based Applications
Web App Pentesting Methodology
Next-generation SOC environments monitoring

You will have a chance to read about 5 of the most relevant terms concerning application security:
Asset. A resource of value, such as the data in a database or on the file system, or a system resource.
Threat. A potential occurrence with a negative effect on an asset.
Vulnerability. A weakness that makes a threat possible.
Attack. An action taken to harm an asset.
Countermeasure. A safeguard that addresses a threat and mitigates risk.

Enjoy the hacking!


Hack Insight Team

[Hack]in(Sight)
Editorial Section:
Authors: Renato Augusto, Vikas Kumar, Miroslav Ludvik, Francisco Caballero, Pragati Ogal Rai, Matthew Clapham, Krunoslav Rukavina, Rahul Jamgade, Jon Zeolla, Girish Kumar, Ahmed Rasul, Sameh Sabry.
Copy-editors: Agata Brzozowska, Manish Chasta, Dhawal Desai, Kevin McIntyre, Robrecht Minten, Zsolt Nemeth, Phil Quinan, Larry Pool, David Sanborn (Axiom), Andy Stern.
DTP: Anandu Ashokan, Jim Steele.
Publisher: Hack Insight Press Paweł Płocki
www.hackinsight.org
Editor in Chief: Paweł Płocki
p.pawel@hackinsight.org
All trademarks presented in the magazine were used only for informative purposes.


Table Of Contents
www.hackinsight.org
How to Estimate Development Security Maturity
Page 5: Seen the security design or development failures of middleware, antivirus, browsers, and other technologies lately? Worried about inheriting other people's security failures? How can anyone avoid others' design flaws? With a more secure design, of course!
Paros Proxy
Page 15: In the age we live in, technology has infiltrated the fabric of society and, in turn, our everyday lives, from social media to communication to business. Security has always been a concern for human beings, and technology is no different. As with any form of security, any vulnerability that allows an attacker to breach a system is a serious threat.
uPrint.. iHack..

Page 22: I print and you hack? Is that what you are trying to say, Sameh? Am I in danger having a printer close to me? The answer to the above-mentioned questions is YES.

Web App Sec - Basic Understanding


Page 26: Information security deals with protecting information from being leaked, destroyed, etc. To get at that information, attackers try to break into an organization through its network, Internet-facing applications, and so on.

XSS
Page 37: These days most websites are dynamic websites that provide user-specific information according to the profile and other settings of the user. This is different from static websites, which show the same content to all users who visit the site.
Web App Pentesting Methodology
Page 50: In this article, we will show the phases and the most important tests to perform when carrying out a vulnerability assessment on a Website.

Next-generation SOC environments monitoring


Page 60: I would like to start with a phrase from the famous military strategist Sun Tzu (The Art of War), which I believe fits perfectly our IT security challenge of fending off cyber war, cyber attacks and cyber terrorism.

Web App Pentesting Methodology
In this article, we will show the phases and the most important tests to perform when carrying out a vulnerability assessment on a Website.

Introduction
The goal of this article is to explain how to use a variety of tools to carry out a vulnerability analysis of a website, and to introduce various methods to identify and exploit vulnerabilities.
To do this, we will divide the article as follows: information gathering, automatic testing (scanners) and manual testing. Note: in this article we do not discuss source code assessment, another phase in the Web Application Pentesting Methodology. To carry out this task and the others described in this article, we recommend that readers review "The Open Web Application Security Project" (OWASP [1]).
In order to follow the steps in this article and try the various techniques to identify vulnerabilities, it is recommended that one of the following frameworks be installed: DVWA [2], Web Security Dojo [3] or Web For Pentester [4].
Note: Remember that launching any technique or intrusive tool against a Website without authorization is a crime in most countries.

Overview of the methodology (phases and their tasks):
Information Gathering: extract metadata, creation of a dictionary, download of the Website, online tools, identification of email accounts, identification of virtual hosts.
Automatic Testing (scanners): launch of tools (free & commercial), spidering, interesting files, brute force of folders and files, fuzzing.
Manual Testing: testing vulnerabilities, surfing the Website, identifying components & plugins, headers, HTTP methods, sessions, certificates, etc., manipulation of parameters, analysis of Flash, Java and other files, authentication system.
1. https://www.owasp.org/index.php/Main_Page
2. http://www.dvwa.co.uk/
3. http://www.mavensecurity.com/web_security_dojo/
4. https://pentesterlab.com/web_for_pentester.html

Information gathering
In this first phase we will try to gather as much information as possible; afterwards we will use it to carry out more complex and specific attacks against the applications analyzed.
A good starting point is to get to know the target Website as well as possible by downloading its structure, files, and any other relevant information. To accomplish this task, we use the following tools: wget, httrack [5].
./wget -rck http://<WEBSITE>
Note: -r recursive mode, -c continue an interrupted download and -k, after the download, convert the links so the local copy can be browsed correctly.
If the Website analyzed is available online, we can speed up some of the information gathering usually done in the manual testing stage, such as HTTP headers and the identification of several CMSs and their versions (the metadata extraction is based on the whatweb software), and assess its level of security based on this data. To accomplish this task, there is an online tool on the Internet, desenmascara.me [6]:

Another task that we perform in this phase is to identify as many email accounts as possible, in order to have valid user names for different application areas. To accomplish this task, we could use the following tools: theharvester [7], maltego [8], msfcli (metasploit [9]), among others.
./theharvester.py -d <WEBSITE> -l 500 -b google
Note: In this example we limit the Google search to 500 results.
With msfcli (metasploit):
./msfcli modules/auxiliary/gather/search_email_collector DOMAIN=<WEBSITE> E > output_emails.txt
Also, we need to perform a search for documents within the Website, in order to identify those that contain metadata and extract from them further information such as user names, versions, internal devices, etc. To accomplish this task, we use the following tools: Metagoofil [10], FOCA [11].
./metagoofil.py -d <WEBSITE> -t doc,pdf -l 200 -n 50 -o domainfolder -f output_files.html
Note: In the above example, we limit the search to 200 results per file type, with a maximum of 50 files to download, and save the files in the directory domainfolder.
5. http://www.httrack.com/
6. http://desenmascara.me/
7. https://code.google.com/p/theharvester/
8. http://www.paterva.com/web6/products/maltego.php
9. http://www.metasploit.com/
This task can also be performed manually through a Google search, as indicated in the following example:
site:www.url.com ext:pdf intitle:"Documents and settings"
Besides searching for metadata and email accounts and downloading the site, we also try to identify the different Website management interfaces.
One task that would be very interesting to do is to create a custom dictionary from the Website. To carry out this task, we can use the script cewl.rb [12]:
./cewl.rb --depth 2 --min_word_length 5 --write output_word_list.txt http://<WEBSITE>
Note: In the above example, the --depth option specifies the crawl depth, in this case 2, and --min_word_length specifies the minimum word length, here 5 characters.
To complete this phase, we will perform a search for other domains that are hosted on the same IP (virtual hosts). To accomplish this task, we can use the tool revhosts [13]:
./revhosts -pig -vhh <IP_WEBSITE>
Similarly, one could perform this same search through Bing, with the following request:
IP:<IP_Website>

Automatic Testing (Scanners)


In this phase, we will identify the largest number of vulnerabilities with free tools like Nikto [14], w3af [15], skipfish [16], Arachni [17] and ZAP [18], or commercial tools such as Acunetix [19], AppScan [20], WebInspect [21] and Netsparker [22], among others.
10. http://www.edge-security.com/metagoofil.php
11. http://www.informatica64.com/foca.aspx
12. http://www.digininja.org/projects/cewl.php
13. http://securitytnt.com/revhosts/
14. http://www.cirt.net/nikto2
15. http://w3af.org/
16. https://code.google.com/p/skipfish/
17. http://arachni-scanner.com/


This phase is important as it provides some advantages, such as: spidering, discovery of default content, low-hanging fruit, quick results on low-risk websites and wide coverage in a short time. Scanners do, however, have certain limitations: each application is different, they operate on syntax rather than meaning, the quantity of findings is not the same as quality, unit-style testing is not the same as security testing, and scanners cannot improvise or react to the application.
To begin, we will search for interesting files on the Website (such as robots.txt, .gitignore, .svn, .listing, .DS_Store, among others). To accomplish this task, we could use the following tool: FOCA.
Note: When analyzing the robots.txt file, keep in mind that it can reveal content that search engines are asked to skip, thus giving us a starting point for files and directories that are not referenced anywhere else.
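As a quick manual check of a couple of these paths (a minimal sketch; wget is reused here only for convenience and the file names are just the examples listed above):
./wget -q -O - http://<WEBSITE>/robots.txt
./wget -q --spider http://<WEBSITE>/.gitignore && echo ".gitignore present"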
Furthermore, we perform an automatic scan of the Website, spidering it to find and identify all of its URLs. For this task, we could use the following tools: GoLISMERO [23] and spider (from Flu-Project), among others.
./GoLISMERO.py -c -m -t <WEBSITE>
Another task at this phase is to perform a brute force attack against files and directories, in order to identify common application content that is not linked from anywhere. To accomplish this task, we could use the following tools: dirb [24] and dirbuster [25].
./dirb http://www.<WEBSITE>/ wordlists/big.txt -o output_file.txt
On the other hand, one of the phases where the most time can be (and usually is) spent is fuzzing the various parameters, directories and other inputs, in order to identify different types of vulnerabilities such as XSS, SQLi, LDAPi, XPathi, LFI or RFI. To carry out this task, we could use the following tools: PowerFuzzer [26], Pipper or Burpproxy [27] itself (Repeater & Intruder tabs).
./pipper.pl "URL/[file]" -v file=big.txt -hc 404
Note: It is important to have a good fuzzing dictionary in order to find and identify vulnerabilities; a good example is fuzzdb [28].
To conclude this section, it is interesting to use more generic and comprehensive tools for analyzing Web vulnerabilities. To accomplish this task, we could use the following tools: ZAP, Burpproxy, w3af, Arachni, Nikto, Grabber, Wapiti, Webshag, or commercial tools such as AppScan, WebInspect, Netsparker, etc.

18. https://www.owasp.org/index.php/OWASP_Zed_Attack_Proxy_Project
19. http://www.acunetix.com/
20. http://www-01.ibm.com/software/awdtools/appscan/
21. https://download.hpsmartupdate.com/webinspect/
22. http://www.mavitunasecurity.com/netsparker/
23. https://code.google.com/p/golismero/
24. http://www.open-labs.org/
25. https://www.owasp.org/index.php/Category:OWASP_DirBuster_Project/es
26. http://www.powerfuzzer.com/
27. http://portswigger.net/burp/proxy.html
28. https://code.google.com/p/fuzzdb/

Manual Testing
In the last phase, we will try to bring together and use all the information gathered in the previous phases (information gathering and scanners). To do this, we will perform numerous tests manually to identify potential vulnerabilities that we have not detected at earlier stages. This phase has a number of additional benefits: it is intelligent, it can handle the limitations of the previous phases and it eliminates false positives. On the other hand, manual testing has its own limitations: it is time-consuming, and it is hard to guarantee coverage of every field.
The first task to perform manually is to browse through the Website to identify other elements that we have not previously identified. To accomplish this task, we could use the following tools: Burpproxy, ZAP, sitescope or Firefox, etc.
A second task will be to try to identify the components and plugins that the Website has enabled, such as the following types of CMS (Content Management Systems): Joomla components, Wordpress plugins, Php-Nuke, Drupal, Movable Type, custom CMSs, Blogsmith/Weblogs, Gawker CMS, TypePad, Blogger/Blogspot, Plone, Scoop, ExpressionEngine, LightCMS, GoodBarry, Traffik, Pligg, Concrete5, Typo3, Radiant CMS, Frog CMS, SilverStripe, Cushy CMS, etc.
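A convenient first step for this identification is whatweb, the same fingerprinting engine already mentioned as the basis of desenmascara.me (a minimal sketch; -v simply enables verbose output):
./whatweb -v http://<WEBSITE>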
After identifying the type of CMS or the components the Website uses, we proceed to search for known vulnerabilities associated with them and try to find others that may not have been discovered yet. To accomplish this task, we could search the Internet for vulnerabilities associated with the component and/or plugin, or use specific tools such as joomla Scan [29] or cms-explorer [30].
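For example, cms-explorer is typically driven as follows (a minimal sketch; the -type value depends on the CMS identified and the exact options may vary by version):
./cms-explorer.pl -url http://<WEBSITE> -type wordpress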
The next task will be the analysis of the server headers, to identify the server type and version, among other information. To accomplish this task, we could use any tool such as a proxy, a simple telnet connection to the Website, or simply by typing the target into desenmascara.me, the online tool mentioned earlier in the Information gathering stage.
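For instance, a quick way to pull the headers from the command line (a minimal sketch; curl is simply one convenient client for this, any proxy or telnet session shows the same data):
curl -s -I http://<WEBSITE>
Note: Look at the Server, X-Powered-By and Set-Cookie headers in the response.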

29. http://www.enye-sec.org/programas/joomla-scan/
30. https://code.google.com/p/cms-explorer/

As part of this third phase, fingerprinting should be done to identify the architecture and configuration of the site. To perform this task, we could use the tool httprint [31].
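httprint is usually invoked along these lines (a minimal sketch; signatures.txt is the signature file shipped with the tool and the options may differ between versions):
./httprint -h http://<WEBSITE> -s signatures.txt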

One of the most important tasks at this stage is the modification of parameters, in order to identify any errors and/or vulnerabilities. To accomplish this task, we could use any proxy to manipulate the requests to the Website.
Note: This task complements and/or forms part of the fuzzing of the Website's parameters.
31. http://www.net-square.com/httprint.html
On the other hand, there are many tasks we must undertake to identify specific vulnerabilities through the modification of parameters, such as:
Alteration of the normal operation of the application by means of single quotes, null values (%00), carriage returns, random numbers, among others. This allows us to obtain different types of errors when analyzing the application, and it could lead to numerous Web vulnerabilities. To accomplish this task, we could use a proxy and manipulate the Website parameters. For example, in PHP we can turn a parameter into an array by ending it with []; this could cause an unhandled error and provide application information.
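A few illustrations of such manipulated requests (the parameter name id and the path are purely illustrative):
http://<WEBSITE>/view.php?id=1'       (append a single quote)
http://<WEBSITE>/view.php?id=1%00     (inject a null value)
http://<WEBSITE>/view.php?id[]=1      (send a PHP array instead of a scalar)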
Identification and verification of path disclosure through the generation of unhandled errors. To perform this task we could use a proxy or an automatic tool like inspathx.
Identification and verification of vulnerabilities like cross-site scripting, SQL injection, XPath injection, SSI, CSRF and clickjacking, among others.
Identification and verification of iframe injection. To carry out this task we can modify a parameter in the URL, changing something like:
id=folder/file.html to id=http://www.[external-domain]
Manual identification and verification of CSRF (Cross-Site Request Forgery). To accomplish this task, we could try the forms (usually where this vulnerability is most often found). To check this, you will need to copy an original request (GET/POST) on a form, then make a change in the parameters and re-send the same modified request. If the server does not return an error, it can be considered vulnerable to CSRF. To perform this task, we can use the csrftester or Burpproxy tools.
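A bare-bones proof of concept for this check could look like the following sketch (the action URL and field name are hypothetical and must be taken from the request captured with the proxy):
<html>
<body onload="document.forms[0].submit()">
<!-- auto-submits a forged state-changing request using the victim's session -->
<form action="http://<WEBSITE>/profile/update" method="POST">
<input type="hidden" name="email" value="attacker@example.org">
</form>
</body>
</html>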
Identification and analysis of the different file extensions, which allows us to know specifically the type of technology used on the Website.
Identification of the error handling triggered by modifying parameters. This task aims to generate controlled (or uncontrolled) errors that make the Website provide us with information such as versions, internal IP addresses and other details of the technology used.
Identification and verification of SSL certificates. To accomplish this task, we can use openssl [32] (as well as the TLSSLed [33] tool, which allows us to verify this SSL information automatically). Finally, we need to determine the validity period of the certificates.
Information gathering on the SSL certificates:
./openssl s_client -connect <WEBSITE>:443
Note: Shows the certificate and the parameters accepted by the server.
Testing SSLv2:
./openssl s_client -no_tls1 -no_ssl3 -connect <WEBSITE>:443
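The TLSSLed script mentioned above automates these openssl checks; it is normally run against a host and port as follows (a minimal sketch; the script name may include a version suffix depending on the release):
./TLSSLed.sh <WEBSITE> 443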

Identification and verification of the encodings supported by the Website. To accomplish this task, we could use the tool EcoScan [34].
Identification and testing of the HTTP methods allowed by the Website. To accomplish this task, we could use a proxy or any client that allows us to interact with the Website; once the connection is made, we test the different HTTP methods (HEAD, PUT, OPTIONS, TRACE, PROPFIND, CONNECT).
Note: With the cadaver tool, we could exploit the WebDAV methods (if they are enabled).
./telnet <WEBSITE> 80
Trying 67.X.X.18...
Connected to <WEBSITE>.
Escape character is '^]'.
OPTIONS / HTTP/1.1
Host: 67.X.X.18
HTTP/1.1 200 OK
Date: Mon, 15 Apr 2013 16:13:08 GMT
Server: Apache
Allow: GET,HEAD,POST,OPTIONS
32. http://www.openssl.org/
33. http://blog.taddong.com/2013/02/tlssled-v13.html
34. http://open-labs.org/
Identification of and search for comments, variables, debug information, values and other information that should not be in the HTML. To accomplish this task, we need to view the source code of the application and search for keywords, variables or comments that could give us some interesting information.
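Since the site was already mirrored with wget during information gathering, a simple grep over the local copy is a quick way to hunt for such leftovers (a minimal sketch; the directory name and keyword list are only examples):
grep -rniE "password|todo|fixme|debug|<!--" <WEBSITE>/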
Identification and analysis of Flash files on the Website. To accomplish this task, the first thing to do is identify and download all the Flash files that exist on the Website. To do this, we could use the Google search engine:
filetype:swf site:domain.com
On the other hand, we could also find the .swf files with the wget tool:
./wget -r -l1 -H -t1 -nd -N -A.swf -erobots=off <WEBSITE> -i output_swf_files.txt
Note: In the above example, -r -l1 searches recursively but only one level down from each directory found; -t1 makes only one connection attempt; -nd does not recreate the directory structure and saves the files directly in the current directory; -N preserves the timestamp of the original downloaded file; -np (no parent) would prevent following links to parent directories, keeping only the current level and one down for -r -l1; -A.swf tells wget to download only files of this type, in this case only "swf"; and -erobots=off makes wget ignore any 'robots.txt' that might tell crawlers (including wget) to skip certain subdirectories, so that we cover the whole level.
Once we have identified and downloaded the *.swf files, we must analyze the code, the functions (such as loadMovie) and the variables, in order to identify those whose calls allow other types of vulnerabilities such as cross-site scripting. Some vulnerable patterns are shown below:
_root.videourl = _root.videoload + '.swf';
video.loadMovie(_root.videourl);
getURL - payload: javascript:alert('css') getURL(clickTag, '_self')
load* (in this case: loadMovie) - payload: as function.getURL, javascript:alert('css')
TextField.html - payload: <img src='javascript:alert("css")//.swf'>
To accomplish this task, we could use the tools Deblaze [35] and SWFIntruder [36], among others. We should also analyze the AllowScriptAccess parameter, Flash Parameter Pollution, and sensitive APIs such as:
loadVariables, loadVariablesNum, MovieClip.loadVariables, LoadVars.load, LoadVars.sendAndLoad
XML.load, XML.sendAndLoad
URLLoader.load, URLStream.load
LocalConnection
ExternalInterface.addCallback
SharedObject.getLocal, SharedObject.getRemote
35. http://deblaze-tool.appspot.com/
36. https://www.owasp.org/index.php/Category:SWFIntruder
Identification and analysis of Java files; to carry out this task we could use the tool Jad.
Identification and analysis of .Net files; to carry out this task we could use the tool Dis#.
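For the Java case, a downloaded class or applet can be decompiled with Jad roughly as follows (a minimal sketch; the class file name is illustrative and flags may vary slightly between Jad builds):
./jad -o -sjava -d decompiled_src AppletName.class
Note: -o overwrites existing output without prompting, -sjava writes .java source files and -d sets the output directory.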
Another task to be done is to check session handling, where we try to identify how the sessions have been implemented and how they behave. To accomplish this task, we need to reverse engineer the session tokens and manipulate them once we know how they work. Also, in some cases we may find that the session IDs expose sensitive information linked to the identity of a user.
Besides session control, one of the areas that can give us the most joy if we exploit it is the identification and analysis of authentication systems. To do this, one of the first things to determine in the authentication area is whether the Website stores the credentials in the browser.

Normally we should proceed with attacks on default accounts and dictionary attacks. When we have identified an authentication system, we detect its architecture and then try the usernames and passwords known for that architecture and implementation, plus other default accounts such as: admin, administrator, root, system, user, default, the application name, among others.
To carry out this attack we can do it manually or with tools like hydra [37].
./hydra -L users.txt -P pass.txt <WEBSITE> http-head /private
It is also important to brute force default credentials, to try to bypass the authentication and check whether the authentication system is configured and protected properly.
On the other hand, we should try to gain access to those areas which initially/supposedly are only accessible to authenticated users, commonly via directory traversal (cross directories). It is also important to verify that the connections and settings that make up the typical credential recovery system are configured correctly.
Finally, we note that all these tests and phases of a Web vulnerability assessment (scanners and manual testing) can be carried out as white box (with credentials for the application) or black box (without application credentials) exercises.
37. http://www.thc.org/thc-hydra/

Conclusion
As we have seen, for a complete analysis of the vulnerabilities in a Website we must take into account many variables and carry out all the steps described in this article (from information gathering to manual analysis, supported by a good automatic analysis with the many tools available).
Finally, it is important to note that each of the tests performed was conducted under certain conditions which may be subject to change (updates, configuration changes, etc.); therefore, when trying to replicate similar testing on the same websites, the results may differ.

About the author

Francisco Caballero is a Security & Forensic Analyst at S21sec [38] (Spain); he has a Master of Science in Digital Investigation and Forensic Computing from University College Dublin (UCD). He has previously worked as Technical Director at S21sec Mexico and International Support Manager at S21sec Argentina. Francisco Caballero is also a founder of White Hack Conferences, and he has wide experience as a computer security presenter at many computer security conferences in Mexico and Spain. His research interests span the areas of forensic computing, cybercrime investigations, web app security and pentesting. He has written many papers on pentesting and forensics and usually works as a security researcher. Recently he has been participating in the first cyberdefense exercise in Spain and the first cyber escalation Table Top (TTX) exercise with the Cyber Security Forum Initiative (CSFI [39]).

38. http://www.s21sec.com
39. http://csfi.us/

Thank you for reading our magazine from cover to cover. Please share your comments about this issue with us on Twitter or Facebook:

@Hackinsight
http://www.facebook.com/hackinsight

Become our Beta Tester - send us a message to:


info@hackinsight.org

The techniques described in our articles may only be used in private, local networks. The editors hold no responsibility for misuse of the presented techniques or consequent data loss.
