It has begun……..

Knowledge is an endless road, if you’re hungry enough for it.
“Trust yourself. You know more than you think you do.”

“Build up virtue, and you master all.” – Lao-tzu, Tao Te Ching

Black Hat 2014 – Day 1

The first day has come and gone here at Black Hat 2014. The common theme across the briefing discussions was point-of-sale (POS) security, given the exposure from the Target breach. Amongst all the variations of these discussions, it came down to legacy deployments and the slow pace of remediation and improvement needed to keep up with threats that continue to increase.

Aside from the briefings, the Arsenal sessions continue to not disappoint. Many smart minds demonstrated their products, as well as many ways to leverage existing open source tools.

This year SecureNinja hosted “Kazian 2.0”, another CTF, and I had an opportunity to participate with my son. It was definitely a full CTF simulation, but I was not prepared to get my mind into that state, especially after a long day.

All in all, the 1st day was very productive, and let’s not leave out the after-party events, which are always a blast to attend.

On to day 2…..



Black Hat / Defcon 2014

The gathering of professionals, security enthusiasts and geeks of the world is once again here for 2014. Black Hat will be hosted at the Mandalay Bay and Defcon 22 at the Rio Hotel & Casino. I look forward to the vast amount of information being delivered in the briefings this year, and I will once again post my feedback for each briefing that I attend.

Stay tuned and see you there…..

Pentesting Web Applications The Process – “Not Just Another Report”

Pentesting Web Applications is usually conducted quarterly or annually by a third-party vendor to ensure segregation of duties and regulatory requirements are met. The level of testing depends on the complexity of the application, often requiring specialized knowledge of the application and its development processes, which can be very time consuming. These circumstances can result in varying costs in the scope of work.

Web Application testing may be mandated depending on your company’s regulatory requirements. To meet compliance, more frequent assessments may be required. Engaging third-party vendors on a monthly basis could be cost prohibitive. Therefore, companies face decisions on which option to take, such as leveraging cloud solutions that provide a level of website vulnerability management, or hiring a dedicated staff member for the diligent efforts involved in testing.

Designating a staff member to focus specifically on Web Application testing as their primary role is not always possible. The typical challenge companies deal with is the limited staffing required just to support day-to-day security operations. Web Application testing cannot simply be added as a daily activity for a security operations team; it is a highly specialized skill-set not typically practiced by common security professionals. In addition, an attempt by a non-seasoned professional would not produce the quality of testing that needs to be conducted against the web application.

Using a cloud-based testing service can alleviate some of the overhead of performing regular testing, or the need for additional staffing, by offering assessment services that include monthly reporting. The service offerings on certain programs vary in pricing when a deeper evaluation of the application is required. However, the report distribution is just that, “just another report” highlighting the findings based on the OWASP top 10 to be presented to the developers.

Developers review the results, glance over terms such as the common XSS and SQLi, and ask “then what?” or “what does that all mean?” Fundamentally, this is where the cloud service ends. This is not to say that cloud services don’t offer the option of weekly discussions with the team and developers, explaining the findings and providing examples of what they mean at an exploit level; it simply requires a more costly program when choosing a cloud solution.

In my experience, such services lack certain aspects that only an internal assessment program can deliver. Some advantages include the capability for deeper evaluation testing, the benefits of collaborating with the developers, and institutional knowledge of the development lifecycle process within the company. Developers appreciate a more personal interaction to assist in reviewing reports, explaining the findings and demonstrating the exploits against the vulnerabilities identified, rather than just a monthly report on findings based on the OWASP top 10 with no context on what it all means. This interaction makes the findings more actionable and ensures proper focus on the remediation efforts prior to a new code release.

I was assigned the task of developing a program that would allow the company to integrate Web Application assessment activities as an operational duty. Unfortunately, there isn’t an out-of-the-box “how-to” for aligning such a program as part of a security professional’s daily function. The process outlined here is my effort to establish such a program, and it may not be a fit for all environments.

I started by interviewing the current staff members and discovered that skill-sets with regard to Web Application testing were limited and that only the concepts were understood. Fortunately, one individual had prior experience. The next effort for the program required balancing the time that would be spent by current staff members and establishing a repeatable process that could be executed monthly at a testing level equivalent to a third-party engagement.

The developers have their own process for testing during code reviews, focused mainly on functionality with minimal attention to security. This development culture is slowly improving, with developers aligning their coding with security best practices to ensure the functionality is secure. This additional process would complement what the developers are already doing and strengthen the testing validation throughout the SDLC, improving the security posture in preparation for the quarterly or annual testing engagement by external testers.

At this point we have established a timeline that coincides with the developers’ process. The next effort was to evaluate the necessary skill-sets of team members and the tools required. I would like to highlight how important it is that the security personnel designated for these activities have in-depth experience with the tools and processes of pentesting web applications, to ensure the rigorous testing activities are met within the program timeline.

To ensure that the security team member can manage their time to support other duties during the day, products such as Core Security or Rapid7 can assist with the aggressive timeline outlined. These products offer a level of automated workflow throughout the assessment to reduce the effort and time spent validating results. Depending on the complexity of the application, certain vulnerability findings will require manual validation to ensure common security frames are tested further, including data/input validation, authentication and session management, to name a few. Web application testing tools such as Burp Suite and w3af are commonly used by security professionals to assist with testing these common frames.
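As a minimal illustration of what one piece of that manual validation can look like, the sketch below classifies how a probe string comes back in a response body. This is not from the article or any specific tool; the marker string and function are hypothetical, and in practice the response text would come from a request replayed in a proxy tool such as Burp Suite.

```python
# Sketch of one manual-validation check for an input-validation finding:
# does the response reflect an injected marker without HTML-encoding it?
# A raw reflection is a candidate XSS worth investigating further; an
# encoded reflection suggests output encoding is in place and the
# scanner hit may be a false positive.

MARKER = '"><xss-probe>'  # hypothetical probe string

def classify_reflection(response_body: str, marker: str = MARKER) -> str:
    """Return 'reflected-raw', 'reflected-encoded', or 'not-reflected'."""
    if marker in response_body:
        return "reflected-raw"        # candidate XSS: validate by hand
    encoded = (marker.replace("<", "&lt;")
                     .replace(">", "&gt;")
                     .replace('"', "&quot;"))
    if encoded in response_body:
        return "reflected-encoded"    # output encoding appears effective
    return "not-reflected"
```

A check like this only triages; confirming exploitability still requires the hands-on work in the proxy that the paragraph above describes.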

My approach is to align with the Software Development Life Cycle of new code releases and establish a timeline for a repeatable process. This strategy allows seamless integration of the process and the time required to collaborate with the developers on any findings. The discussion helps the developers prioritize any remediation effort and understand the risks of the discovered vulnerabilities. The timeline captured here may vary in other environments depending on the frequency of code releases by the development group. For the purpose of this article, it is based on a 30-day code release cycle across multiple web applications.

Figure 1 outlines an example of the weekly activities that need to transpire before the next code release, on a month-to-month basis. The proposed timeline would need to be presented and accepted to give the developers ample time to incorporate any remediation changes into their code.


Figure 1 – Example Program Timeline


The breakdown of the Program Timeline is as follows:

  • Week 1 – Initiate web application scanning for vulnerabilities
  • Week 2 – Review results, validate findings to eliminate false positives, and capture evidence
  • Week 3 – Create the report, distribute results and discuss findings
  • Week 4 – Developers incorporate any code remediation based on findings

The first week of the month is spent preparing and scheduling the scanning activities. This includes adjusting any jobs within the Web Application vulnerability tool to reflect the upcoming scan schedule, submitting and following the change request process to obtain the appropriate approvals, and executing and monitoring the scans (figure 2). This process can vary depending on when the change management process fully approves the request and on the number of web applications to be tested, and may run longer or shorter than the timeline outlined.
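The week-1 sequencing above can be sketched in a few lines. This is not the author's tooling; the function, site names and window sizes are assumptions, showing only the idea of leaving a change-approval lead time before the first scan and spreading the remaining scans across fixed windows.

```python
# Hedged sketch: schedule one scan window per application, delaying the
# first scan by the change-request approval lead time, then spacing
# subsequent scans at a fixed interval so they can be monitored in turn.
from datetime import datetime, timedelta

def schedule_scans(sites, first_window: datetime,
                   approval_lead_days: int = 2, window_hours: int = 24):
    """Return {site: scan start datetime}, one window per site."""
    start = first_window + timedelta(days=approval_lead_days)
    return {site: start + timedelta(hours=i * window_hours)
            for i, site in enumerate(sites)}

# Example: change request filed for the evening of Jan 7, two-day
# approval lead, one overnight window per application.
schedule = schedule_scans(["app1", "app2"], datetime(2013, 1, 7, 22))
```

Knowing these start times up front is what makes the prioritized validation in the following weeks workable.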

Figure 2 – Example Scanning Timeline


If scanning multiple web applications, I find that keeping a compiled list of sites and hosts throughout the scan process, either in a spreadsheet or as an export from the vulnerability tool, helps in prioritizing the validation efforts, since you know when each scan will start.

Figure 3 – Example compiled list
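A compiled list in the spirit of figure 3 can be generated rather than maintained by hand. The inventory below is hypothetical (the site name, URL and host addresses are placeholders, not from the article): one external URL plus the internal web-farm hosts behind it, yielding three scan rows per site.

```python
# Sketch: build a flat scan checklist from a site inventory, one row per
# scan target, tagged with the vantage point the scan runs from.
SITES = {
    "www.example-site.com": {                      # placeholder site
        "external": "https://www.example-site.com",
        "internal": ["10.0.1.11", "10.0.1.12"],    # placeholder web farm
    },
}

def compiled_list(sites):
    rows = []
    for name, endpoints in sites.items():
        rows.append({"site": name, "target": endpoints["external"],
                     "vantage": "external"})
        for host in endpoints["internal"]:
            rows.append({"site": name, "target": host,
                         "vantage": "internal"})
    return rows
```

The resulting rows can be dumped to a spreadsheet and checked off as each scan completes.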

You may be asking why scan the external site and the internal hosts, essentially three times per site as listed in figure 3.

The concept here is multiple avenues:

  • To identify the consistency and variances of the code released for each web farm
  • Compare internal and external vulnerability results
  • Test the external compensating controls (i.e., Web Application firewalls, IPS)
  • Validate and test internal results not identified by an external scan to determine false positives

In short, findings can be identified internally that are not exposed externally, and validating the results can confirm that they are not exploitable from the outside due to compensating controls such as web application firewalls. With those controls in place, the developers have time to further review the vulnerabilities and prioritize the remediation effort for the next code release.
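That internal-versus-external comparison reduces to a set difference. The sketch below is an assumption about how one might organize it, not output from any particular scanner; the finding identifiers are placeholders.

```python
# Sketch: split findings into those visible from outside (priority
# remediation) and those seen only internally, which are candidates for
# "mitigated by a compensating control" such as a WAF and can be
# scheduled for a later code release.

def triage(external_findings: set, internal_findings: set):
    return {
        "exposed": sorted(external_findings),
        "internal_only": sorted(internal_findings - external_findings),
    }
```

Anything in the `internal_only` bucket still gets validated; the point is that the compensating control buys the developers time, not that the finding can be ignored.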

The timeline for the second week of the month outlines the hours the engineer will spend reviewing the scan results while maintaining other daily duties (figure 4). The assessment activities include confirming and validating findings from the scan reports produced by the Web Vulnerability tool, and capturing screenshot evidence of each confirmed exploit for incorporation into the final report. This week’s process will also vary and can require a good amount of manual validation effort, depending on what has been discovered. I found that scheduling the manual testing of multiple web applications toward the end of the week allowed the common testing to be validated and completed more efficiently, giving you the ample time and focus required for the thought process involved in manual validation.

Figure 4 – Example Operations Timeline

The activities during this week will be the most involved and aggressive of the timelines, due to the multiple Web Applications being tested. The compiled list from figure 3 acts as a checklist for tracking which applications have been completed. You can review the results and confirm the vulnerabilities to identify false positives in parallel. Identifying the false positives and excluding them from the results is what makes the report actionable content for the developers. This is where the process provides its value and efficiency, and is “not just another report”.
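The triage bookkeeping described above can be kept as simple as a status field per finding. This sketch is an assumption about one workable structure (the statuses and finding IDs are placeholders): only confirmed findings reach the developer-facing report, and anything still pending stays on the validation checklist.

```python
# Sketch of week-2 bookkeeping: track each scanner finding's validation
# status, then filter so false positives never reach the developers and
# pending items remain visible for manual validation.

def actionable(findings):
    """findings: list of dicts with 'id' and a 'status' of
    'confirmed', 'false_positive', or 'pending'."""
    confirmed = [f for f in findings if f["status"] == "confirmed"]
    pending = [f for f in findings if f["status"] == "pending"]
    return confirmed, pending  # false positives are dropped entirely

report_items, still_to_validate = actionable([
    {"id": "XSS-1", "status": "confirmed"},
    {"id": "SQLi-2", "status": "false_positive"},
    {"id": "CSRF-3", "status": "pending"},
])
```

Keeping the false positives out of the report is the whole point: the developers only ever see items someone has already confirmed.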

In collaborating with my network of professional consultants, the common process observed during their client engagements is an investment in an Enterprise Vulnerability Management tool as a one-click solution. Teams rely on the tool to initiate the scans, produce the results and auto-email the reports to the developers. The developers review the report and dispute the results with the security team, leading to debates over who’s right and who’s wrong. This is where the process breaks down and delays the remediation efforts.

The third week involves compiling all evidence and creating the final report. The report is structured as an Executive Summary that includes the screenshots demonstrating the validity of the findings and the probability of exploit. This is a custom report, not one produced by the scanning tool, so that false positives are not included in the findings. When the report is completed, schedule a review meeting and distribute the results to discuss the findings with the development team. The review meeting is the opportunity to answer developers’ questions about the results and to assist in prioritizing the actionable line items for remediation in the upcoming code release. A separate technical report with the details of the vulnerabilities can be produced for the team during the final week, or upon request shortly after the meeting.

The final week allows the developers to go over the detailed technical report and make any adjustments to be incorporated into that month’s code release.

In conclusion, the key success factor in this process is an established relationship between the security department and the developers. Building that relationship directly with the whole development team can be quite challenging, so a liaison within the development group can be just as effective. This process in no way makes light of how a full penetration test of a web application should be conducted. Rather, it addresses the common challenges of busy IT Security departments with a process that allows them to incorporate and execute an assessment program as part of their daily duties.

The outcome of these efforts is a repeatable, manageable process for the security team to pentest web applications, balanced against daily operational activity by concentrating roughly one week of focus on the assessment each month. The findings presented are actionable to the point that when the developers receive the results, what is presented is communicated clearly and explained so that it is understood, and is “not just another report”.

Authored by Rey Ayers and published in the 01/2013 Issue of “Pentest Magazine WebApp”

Government-Funded Hackers Say They’ve Already Defeated Windows 8’s New Security Measures

Last week’s Windows 8 launch wasn’t just a major product release for Microsoft. It seems to have been a banner day for the government-funded hackers who take Microsoft’s software apart, too.

On Tuesday the French firm Vupen, whose researchers develop software hacking techniques and sell them to government agency customers, announced that it had already developed an exploit that could take over a Windows 8 machine running Internet Explorer 10, in spite of the many significant security upgrades Microsoft built into the latest version of its operating system.

“We welcome #Windows 8 with various 0Ds combined to pwn all new Win8/IE10 exploit mitigations,” Vupen’s chief executive Chaouki Bekrar wrote on Twitter Tuesday, using an abbreviation for the industry term “zero-days” to refer to security vulnerabilities unknown to Microsoft that his team has discovered in the company’s software, as well as the hacker slang “pwn”, meaning to hack or take control of a machine.

READ THE REST HERE by Andy Greenberg, Forbes Staff FORBES.COM

Hacker’s Demo Shows How Easily Credit Cards Can Be Read Through Clothes And Wallets

Pull out your credit card and flip it over. If the back is marked with the words “PayPass,” “Blink,” that triangle of nested arcs that serves as the universal symbol for wireless data or a few other obscure icons, Kristin Paget says it’s vulnerable to an uber-stealthy form of pickpocketing. As she showed on a Washington D.C. stage Saturday, she can read all the data she needs to make a fraudulent transaction off that card with just a few hundred dollars worth of equipment, and do it invisibly through your wallet, purse, or pocket.

At the Shmoocon hacker conference, Paget aimed to indisputably prove what hackers have long known and the payment card industry has repeatedly downplayed and denied: That RFID-enabled credit card data can be easily, cheaply, and undetectably stolen and used for fraudulent transactions. With a Vivotech RFID credit card reader she bought on eBay for $50, Paget wirelessly read a volunteer’s credit card onstage and obtained the card’s number and expiration date, along with the one-time CVV number used by contactless cards to authenticate payments. A second later, she used a $300 card-magnetizing tool to encode that data onto a blank card. And then, with a Square attachment for the iPhone that allows anyone to swipe a card and receive payments, she paid herself $15 of the volunteer’s money with the counterfeit card she’d just created. (She also handed the volunteer a twenty dollar bill, essentially selling the bill on stage for $15 to avoid any charges of illegal fraud.)

READ THE REST HERE by Andy Greenberg, Forbes Staff FORBES.COM
