Archive for July, 2008

Compiling Capture-HPC on VMware Server 1.0.6

Monday, July 28th, 2008

We often use Capture-HPC as a high-interaction client honeypot for analyzing suspect URLs, but getting it up and running on a new platform can be a frustrating and time-consuming process. I’ve recently had to repeat the build process on the latest version of VMware Server (release 1.0.6, build-91891) running on Ubuntu Gutsy, so in case this saves anyone else some pain, here is what I had to do to make it work:

1) Download the latest sources (at the time of writing this was capture-server-2.1.0-300-src.zip)

2) Extract the latest sources

unzip capture-server-2.1.0-300-src.zip
cd capture-server-2.1.0-300-src

3) Ensure the necessary build dependencies are installed, then install VMware Server itself

sudo aptitude update ; sudo aptitude install ant ant-optional sun-java6-jdk sun-java6-bin sun-java6-jre
tar zxf VMware-server-1.0.6-*.tar.gz
(cd vmware-server-distrib && sudo ./vmware-install.pl)
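
The VIX libraries and headers that Capture’s revert helper needs should arrive as part of the VMware Server install. A quick sanity check that they landed in the default locations (the same paths used by the environment variables in the next step):

# assuming the default VMware Server / VIX install locations:
ls /usr/lib/vmware-vix/
ls /usr/include/vmware-vix/vix.h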

4) Set the correct environment variables

  JAVA_HOME=/usr/lib/jvm/java-6-sun-1.6.0.03/ ; export JAVA_HOME
  VIX_HOME=/usr/lib/vmware-vix/ ; export VIX_HOME
  VIX_INCLUDE=/usr/include/vmware-vix/ ; export VIX_INCLUDE
  VIX_LIB=/usr/lib/vmware-vix/ ; export VIX_LIB
  ANT_HOME=/usr/share/ant/ ; export ANT_HOME
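
The exact Java and VIX paths vary from system to system, so it’s worth confirming these directories actually exist before building. A minimal sketch:

# warn about any path that doesn't exist on this system
for d in "$JAVA_HOME" "$VIX_HOME" "$VIX_INCLUDE" "$VIX_LIB" "$ANT_HOME"; do
    [ -d "$d" ] || echo "missing: $d"
done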

5) Hack the revert compilation shell script so that it links against the copy of libvmware-vix.so in /usr/lib rather than $VIX_LIB:

chmod +x compile_revert_linux.sh
cat compile_revert_linux.sh
#!/bin/sh
echo $VIX_INCLUDE
#gcc -I $VIX_INCLUDE -o revert revert.c $VIX_LIB/libvmware-vix.so
gcc -I $VIX_INCLUDE -o revert revert.c /usr/lib/libvmware-vix.so
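
If you want to confirm the hack worked, run the script by hand and check which VIX library the resulting binary was actually linked against (just a quick sanity check):

# build the revert helper by hand and check its VIX linkage
./compile_revert_linux.sh
ldd ./revert | grep -i vix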

6) Remove the logic from build.xml that refers to the Windows OS branch, leaving something like this:

vi build.xml
<?xml version="1.0"?>
<project name="CaptureServer" default="release" basedir=".">
    <!-- all stuff to get the jni wrapper compiled -->
    <taskdef resource="net/sf/antcontrib/antcontrib.properties"/>

    <condition property="os" value="unix">
        <os family="unix"/>
    </condition>

    <property environment="env"/>
    <property name="src" value="."/>
    <property name="build" value="build"/>
    <property name="release" value="release"/>

    <target name="init">
        <mkdir dir="${build}"/>
        <mkdir dir="${release}"/>
    </target>

    <target name="compile" depends="init">
        <!-- Compile the java code -->
        <javac srcdir="${src}" destdir="${build}" debug="true" debuglevel="lines,vars,source"/>

        <!-- Compile the revert code -->
        <exec command="sh" executable="./compile_revert_linux.sh"/>
    </target>

    <target name="jar" depends="compile">
        <mkdir dir="${build}/jar"/>
        <jar destfile="${build}/jar/CaptureServer.jar" basedir="${build}">
            <manifest>
                <attribute name="Main-Class" value="capture.Server"/>
            </manifest>
        </jar>
    </target>

    <target name="release" depends="clean,compile,jar">
        <copy file="${build}/jar/CaptureServer.jar" todir="${release}"/>
        <copy file="./COPYING" todir="${release}"/>
        <copy file="./Readme.txt" todir="${release}"/>
        <copy file="./input_urls_example.txt" todir="${release}"/>
        <copy file="./config.xsd" todir="${release}"/>
        <copy file="./config.xml" todir="${release}"/>

        <exec executable="cp">
            <arg value="./revert"/>
            <arg value="${release}"/>
        </exec>

        <zip destfile="./CaptureServer-Release.zip" basedir="release"/>
    </target>

    <target name="clean">
        <delete dir="${build}"/>
        <delete dir="${release}"/>
        <delete>
            <fileset dir="." includes="revert.exe"/>
            <fileset dir="." includes="revert"/>
            <fileset dir="." includes="CaptureServer-Release.zip"/>
        </delete>
    </target>
</project>
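
As an aside, the <exec command=.../> form above is deprecated, which is why ant prints a warning during the build below. If you want a clean log, the equivalent non-deprecated form (as ant itself suggests) would be:

<!-- non-deprecated equivalent of the exec line above -->
<exec executable="sh">
    <arg value="./compile_revert_linux.sh"/>
</exec>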

7) Compile the Capture Server (the antcontrib taskdef warning below is harmless, as none of its tasks are used once the Windows logic has been removed)

ant
Buildfile: build.xml
  [taskdef] Could not load definitions from resource net/sf/antcontrib/antcontrib.properties. It could not be found.

clean:
   [delete] Deleting directory /home/david/client_honeypots/capture-server-2.1.0-300-src/build
   [delete] Deleting directory /home/david/client_honeypots/capture-server-2.1.0-300-src/release

init:
    [mkdir] Created dir: /home/david/client_honeypots/capture-server-2.1.0-300-src/build
    [mkdir] Created dir: /home/david/client_honeypots/capture-server-2.1.0-300-src/release

compile:
    [javac] Compiling 32 source files to /home/david/client_honeypots/capture-server-2.1.0-300-src/build
    [javac] /home/david/client_honeypots/capture-server-2.1.0-300-src/capture/ClientFileReceiver.java:9: warning: sun.misc.BASE64Decoder is Sun proprietary API and may be removed in a future release
    [javac] import sun.misc.BASE64Decoder;
    [javac]                ^
    [javac] /home/david/client_honeypots/capture-server-2.1.0-300-src/capture/ClientFileReceiver.java:42: warning: sun.misc.BASE64Decoder is Sun proprietary API and may be removed in a future release
    [javac]                             BASE64Decoder base64 = new BASE64Decoder();
    [javac]                             ^
    [javac] /home/david/client_honeypots/capture-server-2.1.0-300-src/capture/ClientFileReceiver.java:42: warning: sun.misc.BASE64Decoder is Sun proprietary API and may be removed in a future release
    [javac]                             BASE64Decoder base64 = new BASE64Decoder();
    [javac]                                                        ^
    [javac] Note: /home/david/client_honeypots/capture-server-2.1.0-300-src/capture/MockClient.java uses unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
    [javac] 3 warnings
     [exec] The command attribute is deprecated.
     [exec] Please use the executable attribute and nested arg elements.
     [exec] /usr/include/vmware-vix/
     [exec] revert.c:232:2: warning: no newline at end of file

jar:
    [mkdir] Created dir: /home/david/client_honeypots/capture-server-2.1.0-300-src/build/jar
      [jar] Building jar: /home/david/client_honeypots/capture-server-2.1.0-300-src/build/jar/CaptureServer.jar

release:
     [copy] Copying 1 file to /home/david/client_honeypots/capture-server-2.1.0-300-src/release
     [copy] Copying 1 file to /home/david/client_honeypots/capture-server-2.1.0-300-src/release
     [copy] Copying 1 file to /home/david/client_honeypots/capture-server-2.1.0-300-src/release
     [copy] Copying 1 file to /home/david/client_honeypots/capture-server-2.1.0-300-src/release
     [copy] Copying 1 file to /home/david/client_honeypots/capture-server-2.1.0-300-src/release
     [copy] Copying 1 file to /home/david/client_honeypots/capture-server-2.1.0-300-src/release
      [zip] Building zip: /home/david/client_honeypots/capture-server-2.1.0-300-src/CaptureServer-Release.zip

BUILD SUCCESSFUL
Total time: 2 seconds

8) Extract the newly built CaptureServer-Release.zip file into a suitable location (such as a new capture-server-2.1.0-300 directory).

9) Configure config.xml and run as normal, for example:

cd capture-server-2.1.0-300
vi config.xml
/usr/lib/jvm/java-6-sun/bin/java -Djava.net.preferIPv4Stack=true -jar CaptureServer.jar -s your_ip:7070 -f input_urls_example.txt

Hopefully Capture-HPC will work cleanly after that.

NOTE: If you experience problems running Capture and receive this error when attempting to start the server:

VIX Error on connect in connect: One of the parameters was invalid

check that your VMware Server installation is clean: remove VMware Server (vmware-uninstall.pl), find and delete any remaining vmware-related files under /usr, and then reinstall VMware. I found that one of my VMware Server upgrades had left a number of stale vmware-vix shared libraries on disk, and these seemed to cause the newly compiled Capture Server to fail to connect on revert.
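
A quick way to hunt for such leftovers after uninstalling (just a sketch; review the output before deleting anything):

# list any vmware files left behind after uninstalling; review before deleting
sudo vmware-uninstall.pl
find /usr -name '*vmware*' 2>/dev/null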

For more troubleshooting details, see this thread on the public Capture-HPC mailing list:

http://public.honeynet.org/pipermail/capture-hpc/2008-August/000431.html

The Sad State of IT Security

Monday, July 14th, 2008

On Friday I found out that my credit card had been used, by nefarious persons unknown, to buy £500 worth of goods online. Bad enough, but this is the second time this has happened in four years.

At this point I can hear the reader’s thoughts: stupid bugger, he’s been p0wned, got malware on his machine. Well, it’s possible. Like nearly everyone out there, my machine might have been 0wn3d by someone really good. Unless your name is H.D. Moore, there’s always someone out there better than you. But it’s unlikely. I know exactly what should be running on my machine, I know which programs can talk to the outside world, I look at tcpdumps and use a browser + OS combination that’s not currently targeted in the wild. I think I can be reasonably confident that the only malware on my machine is the stuff put there by me so I can study it.

So if my machine is clean (with high probability), I haven’t lost my card (100% certain as I have it with me now) and I shred all my bank statements, bills and till receipts (yup), how come I’ve still been defrauded?

I use my card online a lot. I don’t gamble online, buy porn or dodgy pills, email my card details around or send them to nice gentlemen in Nigeria, but I do buy stuff from a range of shops, big and small.

So my best guess is that my card has been taken from a merchant. What could I do to stop this happening?

Two options:

1) Never spend money online. Very limiting, and not going to happen. Even if I were willing to live with the inconvenience, it doesn’t give 100% protection anyway: my card could still be stolen if I use it at a bricks-and-mortar store (e.g. anyone who shopped at a store in the TJX group had their card placed at risk after card details were stolen). I’m certainly not going to stop using my card entirely.

2) Only ever spend money with the biggest online shops: ones that are big enough to have their own security teams, do code audits, etc. Stick with amazon.co.uk and tesco.com. Not foolproof, but a reasonable reduction in risk. The problem is that a lot of the stuff I want to buy online is only available from smaller shops. Worse, it’s often only available from mid-sized retailers: ones that are too big to just use PayPal, big enough to have their own in-house ASP or PHP developers, but not big enough to do it right.

You might think I’ve missed an option there: ‘3) Only buy from trusted retailers’. The trouble is that as a consumer, even one much more knowledgeable about security than most, there is no way I can make any valid judgement about a retailer’s security, or lack thereof. I don’t have access to any information that would let me evaluate a retailer’s security, and without that information being available, there’s also no competitive pressure on stores. Instead we have to rely on the banking groups dragging standards upwards via things like the PCI DSS standards. These are good, but it’s a long, slow grind.

Conclusions? My card details have been stolen, it’s quite possible it’ll happen again, and there’s nothing I can do about it short of never using my card. Worse, because online crime is now a low priority for UK police, I don’t even get to report this to the police, only to my bank, and I can be pretty confident that no-one will ever be charged for this (they weren’t last time, even though I did report that incident to the police, as it predated the new reporting arrangements).

This is not a happy state of affairs. If a distributed system is one where the failure of a machine you didn’t even know existed can break what you are doing, then this is the security equivalent: being compromised by systems you don’t know about and can’t influence.

Arthur

Phishers branch out in their targeting

Tuesday, July 8th, 2008

Phishers have been branching out recently, moving away from the traditional bank account scam towards new targets. As users become more aware, and as more banks roll out two-factor authentication and other mitigations, scammers are having to move on to softer targets.

In the past few months we’ve seen two new targets, with different motivations. Both illustrate how attacks shift as traditional targets become hardened.

First, many UK universities have been hit with targeted phishing scams, usually claiming to come from “IT Support”. Any compromised accounts are then used to send out more spam. It’s a nice example of accounts being useful not so much for the information in them, but for the access they provide to other resources: bandwidth and credible email addresses.

Second, as mentioned by Dancho Danchev in May on ZDNet and in June on his blog, job sites are coming under attack. Dancho posted about the sale of tools that scrape information from CVs posted to online sites. Now we are seeing more direct attacks, with phishing emails aimed at getting the login details of users of Monster.com and other job sites. Clearly, gaining access to the information held on a job site is very useful to a scammer: it makes all sorts of nastiness easier.

It’s an arms race out there. Banks are now very quick at taking down phishing sites (see the recent blog post from Ross Anderson’s group at Cambridge, with links to takedown statistics), but other types of scams currently last much longer. If you’re one of the bad guys, it makes sense to go for the low-hanging fruit. Why bother to steal someone’s online banking details when you can get more money for less work by stealing their identity? And why go to a lot of effort to get their details when they have helpfully posted them on the web for you, all ready to use?

Arthur

Global Browser Vulnerability Survey

Friday, July 4th, 2008

A lot of computer security threat research activity today occurs in the client space, with honeyclients such as Capture-HPC and PhoneyC regularly being used to study attacks against web browsers. These attacks often arrive as malicious obfuscated JavaScript exploiting vulnerable plugins or media extensions to achieve fully automated ‘drive-by download’ infections. The Honeynet Project have published a number of Know Your Enemy whitepapers in this area over the past year, and continue to research it actively. We have also previously blogged about some of the ideas the UK Honeynet Project have been experimenting with in this area.

One of the biggest challenges with client-based threats is assessing the real-world scale of the potential problem. For traditional server-based threats, it was fairly simple to survey the entire IPv4 space and determine what versions of a particular application or operating system were in active use at a particular time. For client threats, however, you need a client application to come to you and interact with a service before any assessment of potential client vulnerabilities can be performed. This is a significant challenge for both attackers and researchers (hence the continued use of indiscriminate spamming and malicious advert serving at the same time as more targeted attacks are being developed).

As the operator of the world’s most popular search engine, Google records the user-agent version data from the billions of web searches made by an estimated 75% of Internet users, and is therefore one of the organisations most likely to be able to assess the current state of web browser security (Microsoft’s MSRT also has excellent data, but only for the ~450 million users regularly running Windows Automatic Updates). However, for obvious privacy reasons, this data has not been made available to the public.

An interesting survey was released yesterday by Google Switzerland, IBM ISS and the Computer Engineering and Networks Laboratory at ETH Zurich, which provides the first systematic study of the browser data from around 1.4 billion Google users during the first half of 2008. The authors analysed Google’s client version data and correlated it with vulnerability data from sources such as Secunia’s PSI, in an attempt to assess how many vulnerable browsers were in circulation at any particular time.

The results are very interesting, with Internet Explorer taking 78% (1.1 billion) of the browser share and Firefox taking 16% (227 million). Drilling down deeper into the IE market share shows that roughly half of IE users have now moved to IE7, whilst most Firefox users run the latest release. More worryingly, less than 50% of IE users had the most secure version of their browser (rising to 83% for Firefox). For the month of June 2008, the authors suggest that over 45% of web surfers (some 637 million people) accessed Google with a browser containing unpatched security vulnerabilities. There is also some interesting analysis of the exposure to plug-in as well as built-in vulnerabilities, plus some good recommendations for potential improvements to web browser security. In particular, the concept of web sites checking a browser’s user-agent string and displaying a highly visible “expiry date” warning on every page (in an attempt to enforce a maximum shelf life) is worth further investigation.

The very welcome paper is definitely worth a read, but is unlikely to cause much immediate worry to the cyber criminals who are actively targeting web users through the thousands of mass-compromised web servers, phishing emails and instant message spam we encounter each day.

FIRST 2008

Tuesday, July 1st, 2008

The Honeynet Project were asked to present at the 20th FIRST conference in Vancouver last week, as part of its Network Monitoring Special Interest Group, on Fast Flux Service Networks. We set up a two-hour session broken down into three equal sections:

  1. An introduction to the basic mechanics of fast flux (David Watson, UKHP)
  2. Current ATLAS fast flux statistics (Jose Nazario, Arbor)
  3. Detection and mitigation (Christian Gorecki, University of Mannheim)

The NM-SG session was open to FIRST members only, so the slides are not publicly available, but we hope to have a public release of similar material shortly. We had a number of questions, and feedback from the attendees seems to have been positive.

There were three additional short demos:

  1. Florian Weimer of RUS-CERT showed some new passive DNS tracking information
  2. Tillmann Werner from the German Giraffe Honeynet Project Chapter demonstrated how Honeytrap, LibEmu and Nebula can be used to analyze unknown attacks; this is looking very promising as a long-term replacement for Nepenthes
  3. Piotr Kijewski of the Polish CERT/NASK gave a brief demonstration of their still-in-development HoneySpider web interface, which shares many features with the client honeypot systems we are currently working on, but uses Java and Rhino instead of Python and SpiderMonkey

Overall it was an interesting event, with some good talks and lots of opportunities to meet up with a different group of people who are very active in the security operations and incident response fields. Quite a few Honeynet Project members were also present, which always encourages a little extra R&D discussion. Hopefully we’ll see some spin-off activity in the coming weeks.

Many thanks to Carol Overes from GovCERT in Holland for the invite.