Category Archives: General

Security Mindset

User big brother 1984 (Photo credit: Wikipedia)

I’m a big fan of Derren Brown – perhaps not so much of his actual performance stuff, but rather his later work on psychology and human manipulation. I’ve not seen all of his programmes, although I plan on looking some up now that I’ve found they exist through the Wikipedia link above, but I did just finish watching the “Fear & Faith” pair that I recorded a few weeks back from Channel 4 in the UK. There was one particular point that he made that was of interest to me:

People behave better when they have the impression that they are being watched.

Now, after an earlier discussion about AUPs on Forensic Focus, where I wrote a simple draft AUP, I realise that this is what I left out: there is neither mention of consequences nor mention of monitoring – an oversight which I acknowledge leaves the policy toothless. In my defence though, that wasn’t the point that I was trying to make at the time !

The research study by Max Ernest-Jones, Daniel Nettle and Melissa Bateson at Newcastle University on “Effects of eye images on everyday cooperative behaviour: a field experiment” further builds on previous research by Terrence Burnham and Brian Hare ( here ) showing that even computer generated “eyes” watching will influence behaviour.

I recall, from my first ( and last ! ) permanent role, a Government-issued poster hanging in a building very reminiscent of Chernobyl ( unsurprisingly really, as it was Hangar 4 at Harwell, home of GLEEP ). We kept our backup tapes in a room which used to house a Cray – I’d be lying if I said I knew which one, it was long gone by the time I arrived, but I do know that it was one with integral seating … – and it had all of the security that you’d expect of a data centre on a nuclear site – man-trap doors, security office, etc. – and some of these posters. I wish now that I’d “redistributed” them before we left and the building was pulled down, but I was young and foolish, and had no idea that I’d be writing this blog now … The one that sticks in my mind was rather creepy, hanging as it was between the two doors of the man-trap – bored people had messed with it, picking out the eyes with pins and giving the poster a very unnatural stare. I don’t know if I behaved any better for it; all I had to do was collect and drop off tapes, and the room was cold, empty and unfriendly, so I didn’t hang around long enough to misbehave. I’ve tried my best to find a copy of it online, but with no success. I did get these though:

[ Images: security posters from 1960 and 1962 ]

This first one ( Don’t Brag ) is from 1960 ( I’m told ). And the second from 1962 ( again, I’m told ).

Both are notable for their lack of eyes, as, oddly, are many – if not all – of the ones that I could find currently in circulation.

CESG

I rather like these Welsh ones by Rebecca Lloyd – as she says herself, inspired by the very popular iPod adverts.

[ Images: three Welsh security posters by Rebecca Lloyd ]

Quite entertainingly, the most intimidating poster by far – and the one with the most eyes, with massive reference to 1984 and a horrendous secret state – is this one from Transport for London. Nothing to do with InfoSec per se, but with the general CCTV surveillance of society.

[ Image: Transport for London CCTV poster ]

That’s the sort of thing that nightmares are made of ! On the other hand, if that was stuck before me on a bus, I might well not misbehave – which is a win on the part of the designer !

So there are two things that we should consider then. First off, my oversight on the AUP with regard to consequences and monitoring should be resolved by the addition of something like:

We like to be sure that nothing untoward is happening on the machines which are our responsibility, so we do monitor them for things that we have said we don't like. If, once you have signed this document to signify your understanding, you choose to break the agreement you've made, we will have to take disciplinary action; depending on the seriousness of the breach, this could include losing your job.

Secondly, as ongoing awareness of Information Security is a requirement of pretty much every set of best practice guidelines ( and if it isn’t, it should be ! ), perhaps we should make use of strategically placed posters with eyes in order to get our point across with the maximum uptake ? How about the following:

[ Images: three example awareness posters ]

I know that for two out of three they aren’t exactly “watching” eyes, but a line needs to be drawn on how much one intimidates one’s employees !

I leave you with a Seasonal Poster – courtesy of the US Archives ( which are fabulous by the way, can we have a UK one of these please ? ) You’ll need to view it full size to see what the “security” message is.

[ Image: US Christmas InfoSec poster ]

 [Actually, you know what, if people send me UK posters, I’ll make an online collection available to everyone myself … ]


Off-topic: The Zombie Apocalypse

The world ended on Friday evening. No, the Mayans didn’t get the date wrong ( that’s still to come at time of writing ); rather, as a birthday present, my brother-in-law treated me to a preemptive “end-of-days” courtesy of the amazing ZED Events.

Without going into too much detail – the organising company likes to keep the storyline and the actual event under wraps to maintain the surprise/fear factor for new clients/victims – if you are a fan of Zombies and Zombie flicks, you are cast as the hero/heroine in your very own version of “28 Days Later” or “The Walking Dead” ( although in our case “Death Valley” might have been marginally closer to our overall competence levels ! ). Over a four-hour period we fought our way through a three-level shopping centre ( “mall” for both correct web searching – ZED Events call it a “mall” – and for our American readers ).

In the first couple of hours you are guided, both in the scenario and in path finding, by a couple of armed response Police whose mission was to secure the shopping centre ( job well done guys … not ) – you learn a good deal of what has been happening, and get basic training in your weapons ( pump-action airsoft shotgun ) and tactics ( good, honest, Monty Python – “RUN AWAY !” ). For the second two hours you are more or less on your own, with mission objectives not at all dissimilar to those you might imagine from video games ( collect these things, drop them there, find so-and-so, etc. ), culminating in a heart-pounding finale.

I had a blast. It’s scary, not so much in the blood, guts and gore – although they were excellent, decomposing, oozing and generally disgusting – but far more in the “Oh s**t !” – they are too fast, too many and know their way around way too well in comparison to the living. Running, hiding and frantically wondering if you have enough ammunition left are all part of the game – the objective being survival, rather than body count.

At the end of the day, we were allowed to meet our nemesis in the safe confines of the coffee room for photos – although they remained in character ( if not actively trying to chew our necks ) throughout the “after party”.

Herewith please find “Bobo” and victim: ( Bobo is the one on the right … )

I know that the shopping mall in Reading is slated for demolition, but at the moment there is no firm date for that. However, ZED Events run a similar scheme further north in a manor house, with considerably more land to run around in ( not that I couldn’t get lost in a three-storey shopping mall as it was ! ).

I can’t recommend it enough – go with friends, it’s great. You don’t need a great deal of fitness – cardio isn’t my thing in the gym – although I’d suggest that those with heart conditions avoid it !

 


How to automate Twitter and make a bit of a tit of yourself at the same time …

Free twitter badge (Photo credit: Wikipedia)

Oh dear, oh dear – post in haste, repent at leisure ! ( If you don’t know what I’m talking about – see here. ) I’m glad to say that I recently read a book1 on business that suggested that an Agile approach ( release early, release often, and fix your bugs as you go along ) was definitely the way towards successful business, so I’m going to imply that I did it on purpose.

So what’s gone wrong ? I can see that the script ( twitter.py -r ) is running fine from cron ( /var/log/cron shows it running every five minutes ) – and I know that if I run it from the command line within 5 minutes of creating the schedule it works, which implies that the logic in the program ( however badly written ) is at least OK … So where is the issue occurring ? I thought initially that it was a path problem – my fault so far is that I’ve not made any effort to capture any errors. Ok, so I’ll give that a go … Great, nada being reported by cron. That’s not helpful.
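
For reference, “capturing the errors” just means redirecting the script’s output in the crontab entry itself – something along these lines, with the paths purely illustrative ( adjust to wherever the script actually lives ):

*/5 * * * * cd /home/simon/twitter && ./twitter.py -r >> /home/simon/twitter/retweet.log 2>&1

Anything the script prints – stderr included – then ends up in the log rather than disappearing.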

Ah hah ! Got an error at last.

tweepy.error.TweepError: Status is a duplicate.

Whilst I can’t find any specific references to the error, it seems quite self-explanatory: you can’t keep re-tweeting the same message – it needs to differ. That explains why the HootSuite interface was such a pain in the neck, as it offloads this problem onto the user populating their CSV file. I guess that the outstanding question then is “How much does a Tweet need to differ by in order _not_ to be considered a duplicate ?” In principle a single character should do, so for my 24 scheduled tweets I need 24 unique additions to the text. The simplest way is to either count up or count down – that hopefully gives sufficient change to make each tweet different, as well as indicating easily to me, if not to the casual observer, how far through the re-Tweet lifecycle it currently is.
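
While I’m at it, the -r run probably shouldn’t fall over completely on that exception either. I haven’t folded this into the listing below yet, but a rough sketch of catching it would be something like:

import sys
import tweepy

def safe_update(api, tweet):
   # Try to post; if Twitter rejects it (e.g. "Status is a duplicate."), log it and carry on
   try:
      api.update_status(tweet)
   except tweepy.error.TweepError as err:
      print >> sys.stderr, "Tweet failed: %s" % err

That way one rejected tweet doesn’t kill the rest of the cron run.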

The code now reads as follows:

#!/usr/bin/env python
##########################
# Python Auto Re-Tweeter #
# (C) Simon Biles 2012   #
# http://www.biles.net   #
##########################
# Version 0.01 -         #
# A first stab at it !   #
##########################
# Version 0.02 -         #
# A working version !    #
##########################

# All those tasty Python imports
import argparse
import datetime
import struct
import sys
import tweepy

from ConfigParser import SafeConfigParser

# Get the command line arguments
parser = argparse.ArgumentParser(description='Regular Tweet Generator.')
parser.add_argument('-s','--schedule', action='store_true', help='Schedule a Tweet for the next 7 days')
parser.add_argument('-r', '--run', action='store_true', help='Run the schedule')
parser.add_argument('-u','--update', action='store_true', help='Update Status Tweet immediately')
parser.add_argument('tweet', nargs='?')
args = parser.parse_args();

# Global variable
time_fmt = "%Y-%m-%d %H:%M"

# Get the config file data
parser = SafeConfigParser()
parser.read('twitter.conf')
CONSUMER_KEY = parser.get('consumer_keys','CONSUMER_KEY')
CONSUMER_SECRET = parser.get('consumer_keys','CONSUMER_SECRET')
ACCESS_KEY = parser.get('access_keys','ACCESS_KEY')
ACCESS_SECRET = parser.get('access_keys','ACCESS_SECRET')
FILE_NAME = parser.get('file_name', 'SCHEDULE_FILE')

# Main body

# Quick Command Line Update
if args.run == False and args.schedule == False and args.update == True:
   auth = tweepy.OAuthHandler(CONSUMER_KEY, CONSUMER_SECRET)
   auth.set_access_token(ACCESS_KEY, ACCESS_SECRET)
   api = tweepy.API(auth)
   api.update_status(sys.argv[1])
   sys.exit()
# Schedule a Tweet by adding it to the schedule file
elif args.run == False and args.schedule == True and args.update == False:
   file_obj = open(FILE_NAME, 'a')
   current = datetime.datetime.now()
   nexttweet = current;
   count = 0
   while (count < 24):
      diff = datetime.timedelta(hours=count)
      nexttweet = nexttweet + diff
      tweettime = nexttweet.strftime(time_fmt) + " " + str(count+1) + "/24 " + args.tweet +"\n"
      file_obj.write(tweettime)
      count = count + 1
   file_obj.close()
   sys.exit() 
# Parse the schedule file and see if anything should have happened within 5 minutes of now.
elif args.run == True and args.schedule == False and args.update == False:
   file_obj = open(FILE_NAME, 'r')
   auth = tweepy.OAuthHandler(CONSUMER_KEY, CONSUMER_SECRET)
   auth.set_access_token(ACCESS_KEY, ACCESS_SECRET)
   api = tweepy.API(auth)
   current = datetime.datetime.now()
   baseformat = "16s 1x"
   for line in file_obj:
      line = line.rstrip('\n')
      numremain = len(line) - struct.calcsize(baseformat)
      lformat = "%s %ds" % (baseformat, numremain)
      tweettime, tweet = struct.unpack(lformat, line)
      linetime = datetime.datetime.strptime(tweettime, time_fmt)
      delta = linetime - current
      # Tweet anything scheduled within five minutes either side of now
      if delta <= datetime.timedelta(minutes=5) and delta >= datetime.timedelta(minutes=-5):
         api.update_status(tweet)
   file_obj.close()
   sys.exit()

So there you have it, a working version ! I’ve watched 10 of the 24 Tweets fly by over the weekend, and the other 14 will play out over the next week and a bit – I must admit, though, that it is a bit front-loaded at the moment, and behaves a little too “spamily” for my liking. I think that before I unleash it again I might start it off at six-hour intervals and let it grow from there over 18 Tweets. I’m also thinking about how to track its success, and I have an idea – but more of that later !

 


1. The book in question was ReWork: Change the Way You Work Forever, which I rather enjoyed – it was short and to the point. I don’t think it is necessarily a “how-to” guide, but it did get me thinking about a few things and gave me some inspiration to go out on the web and make a tit of myself like this ;-)


How to Automate Twitter – continued …

Python logo (Photo credit: Wikipedia)

Pre-warning – I wrote this pretty late at night for me, and it doesn’t actually work at the moment – consider this an Agile release process …

Ok, so here is version 0.01 of the automated Twitter utility ( I will be using it to re-publicise this blog entry, along with a couple of others – so this could either be a brilliant advert or a dire warning!1 ). I’ve changed the frequency criteria somewhat from the original: there are now 24 re-tweets, with an hour’s increase in delay between each ( e.g. tweet, then 1 hour, then 2 hours, then 3 hours, up to and including 24 hours – overall this spreads the 24 tweets out over 11 days or so ) – I’ll give that a go, and maybe experiment from there. I’ve also included a schedule file reference in the configuration file so that it is easier to change should it be necessary. The overall lack of semi-colons and brackets has given me a nervous twitch, but other than that, given that I’m pretty new to Python, it was – all in all – not a terrible experience !

#!/usr/bin/env python

##########################
# Python Auto Re-Tweeter #
# (C) Simon Biles 2012   #
# http://www.biles.net   #
##########################
# Version 0.01 -         #
# A first stab at it !   #
##########################

# All those tasty Python imports

import argparse
import datetime
import struct
import sys
import tweepy
from ConfigParser import SafeConfigParser

# Get the command line arguments

parser = argparse.ArgumentParser(description='Regular Tweet Generator.')
parser.add_argument('-s','--schedule', action='store_true', help='Schedule a Tweet for the next 7 days')
parser.add_argument('-r', '--run', action='store_true', help='Run the schedule')
parser.add_argument('-u','--update', action='store_true', help='Update Status Tweet immediately')
parser.add_argument('tweet', nargs='?')
args = parser.parse_args();

# Global variable

time_fmt = "%Y-%m-%d %H:%M"

# Get the config file data

parser = SafeConfigParser()
parser.read('twitter.conf')
CONSUMER_KEY = parser.get('consumer_keys','CONSUMER_KEY')
CONSUMER_SECRET = parser.get('consumer_keys','CONSUMER_SECRET')
ACCESS_KEY = parser.get('access_keys','ACCESS_KEY')
ACCESS_SECRET = parser.get('access_keys','ACCESS_SECRET')
FILE_NAME = parser.get('file_name', 'SCHEDULE_FILE')

# Main body

# Quick Command Line Status Update

if args.run == False and args.schedule == False and args.update == True:
   auth = tweepy.OAuthHandler(CONSUMER_KEY, CONSUMER_SECRET)
   auth.set_access_token(ACCESS_KEY, ACCESS_SECRET)
   api = tweepy.API(auth)
   api.update_status(sys.argv[1])
   sys.exit()

# Schedule a Tweet by adding it to the schedule file

elif args.run == False and args.schedule == True and args.update == False:
   file_obj = open(FILE_NAME, 'a')
   current = datetime.datetime.now()
   nexttweet = current;
   count = 0
   while (count < 24):
      diff = datetime.timedelta(hours=count)
      nexttweet = nexttweet + diff
      tweettime = nexttweet.strftime(time_fmt) + " " + args.tweet +"\n"
      file_obj.write(tweettime)
      count = count + 1
   file_obj.close()
   sys.exit()

# Parse the schedule file and see if anything should have happened within 5 minutes of now.

elif args.run == True and args.schedule == False and args.update == False:
   file_obj = open(FILE_NAME, 'r')
   auth = tweepy.OAuthHandler(CONSUMER_KEY, CONSUMER_SECRET)
   auth.set_access_token(ACCESS_KEY, ACCESS_SECRET)
   api = tweepy.API(auth)
   current = datetime.datetime.now()
   baseformat = "16s 1x"
   for line in file_obj:
      line = line.rstrip('\n')
      numremain = len(line) - struct.calcsize(baseformat)
      lformat = "%s %ds" % (baseformat, numremain)
      tweettime, tweet = struct.unpack(lformat, line)
      linetime = datetime.datetime.strptime(tweettime, time_fmt)
      delta = linetime - current
      if delta <= datetime.timedelta(minutes=5) and delta >= datetime.timedelta(minutes=-5):
         api.update_status(tweet)
   file_obj.close()
   sys.exit()

I think that it works ok, I’m going to carry out testing in live – like all good practice guides suggest you shouldn’t ! I’m going to set it to run with the -r command line switch from cron every five minutes.

There are one or two features that I think that I should look into in the near-ish future:

  • Cleaning up the schedule file – obviously it is only going to get longer and longer, and thus the program will consume more and more resources as it tries to parse it ( a rough sketch of what I mean follows this list ).
  • I’d like to automate the script so that it monitors my Twitter account for updates from WordPress, and then adds those to the schedule immediately – I’m lazy you see …
  • I’m sure that there are a few better ways of doing things in there, and also there could be a little more in the way of commentary and instruction ( there are also, frankly, one or two bits that work, but that I don’t understand how ! )
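
For the first of those, the pruning I have in mind is pretty simple – rewrite the schedule file keeping only the lines whose timestamp is still in the future. An untested sketch rather than the finished article ( it assumes the same 16-character timestamp and time_fmt as the script above ):

import datetime

time_fmt = "%Y-%m-%d %H:%M"

def prune_schedule(file_name):
   # Drop any schedule entries whose time has already passed
   now = datetime.datetime.now()
   with open(file_name, 'r') as f:
      lines = f.readlines()
   keep = [l for l in lines if datetime.datetime.strptime(l[:16], time_fmt) >= now]
   with open(file_name, 'w') as f:
      f.writelines(keep)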

Ah well, onward and upward eh ;-)


1. Ok, it’s a dire warning … It worked on the command line, honest !

UPDATE: If you decide to go live and test there, you then spend time hurriedly chasing down the bugs in your code which you just posted on your blog so as not to look like a complete berk … You have been warned !

UPDATE 2: Hmmm … Not _actually_ working, that’s a bit lousy … Ah, wait. I think I know what the problem is ! You need to specify the full path to the file in the config as cron doesn’t run in the same path ! Right take 3 !

UPDATE 3: Ok, that didn’t work … Back to the drawing board. Unfortunately I don’t have time now :-( so I’ll have to come back in another post later … Arrrrggghhh !
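
When I do get back to it, the tidier fix is probably not hard-coded paths at all, but anchoring everything to the script’s own location so that it doesn’t matter which directory cron starts in – a sketch, assuming the same SafeConfigParser setup as the listing above:

import os
from ConfigParser import SafeConfigParser

# Read twitter.conf from the directory this script lives in,
# rather than from whatever working directory cron happens to use
script_dir = os.path.dirname(os.path.abspath(__file__))
parser = SafeConfigParser()
parser.read(os.path.join(script_dir, 'twitter.conf'))

The same trick would apply to the schedule file path held in the config.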


NWrap Version 0.05 – NMap Wrapper with OPRP Database Dump

 

sshnuke hack in Matrix II 03 (Photo credit: guccio@文房具社)

 

Just a quick post: a few years ago ( in 2004 ! ) I wrote a Perl wrapper for NMap on behalf of ISECOM that incorporates the data from the Open Protocol Resource Database (OPRP). It was featured in Professional Pen Testing for Web Applications by Andres Andreu, which was nice. However, it hasn’t been updated since then, and the ISECOM page has some issues with the OPRP download. I just thought I would (a) check that it still works and (b) bring it up to date if it doesn’t … Here, first, a quick example of it running ( first without, and then with NWrap ):

 

[root@perl ~]# nmap localhost
Starting Nmap 5.50 ( http://nmap.org ) at 2012-07-17 17:50 UTC
Nmap scan report for localhost (127.0.0.1)
Host is up (0.000018s latency).
Not shown: 999 closed ports
PORT STATE SERVICE
22/tcp open ssh
Nmap done: 1 IP address (1 host up) scanned in 0.11 seconds
[root@perl ~]# ./nwrap.pl localhost
#########################################
# nwrap.pl - Nmap and OPRP combined !   #
# (C) Simon Biles TS Ltd. '04           #
# http://www.isecom.org                 #
# http://www.thinking-security.co.uk    #
#########################################

Starting Nmap 5.50 ( http://nmap.org ) at 2012-07-17 17:50 UTC
Nmap scan report for localhost (127.0.0.1)
Host is up (0.000018s latency).
Not shown: 999 closed ports
PORT STATE SERVICE
22/tcp : open 
 - Adore worm 
 - SSH 
 - Shaft DDoS
Nmap done: 1 IP address (1 host up) scanned in 0.10 seconds
[root@perl ~]#

 

Now for the code:

 

#! /usr/bin/perl
# Nmap wrapper for OPRP data.
# (C) Simon Biles
# http://www.isecom.org
# Version 0.05
$version = "0.05";
# History - 0.01 Working version.
# 0.02 Changed use of ``s for output to opening a pipe.
# 0.03 Use the OPRP database dump directly, not through
# pre-parsed file
# 0.04 Included output switches and file writing stuff
# 0.05 Updated for CSO to TS name change and checked working (2012)
# OPRP Dump file has changed to HTML, converted to CSV and
# rewrote parser to work with CSV.

# Read in from the OPRP data file created earlier
# and fill in an internal table.
# Give us a little credit :) and show that it is running ...
print "\n#########################################\n";
print "# nwrap.pl - Nmap and OPRP combined ! #\n"; 
print "# (C) Simon Biles TS Ltd. '04 #\n";
print "# http://www.isecom.org #\n";
print "# http://www.thinking-security.co.uk #\n";
print "#########################################\n\n";
%services=();
open (DATA, "< oprp_services_dump.csv");
# New CSV parser code
while (<DATA>){
# Split the data at comma separations
 ($port_no,$port_type,$name,$reference) = split(/,/, $_);
if ($port_type =~ /^UDP/){
 $port_prot = $port_no."/udp";
 push( @{$services{$port_prot}},$name);
 }
 elsif ($port_type =~ /^BOTH/){
 $port_prot = $port_no."/tcp";
 push( @{$services{$port_prot}},$name);
 $port_prot = $port_no."/udp";
 push( @{$services{$port_prot}},$name);
 }
 elsif ($port_type =~ /^TCP$/){
 $port_prot = $port_no."/tcp";
 push( @{$services{$port_prot}},$name);
 }
 elsif ($port_type =~ ""){
 $port_prot = $port_no."/unknown";
 push( @{$services{$port_prot}},$name);
 }
}

# Just to keep things tidy !
close DATA;
# There are some output to file arguments that I hadn't thought about !
# Check for them here and set up some variables ...
# They then are pulled from the arguments so that we can do the output ...
# If more than one output option is specified ( which I'm not sure is legal anyway )
# the final switch will take priority
for($i = 0;$i < @ARGV;$i++){
 if (@ARGV[$i] =~ m/-o/){
 if (@ARGV[$i] =~ m/-oN/){$out_normal = 1; $out_xml = 0; $out_grep = 0; $arguments = $arguments." -oN - "; $i++; $filename = @ARGV[$i];}
 if (@ARGV[$i] =~ m/-oX/){$out_xml = 1; $out_normal = 0; $out_grep = 0; $arguments = $arguments." -oX - "; $i++; $filename = @ARGV[$i];}
 if (@ARGV[$i] =~ m/-oG/){$out_grep = 1; $out_xml = 0; $out_normal = 0; $arguments = $arguments." -oG - "; $i++; $filename = @ARGV[$i];}
 } else {
 $arguments = $arguments.@ARGV[$i];
 }
}
# O.k. ... So if there is a file specified, we had better open it to write to ...
if ($out_normal == 1 || $out_xml == 1 || $out_grep == 1){
 open(OUT,"> $filename") or die "Can't open $filename to write to ! $! \n";
}
# Run nmap with the provided command line args.
# doing it this way rather than with backticks, means that the output is "live"
open(NMAP, "nmap $arguments |") or die "Can't run nmap: $!\n";
# If necessary warn the user that they shouldn't expect to see any output ...
if ($out_xml == 1){
 print "!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!\n";
 print "! Sorry. The XML output option only !\n";
 print "! ouputs to the filename specified !\n";
 print "! not to the screen. !\n";
 print "!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!\n";
}
if ($out_grep == 1){
 print "!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!\n";
 print "! Sorry. The Grep output option only !\n";
 print "! ouputs to the filename specified !\n";
 print "! not to the screen. !\n";
 print "!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!\n";
}
# Modify the output as required.
while(<NMAP>){
 if ($out_normal == 0 && $out_xml == 0 && $out_grep == 0){
 if ($_ =~ m/(^\d+\/)(tcp|udp)/){
 ($port,$state,$service)= split (/\s+/, $_);
 print "$port : $state \n";
 foreach $service ( sort @{$services{$port}}){
 print " - $service \n";
 }
 } else {
 print $_;
 }
 } elsif ( $out_normal == 1 && $out_xml == 0 && $out_grep == 0){
 if ($_ =~ m/(^\d+\/)(tcp|udp)/){
 ($port,$state,$service)= split (/\s+/, $_);
 print "$port : $state \n";
 foreach $service ( sort @{$services{$port}}){
 print " - $service \n";
 }
 print OUT "$port : $state \n";
 foreach $service ( sort @{$services{$port}}){
 print OUT " - $service \n";
 }
 } else {
 print $_;
 print OUT $_;
 }
 } elsif ( $out_xml == 1 && $out_normal == 0 && $out_grep == 0){
if ($_ =~ /port /){
 $_ =~ s/[<>\/]/ /g;
 $_ =~ s/\"//g;
 (@array) = split (" ",$_);
 foreach (@array){
if ($_ =~ m/portid/){
 ($a, $port) = split ("=",$_);
 }
 if ($_ =~ m/state/){
 ($a,$state) = split ("=",$_);
 }
 if ($_ =~ m/protocol/){
 ($a,$protocol) = split ("=",$_);
 }
 if ($_ =~ m/conf/){
 ($a,$conf) = split ("=",$_);
 }
 if ($_ =~ m/method/){
 ($a,$meth) = split ("=",$_);
 }
 }
 $port_prot = $port."/".$protocol;
 foreach $service ( sort @{$services{$port_prot}}){
 print OUT "\\n";
 }
 } else {
 print OUT $_;
 }
 } elsif ( $out_grep == 1 && $out_normal == 0 && $out_xml == 0){
# This is all one bloody long line, so this should be fun ...
# Send the comments straight through ...
 if ( $_ =~ /^\#/ ){
 print OUT $_;
 } else {
 @array = split(",",$_);
 for($i=0;$i < @array; $i++){
 if(@array[$i] =~ /Host:/){
 ($a,$host_ip,$host_name,$b,$remainder)= split(" ",@array[$i]); 
 @array[$i] = $remainder;
 }
 if(@array[$i] =~ /Ignored/){
 ($port_data,@therest)= split(" ",@array[$i]);
 @array[$i] = $port_data;
 }
 }
 print OUT "$a $host_ip $host_name $b ";
 foreach (@array){
 $_ =~ s/\// /g;
 $_ =~ s/\,//g;
 $_ =~ s/\s+/:/g;
 ($nada,$port,$state,$protocol,$name) = split(":",$_);
 $port_prot = $port."/".$protocol;
 foreach $service ( sort @{$services{$port_prot}}){
 print OUT "$port/$state/$protocol//$service///,";
 } 
 }
 print OUT " ".join(" ",@therest)."\n";
 }
 }
}
# Tidy up the open files ... if they exist ...
if ($out_normal == 1 || $out_xml == 1 || $out_grep == 1){
 close OUT;
}
# That's it really !

 

In order to make it work you’ll need to download the CSV file of the OPRP database here.
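
For reference, the parser above expects four comma-separated fields per row – port number, protocol ( TCP / UDP / BOTH ), service name and a reference – so the rows look something like this ( values purely illustrative ):

22,TCP,SSH,reference_here
53,BOTH,DNS,reference_here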

 

Incidentally, if you are interested in Port Scanning and Penetration Testing and the like, you might find this series on Forensic Focus interesting.

 


How to Automate Twitter – a bit at least !

Perl (Photo credit: Wikipedia)

I’ve been trying to push up the readership of the blog here ( and get some people to stick around a bit, subscribe, follow on Twitter, etc. ). I’m not a Facebooker – I do have an account ( or two … ) but they contain nothing much of interest; they were created in order to investigate how FB worked rather than anything else, so I’m not exactly the stereotypical user ! I make use of LinkedIn and Twitter as my online “social” tools and I’ve not graduated beyond that. The trouble is, I believe, the transient nature of Twitter – I Tweet and it disappears off the bottom of the screen in seconds as others’ posts come in and push it down. I’ve watched for a while, and it seems that the “successful” Tweeters post their links frequently – keeping them in view for a longer period of time.

Now, I have to admit that I am lazy, but also geeky – I want to post a tweet advertising the blog frequently, but without user interaction. I’m sure that people will pop up and tell me of things that automagically do this for me – HootSuite springs to mind – but having used it, it has already upset me with its scheduling system: the CSV upload is a pain and, as of yet, I’ve not managed a single one without an error. Sooo, as I spent a while messing around with Twitter and Perl some time ago, I thought that the easiest way forward might just be to write my own.

For want of a better methodology, as I intend to post once a week, I want each entry tweeted about immediately, and then at increasing intervals until the next post is due out ( 1 week hence ). I don’t want mid-week posts to reset the last week’s worth, but if one is relevant ( like I hope all posts are ! ) then I do want to publicise it for a full week as well. I’ve tried a couple of exponential increases ( 2 × last period, 1.5 × last period ), but to be honest, as I’m sure you can imagine, the interval gets up to over a day fairly quickly … ( Google “exponential” if you want to know more ! ) So I’m going to say day one is once every two hours, day two is once every three hours, day three once every four hours, day four is four times in the day, day five is three times, day six is twice, and just the once on the seventh day – heck, if God can take a rest, so can our program ! That gives us a total of 36 Tweets, weighted towards the start whilst the post is fresh and tailing off as the new post comes along.
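
Jumping ahead slightly ( the change of language is explained below ), a quick sketch of that spacing, mostly to sanity-check my own arithmetic:

import datetime

# Tweets per day over the week: front-loaded, tailing off towards the next post
per_day = [12, 8, 6, 4, 3, 2, 1]   # 36 in total

def schedule(start):
   times = []
   for day, count in enumerate(per_day):
      gap = datetime.timedelta(hours=24.0 / count)
      for i in range(count):
         times.append(start + datetime.timedelta(days=day) + i * gap)
   return times

Twelve tweets on day one ( every two hours ), eight on day two, six on day three, and so on down to a single tweet on day seven – 36 timestamps in all.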

As always stated with my programming posts: I’m not a programmer, and any similarity to programmers living or dead is entirely coincidental. I like programming in Perl because not only is “there more than one way to do it”, I can usually figure out at least one of those particular permutations – however inelegant my solution may be … [ if you want to see elegant programming – and the output of the man that I go to when I get stuck – have a look over here. Shamefully he wastes his time in the world of Microsoft, but we forgive him a lot ;-) ]

It turns out, much to my annoyance, that the authentication method that I was using in “Hacking around with Twitter” is no longer valid. It seems that I now need to use OAuth1 … However, after several hours of buggering around with it I failed completely to get it to work. So back to the drawing board there …

Python anybody ?

Python logo (Photo credit: Wikipedia)

I’ve been meaning to get cracking with Python for some time. I was a die-hard Perl fan until the day I saw the graphs that came from matplotlib – I was taken by their quality and professionalism, and I immediately spent far more money than can be considered sensible on all sorts of Python books so that I, too, could make maths and art become one and the same thing. It seems, though, that I have the same level of programming ability as a garden slug when it comes to moving languages, and the same sort of speed of movement. It took me three years (ish) at university to learn C [ and ML and Prolog – but let’s be honest, neither of those actually counts as a programming language ] and it’s taken me countless years since to learn to threaten, coerce and cajole Perl into doing my bidding at least 50% of the time.

This, then, is my forced introduction to Python – my baptism of fire ( although God only knows why I think I stand any chance in Python if I can’t do it in Perl ! ). And, not only that, I’m going to push it out here for your ridicule and derision.

I’d like to walk through the Rackspace cloud with you, but that’s for another day – let us just say that I quickly threw up a Fedora 15 (Lovelock) instance to play with, and was deeply relieved that Python appears to be a standard part of the distribution. For reference, my development environment also consists of Komodo Edit, which is excellent, with syntax highlighting for both Perl and Python ( and HTML and C and C++ and … ); it is also, when correctly configured, quite happy using scp to remotely edit files and browse remote directories.

I understand that the Python equivalent of CPAN is PyPI – the Python Package Index – and, after installing the package, I’ve used that to install the Tweepy library. I’m not going to repeat the guidance on creating a new application in either (both!) of the blog links below – what I will say though is that you should remember to set your application settings to Read and Write – otherwise it won’t work ;-)2
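
For completeness, on the Fedora instance that boiled down to a one-liner once pip / setuptools were in place ( the exact packaging will vary by distro ):

pip install tweepy

easy_install tweepy works just as well if pip isn’t to hand.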

I’ve split the examples out so that there is a config file that holds the various keys. Its format is as follows:

[consumer_keys]
CONSUMER_KEY = consumer_key_here
CONSUMER_SECRET = consumer_secret_here
[access_keys]
ACCESS_KEY = access_key_here
ACCESS_SECRET = access_secret_here

Obviously insert your own, hard-earned keys in here – no inverted commas or anything, as they get parsed in a minute with ConfigParser. [ Basically, I couldn’t go through the rest of this worrying about accidentally publishing my keys every five minutes. ] I used the script provided in the example to generate them, although it seems that you can generate these keys for your own Twitter account in the developer section of the site without going through the pain or the learning experience.

I’m getting worried about how long this post is getting – especially after a discussion with a young man the other day who said that his dissertation was only 5,000 words, and I’ve already written a fifth of that ! – so below is the remainder of the sample code for a command-line client. It takes the text after the command ( contained in ' ' ) and updates your status with it ( e.g. ./twitter.py 'It lives!' ):

#!/usr/bin/env python

import sys
import tweepy
from ConfigParser import SafeConfigParser

parser = SafeConfigParser()
parser.read('twitter.conf')

CONSUMER_KEY = parser.get('consumer_keys','CONSUMER_KEY')
CONSUMER_SECRET = parser.get('consumer_keys','CONSUMER_SECRET')
ACCESS_KEY = parser.get('access_keys','ACCESS_KEY')
ACCESS_SECRET = parser.get('access_keys','ACCESS_SECRET')

auth = tweepy.OAuthHandler(CONSUMER_KEY, CONSUMER_SECRET)
auth.set_access_token(ACCESS_KEY, ACCESS_SECRET)
api = tweepy.API(auth)
api.update_status(sys.argv[1])

I’ll write a second post within the next week with a full program to automate the remainder of the posting process – I want to get it running ASAP to be honest, as I think I’m missing out !


1. With thanks to David Moreno’s blog post on the issue as my starting point on OAuth for Perl, and perhaps the first and last bit of it that I understood ! And Jeff Miller’s blog post for the Python equivalent.
2. Which may well be why I couldn’t get the darn Perl version to work, I realise now. However, a kick in the pants, is a kick in the pants for whatever reason it comes …


Elgato game capture HD – A quick review …

It’s been a bit of a perfect storm – the step up in blogging, my son’s birthday request and a review in The Times, all within a few weeks of each other – that has led me to buy an Elgato game capture HD. I wasn’t convinced that, as a life necessity, the need to record a kill streak on MW3 – or, more to the point, my son knifing me in the back yet again – was really that great a plan, but when I discovered that this device was happy to accept any HDMI input, it started to gain ground. For some time I’ve been using the HDMI output from my MacMini, and a couple of laptops ( as well as the PS3 ), as the standard video/audio connection rather than faffing around with other connections, usually with far less successful results, so, as I had been thinking about capturing some video tutorials for the “Introduction to PenTesting” series that I’m running on Forensic Focus, the Elgato gcHD seemed to be quite an interesting fit to the problem.

A standard HDMI connector for hooking up audio/visual equipment. (Photo credit: Wikipedia)

Anyhoo … I wouldn’t go as far as to say that it’s hard to get hold of – Amazon is Amazon, as always – however there isn’t exactly a crowd of people trying to sell you one of these, so perhaps it is rather a niche market. There was a rather long lead time predicted ( 2 to 4 weeks ), although this seems to have disappeared, and mine didn’t take that long to arrive in any case. When it did get here, I must admit to being rather surprised at how small it is. I don’t know what had given me the impression that it would be more substantial, but it is maybe 3.5″ by 2″ by 1″ of rounded, shiny black plastic. It feels substantial enough. What was a pleasant surprise was that it comes with all the required cables – a short HDMI, a special PS3 cable, a composite cable and a USB cable – everything that you need, as it requires no additional power beyond that drawn from the ports.

Connection is easy enough, provided you read the manuals – the device won’t work with the PS3 HDMI out, for example, only the special cable – which did cause a little head scratching ! However, that’s not what I was interested in – I’ll let my son write and post a gaming-perspective guest review shortly – for me, I couldn’t wait to try it with a PC HDMI connection. My quick test rig is a Sony Vaio running Windows 7 with an HDMI out, and my MacBook Pro taking the USB feed. The software isn’t included in the package, but it is only a quick download from the Elgato website. There are both Mac and Windows versions ( so I may try this the other way round before long – I have a Mac MiniDisplayPort to HDMI adaptor cable ) and it took no time at all to download and install. The software isn’t very “feature rich”, but it is straightforward and easy to use.

Basic Recording Screen for Elgato game capture HD on MacOS X

There are a number of built-in submission tools for the likes of YouTube, Facebook and Twitter, but also for conversion to various “i” formats ( iPad, iPhone ), e-mail, ProRes ( whatever that is … ) and just dumping recordings in your movies folder. I’ve not experimented with these yet, but I might have a go with the Twitter one at a later date. I think that the boy has ideas of uploading some things onto YouTube ( partially because I pointed them out to him when The Times article quoted a £60,000-a-month profit for online games review sites ) – again, I’ll let him address the effectiveness of this in his review.

Basic Editing Screen for Elgato game capture HD on MacOS X

I wasn’t that impressed with the editing features of the app, which are, to say the least, basic. I’m a long-term Mac user though, and thus not afraid to nip into the App Store and put down some hard-earned cash on an “i” app – a quick “iMovie” purchase later ( less than £11 ) and I was able to import the .m4v file for some more capability in post-production, as it were.

All in all I’m impressed, I’ll update this as time goes on and there are actually some videos created that I’ve uploaded and might – very might – actually contain some real content …

Until such time, check this one out – this is the Mac with HDMI out, feeding back into itself …


Guest Blog Entries

The new British Computer Society logo (Photo credit: Wikipedia)

the tarsier featured on the cover of Learning the vi Editor has been incorporated into the O’Reilly logo (Photo credit: Wikipedia)

I’ve been very lucky and had the opportunity to write a couple of guest blog entries over the last few days – you can have a look at these links:

http://www.bcs.org/content/conBlogPost/2067 – Is there a place for fear, uncertainty and doubt in security?

and

http://www.josetteorama.com/books/why-should-i-buy-a-book/ – Why should I buy a book ?

Thank you very much to Josette Garcia of O’Reilly (@josettegarcia) and the British Computer Society for letting me write for them !


It’s all about managing risk …

Well, what an interesting turn-up for the books: it seems that a group of CISOs have gotten together, in what I am sure were challenging circumstances ;-), in Hong Kong to figure out that Information Security isn’t all about technology ( http://www.theregister.co.uk/2012/04/25/ciso_advice_risk_management/ ) … I’d like to take this opportunity to point you to my article on “What is ‘good enough’ information security ?” (http://articles.forensicfocus.com/2011/09/19/what-is-good-enough-information-security/) from a while ago.

I reiterate: as security consultants we are risk managers – security needs to be fit for purpose, not a technical solution to a problem that doesn’t exist !


Libyan Money …

To the General Public: Please delete as applicable.

Dear (Sir/Madam/Undecided/Both),

Please let me introduce myself: I am not a former (banker/government official/religious leader) of (Libya/any other nation) where a dubious infrastructure might allow for millions of (pounds/dollars/euros) to become available to me, nor am I the (son/daughter/niece/nephew/2nd cousin twice removed) of an (oil/gas/diamond/other) (millionaire/billionaire/smuggler/other). I am in fact a person who (works for a living/is a security consultant/blogs) and is (honest/trustworthy/not directly after your cash). I am not going to offer you untold riches _but_ I _am_ going to offer you an opportunity to (hold on to what you have/not look like a complete berk/not fund criminals &&|| terrorists).

Listen to me carefully – I realise that this letter may be less believable than the ones you normally receive, partially because the grammar and spelling are close to correct, but also because it doesn’t offer you, someone who has no rhyme or reason to expect it, something for nothing.

I don’t wish to undermine your confidence in any way; however, you haven’t been singled out as an example of upstanding public decency or because of your obvious intelligence – rather because somewhere, at some time, your e-mail address made it onto the public internet. They don’t even know who you are – they have spammed you, along with 100,000,000 other people, in the hope of a 0.1% return on their targets.

If you answer their e-mail, you will be strung along in a classic scam where, over the promise of a large sum of money, they will get you to send them larger and larger (management fees/bribes/charges) until, at last, you have none of your money and they have all of it.

I’ll do you a deal – if it makes you feel better – you can all send me (£10/$10/E10) and I’ll pretend that I’ve got something for you for 10 minutes – I’ll even send you an e-mail with an excuse why it hasn’t been transferred to you immediately – but we’ll call it quits there. I’ll give your money to the NSPCC, and you can keep the rest, and the criminals get none. What do you say ?

I write this now, because, as is the way of the world, when an event happens, the (less ethical/criminals/scum) take advantage of the (more gullible/less savvy/dumb/unfortunate), and I’ve a vain hope that perhaps this might stop just one person falling for it.

The general rule of the world is – “if it sounds too good to be true, it probably is” – if you have any doubt about something why not drop me a line, and I’ll have a look at it for you – if it is true, I’ll ask for my (£10/$10/E10) for the NSPCC and you can keep the rest – if it’s not, which I’m willing to bet on, then you’ve saved yourself a fortune !

Kind Regards,

Si
