Don’t blame sendmail^H^H^H^H^H^H^HFirefox.

On Firefox performance:

For the last couple of months I have been plagued by really bad Firefox performance. My colleagues kept harping on about ditching this old browser and switching to Chromium, the latest, sexiest thingie. But you see, I am an old dog and I really like to keep good software, so I followed every blog post about Firefox performance tuning. The eye-opener was about:addons, which showed that LastPass was “probably problematic”.

Well, I disabled it and, lo and behold, Firefox is no longer pegging two of my CPUs at full throttle in top.

So, don’t blame Firefox! Remember: the net elders did not blame sendmail either!

Dev. Defense

Customers, internal or external for that matter, regularly ask for one thing, imagine another, and actually need a third. How can a developer defend oneself before such customers? Use this; really.

  1. Please tell me what is expected of this product. Better yet, draft it on a napkin.
  2. I have limited telepathy, and it decreases exponentially with distance. Please communicate succinctly via other means.
  3. I did not think of the other 400 possibilities you could have used, because my creativity is limited by my imagination. I never imagined this, or that for that matter.

Artifactory Stats via Perl

#!/usr/bin/perl
#
# Glean data from artifactory
#

use JSON;
use Data::Dumper;

# using api key for akarageo, fix as necessary
$CREDS = "user:apikey";
$API   = 'https://artifactory.mydomain.com/artifactory/api';

# multipliers to turn human-readable sizes back into bytes
%MAG = (bytes => 1,
        KB => 1024,
        MB => 1024 * 1024,
        GB => 1024 * 1024 * 1024,
        TB => 1024 * 1024 * 1024 * 1024);

# if you want to sort by percentage
sub byperc {
    $a->{percentage} <=> $b->{percentage}
        || $a->{filesCount} <=> $b->{filesCount};
}

# if you want to sort by storage size
sub bysize {
    # convert "12.34 GB" style strings to numbers and then compare
    my ($bytes, $mag) = split(' ', $a->{usedSpace});
    $a->{realSpace} = $bytes * $MAG{$mag};

    ($bytes, $mag) = split(' ', $b->{usedSpace});
    $b->{realSpace} = $bytes * $MAG{$mag};

    $b->{realSpace} <=> $a->{realSpace}
        || $a->{filesCount} <=> $b->{filesCount}
        || $a->{itemsCount} <=> $b->{itemsCount};
}

# the most interesting part: one table row per repository
sub printrepositoriesSummaryList {
    my ($perl_scalar) = @_;
    # data size page
    print "<h2>Repositories Summary List</h2>\n";
    print "<table><tr>\n<th>Repository Name</th><th>Percentage</th><th>Used Space</th><th>Folders</th><th>Items</th><th>Files</th></tr>\n";
    my $arr  = $perl_scalar->{repositoriesSummaryList};
    my @sarr = sort bysize @$arr;
    foreach $hashref (@sarr) {
        print "<tr>";
        print "<td>" . $hashref->{'repoKey'} . "</td>";
        print "<td>" . $hashref->{'percentage'} . "</td>";
        print "<td>" . $hashref->{'usedSpace'} . "</td>";
        print "<td>" . $hashref->{'foldersCount'} . "</td>";
        print "<td>" . $hashref->{'itemsCount'} . "</td>";
        print "<td>" . $hashref->{'filesCount'} . "</td>";
        print "</tr>\n";
    }
    print "</table>";
}

# global stats
sub printfileStoreSummary {
    my ($perl_scalar) = @_;
    print "<h2>File Store Summary</h2>";
    print "<table><tr><th>Directory</th><th>Total Space</th><th>Used Space</th><th>Free Space</th></tr><tr>";
    print "<td>" . $perl_scalar->{fileStoreSummary}{storageDirectory} . "</td>\n";
    print "<td>" . $perl_scalar->{fileStoreSummary}{totalSpace} . "</td>\n";
    print "<td>" . $perl_scalar->{fileStoreSummary}{usedSpace} . "</td>\n";
    print "<td>" . $perl_scalar->{fileStoreSummary}{freeSpace} . "</td>\n";
    print "</tr></table>\n";
}

# may be interesting some time
sub printbinariesSummary {
    my ($perl_scalar) = @_;
    print "<h2>Binaries Summary</h2>\n<table>\n";
    foreach $key (keys %{ $perl_scalar->{binariesSummary} }) {
        print "<tr><td>$key</td><td>" . $perl_scalar->{binariesSummary}{$key} . "</td></tr>\n";
    }
    print "</table>\n";
}

# main code
$content = `/usr/bin/curl -s -u ${CREDS} ${API}/storageinfo`;
$json = JSON->new->allow_nonref;
$perl_scalar = $json->decode($content);

#$pretty_printed = $json->pretty->encode($perl_scalar); # pretty-printing
#print $pretty_printed;

print "<h2> Artifactory Statistics </h2>";

printfileStoreSummary($perl_scalar);
#printbinariesSummary($perl_scalar);
printrepositoriesSummaryList($perl_scalar);
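
Note that byperc is defined but never used; if you would rather rank repositories by their share of total storage, swap the comparator inside printrepositoriesSummaryList:

my @sarr = sort byperc @$arr;   # ascending by percentage, then by file count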

A Sr. Geek’s RTFM to diet & exercise

I am in my 50s and I am a geek. I have always been a geek; I was a geek before the term was invented. And, for the love of books, I have always been overweight too.

Being geeky is as socially and physically hard as it is mentally satisfying. I have been trying to bridge that gap all my life, with varying degrees of success, by going to gyms and dieting, and now that I am in my 50s I would like to share my latest findings with my peers. I will not even attempt to give advice to a younger crowd; they must fight this fight by themselves, the chasm is too big.

Well then, my fellow middle-aged geeks, first of all: Harq al-Ada! Break the habit! This is the first step and it is pure psychology. We all get comfortable inside our private misconceptions, such as: I am old and fat and the world should accept it because I am smart. Well, partner, the world cannot see the brilliance within your mind, just your musculature. And this is a good thing too, because cardiovascular disease behaves exactly like that, like a snobbish workout instructor: if one does not fill the eye of the gymnast, one gets sick, terminally.

Break the habit of feeling bad for yourself too. Demand to be present in all aspects of life. Human history is mostly about hunting down and killing animals (and each other on many occasions, but that is another blog post). The pursuit of sageness has certain parallels to hunting, but only literary ones. Our physiology demands muscular work, painful muscular work. So to break the habit one must actively decide to change into a new mental set. ‘Decision’ is the keyword here. Unless one fully commits to this decision, one must not even try to start physical activity. It hurts!

And this leads me to the second and final step, really: embrace the pain. Geeks should feel right at home here. All their life has been painful: pain to obtain knowledge, pain to update mental toolsets, pain at being driven away from social acceptance, pain. How can someone whose second nature is pain not be able to control physical pain? Once I understood the essence of pain in my life, I also understood what makes us geeks: the inability to live complacently.

Dieting hurts, working out hurts, physicality is hurtful, and so is all of life, so what? Hunger is painful, and it is such a basic instinctual response that most of us geeks simply assuage it so that it does not interfere with our cogitation. Wrong! Embrace hunger, study it, feel it, use the Bene Gesserit admonition against fear to control your hunger. You will be amazed how much of it is no more than bodily complacency. A geek is not complacent, isn’t she/he?

Now you have the toolkit to control hunger. Controlling hunger also has nice side effects: it makes one more assertive. One is no longer controlled by a base instinct but is in full command of its handling. Using this same library of mental gears, one can also control other social pressures such as abuse or manhandling, but this is an exercise left to the dieter.

A thread running parallel to dieting is physical exercise. Not the low-impact kind, but the kind that leaves you hurting for two days after you trained. I like weight lifting, what non-geeks call body building. (I had a lot of body to build too: all 120 kg of it!) Workout people are right: unless you push yourself you get no results. Geeks know all about pushing themselves to get results; they have been doing it all their lives. Use this consciously now to get in shape. Shape your muscles, and do not feel that it is base and unseemly for 21st-century people. Our physiology is that of plains apes and there is no shame in that.

There are mental benefits to muscle building too. As muscles build up, the heart gets stronger as well. A stronger heart drives more blood through the whole body, and more blood means more oxygen to the brain. So one actually gets smarter after pumping iron. Also, during the workout, as the brain quiets down, new neurons get a chance to be built, much like in runners’ heads (I need to verify this claim). So you are getting smarter during the workout too. Pain at this point is essential; pain is the trigger for muscle building. Unless you hurt, you are fooling yourself and have not broken the habit.

Use the pain and exhilaration of a good muscle-bursting workout to float yourself above the pain. Observe, like the geek you are, how you change. You will notice not only the obvious physical changes, changes that all of lesser humanity aspires to, but also an easing of mental fatigue, which in and of itself is well worth all the above trouble.

As we Greeks say, “νοῦς ὑγιής ἐν σώματι ὑγιεῖ” (a healthy mind in a healthy body).

DDOSes aren’t that bad for the cloud after all.

First Monday of October 2016, and a friend’s data center is under a DNS amplification attack against cpsc.gov. Having seen it all a number of times before, I was able to thwart it in under 30 minutes flat, but it got me thinking!

How do you mitigate a DDOS attack on your service when you buy the service from a cloud provider? Obviously the automated tools will spawn as many instances as necessary to absorb the attack.

Of course the service will survive, the devops will pat themselves on their collective backs, and the managers will gloat about the great team they run.

What about the cloud provider, then? Oh, they get the sweetest part of the deal: each instance spawned nets them cash.

So am I worried about the cloud being targeted by DDOS attacks? Heck no, as long as I do not foot the bill!

Easily add parallelization to any (Perl) program

Here is the catch: suppose you have a little program that can naturally be parallelized. Suppose you have a collection, and for each entry you want to execute a certain procedure, where all the procedures can run in parallel. Here is a little Perl tidbit that does just that. It spawns a thread per entry, but in batches, so you can keep tabs :-).
Imagine this working in tandem with Memoize; see the sketch right below.
$par_tabs is a global that defines the maximum number of parallel tabs.
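
To make the Memoize remark concrete, here is a minimal sketch; fetch_page below is a hypothetical expensive, pure helper that per_entry_sub might call. Keep in mind that Perl ithreads copy the interpreter, so each thread gets its own cache; memoization pays off mainly for the inline (non-threaded) calls and for repeats within a single thread.

use Memoize;

# hypothetical pure helper: the same $href always yields the same page
sub fetch_page {
    my ($href) = @_;
    return `/usr/bin/curl -s $href`;    # expensive network round trip
}
memoize('fetch_page');    # repeated calls with the same $href now hit a cache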

You need a sub per element like so:

sub per_entry_sub($$$$$){
     my ($game,$evtime,$href,$extraparam1,$extraparam2) = @_;
     ...
     ...
}

and of course you call the parallel executor like so:

par_do_threaded(\&per_entry_sub,$extraparam1,$extraparam2,\%games);

The \&per_entry_sub is a reference to the sub itself; inside the executor it is invoked as $func->(parameters);

Aaand here is the executor:

######################################################################
#
# Parallel Threaded execution, one thread per additional firefox TAB
#
######################################################################
use threads;

sub par_do_threaded($$$$) {
    my ($func, $extraparam1, $extraparam2, $dataref) = @_;
    # dereference the data hash
    my %data = %$dataref;
    # now fetch each entry in parallel
    my $count = 0;
    print "*****************************************************\n";
    print "**                 ENTERING PARALLEL OPERATION     **\n";
    print "*****************************************************\n";
    my $numdata = scalar keys %data;
    print "NUMBER OF ENTRIES=$numdata\n";
    return if ($numdata <= 0);
    my $entrynum = 0;
    foreach my $key (sort keys %data) {
        $entrynum++;
        print "ENTRY NUMBER: $entrynum ENTRY $key\n";
        my $entry  = $key;
        my $href   = $data{$key}{'href'};
        my $evdate = $data{$key}{'date'};
        # do it in batches of $par_tabs
        if ($count < ($par_tabs - 1)) {
            $count++;
            my $thr = threads->create($func, $entry, $evdate, $href,
                                      $extraparam1, $extraparam2);
            print "THREAD " . $thr->tid() . " created\n";
            $thr->detach();
        } else {
            # every $par_tabs-th entry runs inline; this linear call
            # is the throttle that keeps the batches of threads apart
            $func->($entry, $evdate, $href, $extraparam1, $extraparam2);
            $count = 0;
        }
    }
}
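
And a minimal end-to-end run, assuming par_do_threaded from above is in scope; the worker is a stub and all names and data below are illustrative:

use threads;

our $par_tabs = 4;    # at most 4 concurrent slots per batch

sub per_entry_sub {
    my ($game, $evdate, $href, $x1, $x2) = @_;
    print "worker: $game on $evdate -> $href ($x1, $x2)\n";
    sleep 1;    # stand-in for the real per-entry work
}

my %games = (
    match1 => { href => 'http://example.com/1', date => '2016-10-01' },
    match2 => { href => 'http://example.com/2', date => '2016-10-02' },
    match3 => { href => 'http://example.com/3', date => '2016-10-03' },
);

par_do_threaded(\&per_entry_sub, 'extra1', 'extra2', \%games);
sleep 2;    # give detached threads a moment to finish before exit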

Docker testing without compose

Sometimes using docker-compose during testing can be a hassle. Use the following little bit of magic to spawn and test multiple instances.
Your mileage will vary.

#!/bin/bash

#
# A semi-automated way to launch multiple docker containers and test your app's parallelization
# replace the creation of index.php with your favorite git cloned code
# angelos@unix.gr
#

DOCKER="sudo docker"
IMAGE=worker
DOCKERHOST=localhost:3000

function build {
    # a trivial page that echoes the container hostname, so checkem can
    # verify which instance answered; replace with your own code
    echo '<?php echo gethostname(); ?>' > index.php

    echo '
FROM centos:centos6
EXPOSE 80
RUN yum -y update && \
yum -y install epel-release && \
yum -y install mod_php
COPY index.php /var/www/html/index.php
ENTRYPOINT ["/usr/sbin/apachectl", "-D", "FOREGROUND"]
' > Dockerfile

    echo "Building Master Image"
    $DOCKER build . 2>&1 | tee build.log
    id=`grep 'Successfully built' build.log | cut -d" " -f 3`
    if [ "X${id}" == "X" ]
    then
        echo "build failed"
        exit 1
    fi

    $DOCKER tag -f ${id} $IMAGE
}

function runem {
    for instance in `seq 1 $1`
    do
        # might give an error if the container does not exist yet
        $DOCKER rm worker-instance${instance} >& /dev/null

        hash=`$DOCKER run -d \
            -p $((80+${instance})):80 \
            -h worker-instance${instance} --name=worker-instance${instance} \
            $IMAGE`
    done

    echo "======================= Docker Images ======================"
    $DOCKER ps
}

function killem {
    # NAMES is the last column of 'docker ps', hence $NF
    INSTANCES=`$DOCKER ps | grep worker-instance | awk '{print $NF}'`
    for instance in $INSTANCES
    do
        $DOCKER kill ${instance} && $DOCKER rm ${instance}
    done
}

function checkem {
    INSTANCES=`$DOCKER ps | grep worker-instance | wc -l`
    if [ $INSTANCES -le 0 ]
    then
        echo "[ERROR] No Instances found"
        exit 1
    fi
    > usage.txt
    > processes.txt
    for instance in `seq 1 $INSTANCES`
    do
        curl -s http://localhost:$((80+${instance})) | grep worker-instance${instance} >& /dev/null
        if [ $? -ne 0 ]
        then
            echo "Instance ${instance} is not healthy"
        else
            echo "Instance ${instance} is fine"
        fi

        echo worker-instance${instance} >> processes.txt
        $DOCKER exec worker-instance${instance} ps aux >> processes.txt 2>&1

        echo worker-instance${instance} >> usage.txt
        $DOCKER exec worker-instance${instance} w >> usage.txt 2>&1
    done

    echo Process list is in processes.txt , mem/cpu usage in usage.txt
}

function remote {
    # assumes the docker remote API is reachable on localhost:2376
    curl -s http://localhost:2376/containers/worker-instance${1}/stats?stream=false | sed -e 's/[{}]/"/g' | awk -v RS=',"' -F: '{print $1 " " $2}' | sed -e 's/"//g'
}

function usage {
    echo "
Usage:
[build image] $0 -b
[run image instances] $0 -r <count>
[delete image] $0 -d
[kill running instances] $0 -k
[check instances] $0 -c
[check instance via remote api] $0 -a <instance>
"
}

while getopts ":r:a:bcdk" opt; do
    echo "$opt was triggered, Parameter: $OPTARG" >&2
    case $opt in
        a)
            remote $OPTARG
            ;;
        d)
            $DOCKER rmi -f $IMAGE
            ;;
        k)
            echo "Killing Instances"
            killem
            ;;
        r)
            runem $OPTARG
            ;;
        b)
            build
            ;;
        c)
            checkem
            ;;
        \?)
            usage
            exit 1
            ;;
        :)
            echo "Option -$OPTARG requires an argument." >&2
            exit 1
            ;;
    esac
done

Poor man’s XML parser for Jenkins config

Angry Jenkins is a nice tool, but its user configuration lives in XML. So what happens if, on a production server, one cannot install an XML parsing lib?

Here is a little ditty that lists users by role, with the help of stingy (non-greedy) global multiline regexp matching.

#!/usr/bin/perl
#
# List Jenkins users by role without an XML parsing lib
#
$files = `ls /space/jenkins_instances/*/config.xml`;

print " JENKINS USERS by ROLE\n";

@files = split('\n', $files);

foreach my $file (@files) {
    my @parts  = split('/', $file);
    my $client = $parts[3];
    print $client . "=> " . $file . "\n";
    print "-" x 78 . "\n";

    # slurp the whole config so the multiline regexes can work
    my $contents = "";
    open($FILE, "<", $file) || die "Cannot read file $file";
    while (<$FILE>) {
        $contents .= $_;
    }
    close($FILE);

    # non-greedy match of each <role> element
    while ($contents =~ m/<role name="(.*?)" pattern="\.\*">(.*?)<\/role>/gsm) {
        $role      = $1;
        $perms_ids = $2;
        print $role . ":";
        $perms_ids =~ m/<assignedSIDs>(.*)<\/assignedSIDs>/sm;
        $ids = $1;
        while ($ids =~ m/<sid>(.*?)<\/sid>/gsm) {
            print "\t" . $1 . "\n";
        }
        print "\n";
    }
    print "\n\n";
}

JFrog Artifactory speed-up foo

Here is a quick win for JFrog’s Artifactory behind an Apache web server using mod_ajp:

# Compression
######################################################################
  SetOutputFilter DEFLATE
  AddOutputFilterByType DEFLATE text/html text/plain text/xml text/x-js text/javascript text/css
  AddOutputFilterByType DEFLATE application/xml application/xhtml+xml application/x-javascript application/javascript
  AddOutputFilterByType DEFLATE application/json


  BrowserMatch ^Mozilla/4 gzip-only-text/html
  BrowserMatch ^Mozilla/4\.0[678] no-gzip
  BrowserMatch \bMSIE !no-gzip !gzip-only-text/html



  # Don't compress images and binary artifacts
  SetEnvIfNoCase Request_URI \.(?:gif|jpe?g|png)$ no-gzip dont-vary
  SetEnvIfNoCase Request_URI \.(?:exe|t?gz|zip|bz2|sit|rar|jar)$ no-gzip dont-vary
  SetEnvIfNoCase Request_URI \.pdf$ no-gzip dont-vary

  # Enable only to verify operation
  #DeflateFilterNote ratio
  #LogFormat '"%r" %b (%{ratio}n) "%{User-agent}i"' deflate
  #CustomLog /var/log/httpd/deflate_log deflate
  ######################################################################

# Caching Override
# Set up short-term caching (A9200 seconds, about 2.5 hours) on commonly updated files
####################################################################################
  ExpiresActive On
  ExpiresDefault A9200
  Header append Cache-Control "proxy-revalidate"
####################################################################################
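
To verify that compression is actually negotiated end to end, checking the Content-Encoding response header is enough. A minimal sketch in Perl; the URL reuses the placeholder domain from the stats script above:

#!/usr/bin/perl
# does the server answer with gzip when the client asks for it?
use LWP::UserAgent;

my $ua  = LWP::UserAgent->new;
my $res = $ua->get('https://artifactory.mydomain.com/artifactory/api/repositories',
                   'Accept-Encoding' => 'gzip');
print "Content-Encoding: ", ($res->header('Content-Encoding') // 'none'), "\n";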

Swiss Dormant Accounts Scraper

It has been quite a while since I have used Perl, and this dormant accounts’ web page seemed like a challenge. Since I am always up for a good challenge, here is the solution in good old-fashioned Perl.

#!/usr/bin/perl 
#
# Get Rich, Dump stuff from Swiss banks angelos karageorgiou angelos@unix.gr
# 
use HTML::Form;
use WWW::Mechanize;
use HTML::Parser ();
use Data::Dumper;
use HTML::TableExtract;

my $mech = WWW::Mechanize->new();
my $url='https://www.dormantaccounts.ch/narilo/';

# first go into the page, then click on Publications
$mech->get( $url );
$mech->form_number(2);
$mech->click();

my $html = $mech->content();
dump_table($html);

my $cont = 1;
while ($cont) {
    print "-" x 80 . "\n";
    $cont = 0;
    my $form = $mech->form_number(2);
    my $saved_input = undef;
    foreach $input ($form->inputs) {
        if ($input->value eq 'Next') {
            $saved_input = $input;
            $cont = 1;
        }
    }
    # no 'Next' button means we are on the last page
    last unless defined $saved_input;
    $mech->click_button(input => $saved_input);
    dump_table($mech->content());
}

sub dump_table {
    my $html = shift;
    $te = HTML::TableExtract->new();
    $te->parse($html);

    # Examine all matching tables, skipping the layout table at depth 0, count 0
    foreach $ts ($te->tables) {
        next if join(',', $ts->coords) eq "0,0";
        foreach $row ($ts->rows) {
            foreach $col (@$row) {
                $col =~ s/\s+/ /g;    # collapse runs of whitespace
                print "'$col' ;";
            }
            print "\n";
        }
    }
}