I need some help populating an array of structs in Perl.
The data for the array comes from a .sh file with the following format:
108,Country,Location,ap17,ip_149,ssh,model,12/8/2020
The code I am using is as follows:
use strict;
use warnings;
use Class::Struct;

struct(Net_Node => [hostname => '$', dir_ip => '$', access => '$', user => '$', pass => '$']);

my $node = Net_Node->new();
my @nodes;
my $user = "hjack";
my $pass = 'butstalion';
my $line;
my @all;
my $counter = 0;

open(my $fh, '<', "exaple.sh") or die "Failed to open specified file";
#system('clear');

foreach $line (<$fh>) {
    @all = split(',', $line);
    $node->hostname($all[3]);
    $node->dir_ip($all[4]);
    $node->access($all[5]);
    $node->user($user);
    $node->pass($pass);
    $nodes[$counter] = $node;
    $counter++;
}

my $size = @nodes;
print "\n \n";
print("array size = $size\n\n");

$counter = 0;
while ($counter < 20) {
    print($counter, "\n\n");
    print($nodes[$counter]->hostname, "\n");
    print($nodes[$counter]->dir_ip, "\n");
    print($nodes[$counter]->access, "\n");
    print($nodes[$counter]->user, "\n");
    print($nodes[$counter]->pass, "\n\n");
    $counter++;
}
close($fh);
The output of this code is a populated array, but every element contains only the data of the last line processed in the foreach loop. Is there any way to populate the array with all the data from the .sh file?
Thanks in advance.
The data in the file is as follows:
89,Country,Location,sw01,ip_10,ssh,model,12/8/2020
90,Country,Location,sw02,ip_18,ssh,model,12/8/2020
91,Country,Location,sw03,ip_26,ssh,model,12/8/2020
92,Country,Location,sw04,ip_27,ssh,model,12/8/2020
93,Country,Location,sw05,ip_28,ssh,model,12/8/2020
94,Country,Location,sw06,ip_29,ssh,model,12/8/2020
95,Country,Location,ap02,ip_13,ssh,model,12/8/2020
96,Country,Location,ap03,ip_12,ssh,model,12/8/2020
97,Country,Location,ap04,ip_20,ssh,model,12/8/2020
98,Country,Location,ap05,ip_14,ssh,model,12/8/2020
99,Country,Location,ap06,ip_15,ssh,model,12/8/2020
100,Country,Location,ap07,ip_16,ssh,model,12/8/2020
101,Country,Location,ap08,ip_17,ssh,model,12/8/2020
102,Country,Location,ap09,ip_18,ssh,model,12/8/2020
103,Country,Location,ap10,ip_19,ssh,model,12/8/2020
104,Country,Location,ap11,ip_24,ssh,model,12/8/2020
105,Country,Location,ap12,ip_25,ssh,model,12/8/2020
106,Country,Location,ap14,ip_27,ssh,model,12/8/2020
107,Country,Location,ap15,ip_37,ssh,model,12/8/2020
108,Country,Location,ap17,ip_149,ssh,model,12/8/2020
my $node = Net_Node->new();
...
foreach $line (<$fh>) {
    ...
    $nodes[$counter] = $node;
}
creates a single Net_Node instance and overwrites its data on every iteration of the foreach loop. It sounds like you want a new instance for each line of the file, so you should move the Net_Node->new() call inside the loop:
foreach $line (<$fh>) {
    my $node = Net_Node->new();
    ...
    $nodes[$counter] = $node;
}
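For completeness, a minimal sketch of the corrected loop with the assignments filled in; using push also lets you drop the manual counter:

foreach my $line (<$fh>) {
    chomp $line;
    my @all  = split(',', $line);
    my $node = Net_Node->new();      # fresh instance for every line
    $node->hostname($all[3]);
    $node->dir_ip($all[4]);
    $node->access($all[5]);
    $node->user($user);
    $node->pass($pass);
    push @nodes, $node;              # append instead of indexing with a counter
}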
With a simpler data structure like a native Perl hash, you could have appended a copy of the data structure to your list like
$nodes[$counter] = { %$node };
but I would be more reluctant to do that with an object, which might not even be represented internally as a hash reference.
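To make that reuse-and-copy idea concrete, here is a minimal sketch with a plain hash that is reused on every line; pushing { %node } stores a shallow copy rather than the same reference:

my %node;                              # one working hash, reused each iteration
foreach my $line (<$fh>) {
    chomp $line;
    my @all = split(',', $line);
    %node = (
        hostname => $all[3],
        dir_ip   => $all[4],
        access   => $all[5],
        user     => $user,
        pass     => $pass,
    );
    push @nodes, { %node };            # shallow copy of the current contents
}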
Perhaps the code could be implemented in the following shape:
- define the structure of a node
- for readability, define an array with the descriptive fields of interest
- read the file line by line
- temporarily place the data of interest into a hash
- create a new node on each iteration
- fill the node with data from the hash
- store the nodes in an array
- generate output for the nodes' data
#!/usr/bin/env perl
#
# vim: ai ts=4 sw=4

use strict;
use warnings;
use feature 'say';

use Class::Struct Net_Node => [
    hostname => '$',
    dir_ip   => '$',
    access   => '$',
    user     => '$',
    pass     => '$'
];

my @fields = qw/hostname dir_ip access/;
my ($user, $pass) = qw/hjack bustalion/;
my @nodes;

while ( <DATA> ) {
    next if /^$/;                # skip empty lines
    chomp;

    my %data;
    my $node = Net_Node->new;

    @data{@fields} = (split(',', $_))[3..5];

    $node->hostname( $data{hostname} );
    $node->dir_ip( $data{dir_ip} );
    $node->access( $data{access} );
    $node->user( $user );
    $node->pass( $pass );

    push @nodes, $node;
}

say "\nTotal nodes = " . @nodes;

my $counter = 10;

for my $node ( @nodes ) {
    last unless $counter--;
    say "
Hostname : " . $node->hostname . "
Dir_IP : " . $node->dir_ip . "
Access : " . $node->access . "
Userid : " . $node->user . "
Passwd : " . $node->pass;
}
__DATA__
89,Country,Location,sw01,ip_10,ssh,model,12/8/2020
90,Country,Location,sw02,ip_18,ssh,model,12/8/2020
91,Country,Location,sw03,ip_26,ssh,model,12/8/2020
92,Country,Location,sw04,ip_27,ssh,model,12/8/2020
93,Country,Location,sw05,ip_28,ssh,model,12/8/2020
94,Country,Location,sw06,ip_29,ssh,model,12/8/2020
95,Country,Location,ap02,ip_13,ssh,model,12/8/2020
96,Country,Location,ap03,ip_12,ssh,model,12/8/2020
97,Country,Location,ap04,ip_20,ssh,model,12/8/2020
98,Country,Location,ap05,ip_14,ssh,model,12/8/2020
99,Country,Location,ap06,ip_15,ssh,model,12/8/2020
100,Country,Location,ap07,ip_16,ssh,model,12/8/2020
101,Country,Location,ap08,ip_17,ssh,model,12/8/2020
102,Country,Location,ap09,ip_18,ssh,model,12/8/2020
103,Country,Location,ap10,ip_19,ssh,model,12/8/2020
104,Country,Location,ap11,ip_24,ssh,model,12/8/2020
105,Country,Location,ap12,ip_25,ssh,model,12/8/2020
106,Country,Location,ap14,ip_27,ssh,model,12/8/2020
107,Country,Location,ap15,ip_37,ssh,model,12/8/2020
108,Country,Location,ap17,ip_149,ssh,model,12/8/2020
Output
Total nodes = 20
Hostname : sw01
Dir_IP : ip_10
Access : ssh
Userid : hjack
Passwd : bustalion
Hostname : sw02
Dir_IP : ip_18
Access : ssh
Userid : hjack
Passwd : bustalion
Hostname : sw03
Dir_IP : ip_26
Access : ssh
Userid : hjack
Passwd : bustalion
Hostname : sw04
Dir_IP : ip_27
Access : ssh
Userid : hjack
Passwd : bustalion
Hostname : sw05
Dir_IP : ip_28
Access : ssh
Userid : hjack
Passwd : bustalion
Hostname : sw06
Dir_IP : ip_29
Access : ssh
Userid : hjack
Passwd : bustalion
Hostname : ap02
Dir_IP : ip_13
Access : ssh
Userid : hjack
Passwd : bustalion
Hostname : ap03
Dir_IP : ip_12
Access : ssh
Userid : hjack
Passwd : bustalion
Hostname : ap04
Dir_IP : ip_20
Access : ssh
Userid : hjack
Passwd : bustalion
Hostname : ap05
Dir_IP : ip_14
Access : ssh
Userid : hjack
Passwd : bustalion
Alternatively, a slightly different approach can achieve a similar result without the Class::Struct module, using a plain hash reference for each node:
#!/usr/bin/env perl
#
# vim: ai ts=4 sw=4

use strict;
use warnings;
use feature 'say';

my @fields = qw/hostname dir_ip access/;
my ($user, $pass) = qw/hjack bustalion/;
my @nodes;

while ( <DATA> ) {
    next if /^$/;
    chomp;

    my $node;
    $node->@{@fields}       = (split(',', $_))[3..5];
    $node->@{qw/user pass/} = ($user, $pass);

    push @nodes, $node;
}

say "\nTotal nodes = " . @nodes;

for my $node ( @nodes ) {
    say "
Hostname : " . $node->{hostname} . "
Dir_IP : " . $node->{dir_ip} . "
Access : " . $node->{access} . "
Userid : " . $node->{user} . "
Passwd : " . $node->{pass};
}
__DATA__
89,Country,Location,sw01,ip_10,ssh,model,12/8/2020
90,Country,Location,sw02,ip_18,ssh,model,12/8/2020
91,Country,Location,sw03,ip_26,ssh,model,12/8/2020
92,Country,Location,sw04,ip_27,ssh,model,12/8/2020
93,Country,Location,sw05,ip_28,ssh,model,12/8/2020
94,Country,Location,sw06,ip_29,ssh,model,12/8/2020
95,Country,Location,ap02,ip_13,ssh,model,12/8/2020
96,Country,Location,ap03,ip_12,ssh,model,12/8/2020
97,Country,Location,ap04,ip_20,ssh,model,12/8/2020
98,Country,Location,ap05,ip_14,ssh,model,12/8/2020
99,Country,Location,ap06,ip_15,ssh,model,12/8/2020
100,Country,Location,ap07,ip_16,ssh,model,12/8/2020
101,Country,Location,ap08,ip_17,ssh,model,12/8/2020
102,Country,Location,ap09,ip_18,ssh,model,12/8/2020
103,Country,Location,ap10,ip_19,ssh,model,12/8/2020
104,Country,Location,ap11,ip_24,ssh,model,12/8/2020
105,Country,Location,ap12,ip_25,ssh,model,12/8/2020
106,Country,Location,ap14,ip_27,ssh,model,12/8/2020
107,Country,Location,ap15,ip_37,ssh,model,12/8/2020
108,Country,Location,ap17,ip_149,ssh,model,12/8/2020
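One note on the slice syntax above: $node->@{ ... } is a postfix-dereference hash slice, which requires Perl 5.24 or newer (or the experimental postderef feature on 5.20/5.22). On older perls the classic slice form is equivalent:

@{$node}{@fields}       = (split(',', $_))[3..5];
@{$node}{qw/user pass/} = ($user, $pass);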
I am running a crontab as described below:
* 1 * * * /var/fdp/reportingscript/an_outgoing_tps_report.pl
* 1 * * * /var/fdp/reportingscript/an_processed_rule_report.pl
* 1 * * * /var/fdp/reportingscript/sdp_incoming_traffic_tps_report.pl
* 1 * * * /var/fdp/reportingscript/en_outgoing_tps_report.pl
* 1 * * * /var/fdp/reportingscript/en_processed_rule_report.pl
* 1 * * * /var/fdp/reportingscript/rs_incoming_traffic_report.pl
* 1 * * * /var/fdp/reportingscript/an_summary_report.pl
* 1 * * * /var/fdp/reportingscript/en_summary_report.pl
* 1 * * * /var/fdp/reportingscript/user_report.pl
and getting an error (the error is the same for all scripts):
DBI connect('dbname=scs;host=192.168.18.23;port=5432','postgres',...) failed: FATAL: sorry, too many clients already at /var/fdp/reportingscript/sdp_incoming_traffic_tps_report.pl line 38.
Moreover, if I run the scripts manually, one at a time, no error is shown.
For your reference, I am also attaching the script that produced the error above:
#!/usr/bin/perl
use strict;
use FindBin;
use lib $FindBin::Bin;
use Time::Local;
use warnings;
use DBI;
use File::Basename;
use CONFIG;
use Getopt::Long;
use Data::Dumper;
my $channel;
my $circle;
my $daysbefore;
my $dbh;
my $processed;
my $discarded;
my $db_name = "scs";
my $db_vip = "192.168.18.23";
my $db_port = "5432";
my $db_user = "postgres";
my $db_password = "postgres";
#### code to redirect all console output in log file
my ( $seco_, $minu_, $hrr_, $moday_, $mont_, $years_ ) = localtime(time);
$years_ += 1900;
$mont_ += 1;
my $timestamp = sprintf( "%d%02d%02d", $years_, $mont_, $moday_ );
$timestamp .= "_" . $hrr_ . "_" . $minu_ . "_" . $seco_;
print "timestamp is $timestamp \n";
my $logfile = "/var/fdp/log/reportlog/sdp_incoming_report_$timestamp";
print "\n output files is " . $logfile . "\n";
open( STDOUT, ">", $logfile ) or die("$0:dup:$!");
open STDERR, ">&STDOUT" or die "$0: dup: $!";
my ( $sec_, $min_, $hr_, $mday_, $mon_, $year_ ) = localtime(time);
$dbh = DBI->connect( "DBI:Pg:dbname=$db_name;host=$db_vip;port=$db_port",
"$db_user", "$db_password", { 'RaiseError' => 1 } );
print "\n Dumper is " . $dbh . "\n";
my $sthcircle = $dbh->prepare("select id,name from circle");
$sthcircle->execute();
while ( my $refcircle = $sthcircle->fetchrow_hashref() ) {
print "\n dumper for circle is " . Dumper($refcircle);
my $namecircle = uc( $refcircle->{'name'} );
my $idcircle = $refcircle->{'id'};
$circle->{$namecircle} = $idcircle;
print "\n circle name : " . $namecircle . "id is " . $idcircle;
}
sub getDate {
my $daysago = shift;
$daysago = 0 unless ($daysago);
my @months = qw(Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec);
my ( $sec, $min, $hour, $mday, $mon, $year, $wday, $yday, $isdst ) = localtime( time - ( 86400 * $daysago ) );
# YYYYMMDD, e.g. 20060126
$year_ = $year + 1900;
$mday_ = $mday;
$mon_ = $mon + 1;
return sprintf( "%d-%02d-%02d", $year + 1900, $mon + 1, $mday );
}
GetOptions( "d=i" => \$daysbefore );
my $filedate = getDate($daysbefore);
print "\n filedate is $filedate \n";
my @basedir = CONFIG::getBASEDIR();
print "\n array has basedir" . Dumper(@basedir);
$mon_ = "0" . $mon_ if ( defined $mon_ && $mon_ <= 9 );
$mday_ = "0" . $mday_ if ( defined $mday_ && $mday_ <= 9 );
foreach (@basedir) {
my $both = $_;
print "\n dir is $both \n";
for ( keys %{$circle} ) {
my $path = $both;
my $circleid = $_;
print "\n circle is $circleid \n";
my $circleidvalue = $circle->{$_};
my $file_csv_path = "/opt/offline/reports/$circleid";
my %sdp_hash = ();
print "\n file is $file_csv_path csv file \n";
if ( -d "$file_csv_path" ) {
} else {
mkdir( "$file_csv_path", 0755 );
}
my $csv_new_file
= $file_csv_path
. "\/FDP_"
. $circleid
. "_SDPINCOMINGTPSREPORT_"
. $mday_ . "_"
. $mon_ . "_"
. $year_ . "\.csv";
print "\n file is $csv_new_file \n";
print "\n date:$year_-$mon_-$mday_ \n";
open( DATA, ">>", $csv_new_file );
$path = $path . $circleid . "/Reporting/EN/Sdp";
print "\n *****path is $path \n";
my @filess = glob("$path/*");
foreach my $file (@filess) {
print "\n Filedate ---------> $filedate file is $file \n";
if ( $file =~ /.*_sdp.log.$filedate-*/ ) {
print "\n found file for $circleid \n";
my $x;
my $log = $file;
my @a = split( "-", $file );
my $starttime = $a[3];
my $endtime = $starttime;
my $sdpid;
my $sdpid_value;
$starttime = "$filedate $starttime:00:00";
$endtime = "$filedate $endtime:59:59";
open( FH, "<", "$log" ) or die "cannot open < $log: $!";
while (<FH>) {
my $line = $_;
print "\n line is $line \n";
chomp($line);
$line =~ s/\s+$//;
my @a = split( ";", $line );
$sdpid = $a[4];
my $stat = $a[3];
$x->{$sdpid}->{$stat}++;
}
close(FH);
print "\n Dumper is x:" . Dumper($x) . "\n";
foreach my $sdpidvalue ( keys %{$x} ) {
print "\n sdpvalue us: $sdpidvalue \n";
if ( exists( $x->{$sdpidvalue}->{processed} ) ) {
$processed = $x->{$sdpidvalue}->{processed};
} else {
$processed = 0;
}
if ( exists( $x->{$sdpidvalue}->{discarded} ) ) {
$discarded = $x->{$sdpidvalue}->{discarded};
} else {
$discarded = 0;
}
my $sth_new1 = $dbh->prepare("select id from sdp_details where sdp_name='$sdpid' ");
print "\n sth new is " . Dumper($sth_new1);
$sth_new1->execute();
while ( my $row1 = $sth_new1->fetchrow_hashref ) {
$sdpid_value = $row1->{'id'};
print "\n in hash rowref from sdp_details table " . Dumper($sdpid_value);
}
my $sth_check
= $dbh->prepare(
"select processed,discarded from sdp_incoming_tps where circle_id='$circleidvalue' and sdp_id='$sdpid_value' and start_time='$starttime' and end_time='$endtime'"
);
print "\n Dumper for bhdatabase statement is " . Dumper($sth_check);
$sth_check->execute();
my $duplicate_row = 0;
my ( $success_, $failure_ );
while ( my $row_dup = $sth_check->fetchrow_hashref ) {
print "\n row_dup is " . Dumper($row_dup);
$duplicate_row = 1;
$success_ += $row_dup->{'processed'};
$failure_ += $row_dup->{'discarded'};
}
if ( $duplicate_row == 0 ) {
my $sth
= $dbh->prepare(
"insert into sdp_incoming_tps (id,circle_id,start_time,end_time,processed,discarded,sdp_id) select nextval('sdp_incoming_tps_id'),'$circleidvalue','$starttime','$endtime','$processed','$discarded','$sdpid_value' "
);
$sth->execute();
} else {
$success_ += $processed;
$failure_ += $discarded;
my $sth
= $dbh->prepare(
"update sdp_incoming_tps set processed=$success_,discarded=$failure_ where circle_id='$circleidvalue' and sdp_id='$sdpid_value' and start_time='$starttime' and end_time='$endtime'"
);
$sth->execute();
}
# my $file_csv_path = "/opt/offline/reports/$circleid";
# my %sdp_hash = ();
# if ( -d "$file_csv_path" ) {
# } else {
# mkdir( "$file_csv_path", 0755 );
# }
# my $csv_new_file = $file_csv_path . "\/FDP_" . $circleid . "_SDPINCOMINGTPSREPORT_". $mday_ . "_" . $mon_ . "_" . $year_ . "\.csv";
print "\n file is $csv_new_file \n";
print "\n date:$year_-$mon_-$mday_ \n";
close(DATA);
open( DATA, ">>", $csv_new_file ) or die("cant open file : $! \n");
print "\n csv new file is $csv_new_file \n";
my $sth_new2 = $dbh->prepare("select * from sdp_details");
$sth_new2->execute();
while ( my $row1 = $sth_new2->fetchrow_hashref ) {
my $sdpid = $row1->{'id'};
$sdp_hash{$sdpid} = $row1->{'sdp_name'};
}
#print "\n resultant sdp hash".Dumper(%sdp_hash);
#$mon_="0".$mon_;
print "\n timestamp being matched is $year_-$mon_-$mday_ \n";
print "\n circle id value is $circleidvalue \n";
my $sth_new
= $dbh->prepare(
"select * from sdp_incoming_tps where date_trunc('day',start_time)='$year_-$mon_-$mday_' and circle_id='$circleidvalue'"
);
$sth_new->execute();
print "\n final db line is " . Dumper($sth_new);
my $str = $sth_new->{NAME};
my @str_arr = @$str;
shift(@str_arr);
shift(@str_arr);
my @upper = map { ucfirst($_) } @str_arr;
$upper[4] = "Sdp-Name";
my $st = join( ",", @upper );
$st = $st . "\n";
$st =~ s/\_/\-/g;
#print $fh "sep=,"; print $fh "\n";
print DATA $st;
while ( my $row = $sth_new->fetchrow_hashref ) {
print "\n found matching row \n";
my $row_line
= $row->{'start_time'} . ","
. $row->{'end_time'} . ","
. $row->{'processed'} . ","
. $row->{'discarded'} . ","
. $sdp_hash{ $row->{'sdp_id'} } . "\n";
print "\n row line matched is " . $row_line . "\n";
print DATA $row_line;
}
close(DATA);
}
} else {
next;
}
}
}
}
$dbh->disconnect;
Please help: how can I avoid this error?
Thanks in advance.
The immediate problem, as indicated by the error message, is that running all of those scripts at once requires more database connections than the server will allow. If they run fine individually, then running them individually will fix that.
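For reference, the "too many clients" message means PostgreSQL has hit its max_connections limit (100 in a stock configuration). If you want to see the configured value, a small sketch using the same connection parameters as the script:

use DBI;

# same connection parameters as in the report script
my $dbh = DBI->connect(
    "DBI:Pg:dbname=scs;host=192.168.18.23;port=5432",
    "postgres", "postgres", { RaiseError => 1 },
);

# SHOW max_connections returns the server-side connection limit
my ($max) = $dbh->selectrow_array('SHOW max_connections');
print "server allows at most $max client connections\n";

$dbh->disconnect;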
The underlying problem is that your crontab is wrong. * 1 * * * will run all the scripts every minute from 0100 to 0159 each day. If they take more than one minute to complete, then a new set will start before the previous set completes, requiring an additional set of database connections, which will run through the pool of available connections rather quickly.
I assume that you only need to run your daily scripts once per day, not sixty times, so change that to 5 1 * * * to run them only once, at 0105.
If there's still an issue, run each one on a different minute (which is probably a good idea anyhow):
5 1 * * * /var/fdp/reportingscript/an_outgoing_tps_report.pl
10 1 * * * /var/fdp/reportingscript/an_processed_rule_report.pl
15 1 * * * /var/fdp/reportingscript/sdp_incoming_traffic_tps_report.pl
20 1 * * * /var/fdp/reportingscript/en_outgoing_tps_report.pl
25 1 * * * /var/fdp/reportingscript/en_processed_rule_report.pl
30 1 * * * /var/fdp/reportingscript/rs_incoming_traffic_report.pl
35 1 * * * /var/fdp/reportingscript/an_summary_report.pl
40 1 * * * /var/fdp/reportingscript/en_summary_report.pl
45 1 * * * /var/fdp/reportingscript/user_report.pl
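If two runs could still overlap (for example, one of the reports occasionally takes longer than its slot), a simple lock guard near the top of each script makes the later run exit instead of opening yet another database connection. A minimal sketch; the lock-file path is an arbitrary example:

#!/usr/bin/perl
use strict;
use warnings;
use Fcntl ':flock';

# one lock file per script; the path here is only an example
open my $lock, '>', '/var/tmp/sdp_incoming_traffic_tps_report.lock'
    or die "cannot open lock file: $!";

# non-blocking exclusive lock: if another run already holds it, give up immediately
unless ( flock( $lock, LOCK_EX | LOCK_NB ) ) {
    warn "previous run still in progress, exiting\n";
    exit 0;
}

# ... connect to the database and generate the report while the lock is held ...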