Need help using Perl's IO::Handle::sync on a 64-bit Strawberry Perl installation
I'm a .NET developer with no Perl experience. I have to set up a Perl script as a scheduled task on my company's 64-bit Windows Server 2012 machine. The script was written by another department in the company, and my department has had to take over handling it. Strawberry Perl (64-bit) 5.20.2.1 is installed on the server. I've managed to figure out how to install Perl, change the relevant info in the program, etc., so that it points to the new server.
When I try to run the script, I get an error when the program tries to read data from a CSV and upsert it into the database: "IO::Handle::sync not implemented on this architecture".
IO::Handle is installed on the server, but I can't install IO::Handle::Sync. I think this has something to do with the server being 64-bit?
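In case it helps narrow things down, here is a minimal test I'd expect to reproduce the problem outside the script (this is just a sketch I put together; sync_test.tmp is a throwaway file name, and I'm guessing at the exact behavior):

use strict;
use warnings;
use IO::Handle;

# Open a scratch file, write something, and flush it.
open my $fh, '>', 'sync_test.tmp' or die "sync_test.tmp: $!";
print {$fh} "test\n";
$fh->flush;

# On builds of Perl where fsync() is unavailable, sync() dies with
# a "not implemented on this architecture" error, which eval catches.
if (eval { $fh->sync; 1 }) {
    print "sync appears to work on this build\n";
}
else {
    print "sync failed: $@";
}

close $fh or die "sync_test.tmp: $!";
unlink 'sync_test.tmp';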
I don't know enough Perl to feel comfortable changing the script to use a different module, and I don't have enough time to learn the language before this has to be set up. Is there something I can do to make IO::Handle::sync work on my system? Can I install 32-bit Strawberry Perl on a 64-bit server? And if so, would that solve the problem?
here's function i'm having problems with:
# Convert a CSV data file to bulk insert format.
sub convert_data_file ($$$$$$$) {
    my ($omniture_dbh, $omniture_mappings, $feed, $file, $basename, $table, $rsid) = @_;

    # Open the CSV data file.
    print "Processing data file: $file";
    open my $in, "<:encoding(utf8)", $file or die "$file: $!\n";

    # Remember the start time.
    my $start = time;

    # Discard the byte-order mark if present.
    seek $in, 0, 0 if sysread $in, $_, 1 and $_ ne "\x{FEFF}";

    # Initialize the CSV parser.
    my $csv = Text::CSV_XS->new({
        binary              => 1,
        auto_diag           => 2,
        allow_loose_quotes  => 1,
        allow_loose_escapes => 1,
        allow_whitespace    => 1,
    });

    # Read the header row from the CSV data file.
    my $header = $csv->getline($in) or die "Header line missing!\n";

    # CSV row.
    my $row = {};

    # Bind the CSV data to hash elements keyed by header field name.
    $csv->bind_columns(\@{$row}{@{$header}});

    # Map the CSV fields to database columns.
    my ($column_source, $hooks) = map_csv_fields $omniture_dbh, $omniture_mappings, $feed, $header, $row;

    # Map the database columns for the bulk insert.
    my $fields = map_database_columns $omniture_dbh, $table, $row, $basename, $rsid, $feed->{feed_version}, $column_source;

    # Use a ".data" extension for the bulk insert data file.
    my $bulk_insert_file = "$basename.data";

    # Open the bulk insert data file.
    open my $out, ">:encoding(UCS-2LE)", $bulk_insert_file or die "$bulk_insert_file: $!\n";

    # Write the byte-order mark (BOM).
    print $out "\x{FEFF}";

    # Record counter.
    my $records = 0;

    # Read the data rows from the CSV data file.
    while ($csv->getline($in)) {
        # Call hooks as necessary.
        $_->() foreach @{$hooks};

        # Create the bulk insert data record from the mapped data values.
        $_ = join "|~|", map { ${$_}; } @{$fields};

        # Unescape hex escapes.
        s/\\x([a-fA-F0-9]{2})/pack "C", hex $1/eg;

        # Strip ASCII control codes (except newline/tab) and invalid UCS-2 characters.
        {
            no warnings;
            tr/\n\t\x{0020}-\x{D7FF}\x{E000}-\x{FFFF}//cd;
        }

        # Unescape backslashes, newlines and tabs.
        s/\\(\\|\n|\t)/$1/g;

        # Write the data record and record terminator to the bulk insert data file.
        print $out "$_|~~|\n";

        # Increment the record counter.
        $records++;
    }

    # Close the CSV data file.
    close $in or die "$file: $!";

    # Flush the output buffers.
    $out->flush;

    # Sync the file to disk.
    $out->sync;

    # Close the bulk insert data file.
    close $out or die "$bulk_insert_file: $!";

    # Print an informational message.
    printf " (%d records converted in %.2f seconds)\n", $records, time - $start;

    # Return the variables of interest.
    return $column_source, $bulk_insert_file, $records;
}
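If editing the script really is the only option, would it be reasonable to replace the flush/sync lines near the end with something like the following? This is just a guess on my part as a non-Perl person: skip the sync when the platform doesn't support it, on the assumption that close still writes the data out.

    # Flush Perl's output buffers as before.
    $out->flush;

    # Attempt to force the data to disk, but carry on with a warning
    # if fsync isn't available on this build of Perl.
    unless (eval { $out->sync; 1 }) {
        warn "sync skipped: $@";
    }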