python - Gsutil - Issue with downloading data from GCS buckets


I have data in GCS buckets. I have an application that runs a gsutil command to download data from the buckets for further processing.

Before

  • I had gsutil 4.7 installed, and my .boto file was configured with an OAuth refresh token, proxy host name, proxy port, and project ID.
  • My application worked fine, with data being downloaded by calling the following command via the Python subprocess module.

    "gsutil -m cp -r gcs_path destination_path"

As pointed out above, this command is run from within the Python application; a rough sketch of the call is below.
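For context, the call from Python looks roughly like the following sketch. The paths, the argument-list form (rather than a shell string), and the output capture are my assumptions for illustration, not the exact application code.

    import subprocess

    # Hypothetical paths -- the real gcs_path and destination_path come from the application.
    gcs_path = "gs://my-bucket/some/prefix"
    destination_path = "/local/destination"

    # Invoke gsutil as an argument list and capture its combined output.
    proc = subprocess.Popen(
        ["gsutil", "-m", "cp", "-r", gcs_path, destination_path],
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
    )
    out, _ = proc.communicate()
    print(out.decode())
    print("gsutil exit code:", proc.returncode)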

Now

  • Suddenly, I noticed that downloaded files are smaller than the objects I see in the GCS buckets. For example, a file that is 49 MB in the bucket gets downloaded partially/incompletely, ending up at anywhere from 0-20 KB, and so on.
  • This happens when gsutil is called from within the Python application.
  • Moreover, the stdout captured while the application runs the gsutil command shows 49 MB out of 49 MB downloaded. But when I go to the destination directory and check the file sizes (along the lines of the check sketched after this list), they are way off.
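The size check itself is along these lines; the object and local paths are placeholders, and parsing "gsutil ls -l" plus calling os.path.getsize is just one way to read the two sizes.

    import os
    import subprocess

    # Placeholder names standing in for the real 49 MB file.
    remote_object = "gs://my-bucket/some/prefix/bigfile.dat"
    local_file = "/local/destination/bigfile.dat"

    # Remote size: the first column of "gsutil ls -l" is the object size in bytes.
    listing = subprocess.check_output(["gsutil", "ls", "-l", remote_object]).decode()
    remote_bytes = int(listing.split()[0])

    # Local size as actually written to disk.
    local_bytes = os.path.getsize(local_file)

    print("remote:", remote_bytes, "bytes, local:", local_bytes, "bytes")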

However, when I try the same gsutil command outside the Python application, i.e. run the following Linux command in a terminal, the data is downloaded fully; the complete 49 MB out of 49 MB arrives on disk.

$ gsutil -m cp -r gcs_path destination_path

I am not running out of disk space. Upgrading gsutil from 4.7 to 4.13 did not help.

Is there something I may be missing here? Thanks in advance!

