Stdout/stderr output in bucket does not seem to be updated while task is running

I have a method/task that I've run before, during (it failed), and after the recent update. For some reason I no longer see anything in the associated bucket folder's stdout and stderr log files while the task is running (the files have a size of 0). New stderr and stdout files now appear only once the run has completed. To add insult to injury, these files have content type application/octet-stream, which makes them a bit more difficult to view (they have to be downloaded). Is this the intended behavior? Am I doing something wrong?
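
For reference, the completed logs can be read straight from the bucket instead of being downloaded, which at least sidesteps the application/octet-stream issue. Below is a minimal sketch using the google-cloud-storage Python client; the bucket name and object path are placeholders for the workspace bucket and submission path.

    # Inspect a task's stdout object directly instead of downloading it via the UI.
    # BUCKET and OBJECT are placeholders -- substitute the workspace bucket and the
    # submission/workflow/call path for the run in question.
    from google.cloud import storage

    BUCKET = "fc-00000000-0000-0000-0000-000000000000"
    OBJECT = "<submission-id>/<workflow-name>/<workflow-id>/call-<task>/stdout"

    client = storage.Client()
    blob = client.bucket(BUCKET).blob(OBJECT)

    if not blob.exists():
        print("stdout object not found (yet)")
    else:
        blob.reload()  # fetch current size, content type, and update time
        print(f"{blob.size} bytes, {blob.content_type}, updated {blob.updated}")
        if blob.size:
            # Print the log text directly; no need to download a file just to read it.
            # download_as_text() assumes a recent client; older versions use download_as_string().
            print(blob.download_as_text())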

Thanks!

Answers

  • RobinK Member

    I have noticed the same thing. It is making it difficult to see if a job is stuck or if it is just taking a long time for legitimate reasons.

  • kaan Member, Broadie

    Would you mind sharing your workspace with [email protected], and telling us the name of the workspace as well as any submission and workflow IDs involved?

  • kaan Member, Broadie

    In fact, sharing the workspace won't be necessary. A ticket has been filed to address these issues. Thank you very much for reporting!

  • francois_a Member, Broadie ✭✭

    Probably redundant at this point, but I'm also seeing this.

  • Tiffany_at_Broad Cambridge, MA Member, Administrator, Broadie, Moderator admin
    edited August 2018

    Thank you for chiming in, @francois_a, and I encourage others to do so as well. The team is aware of the issue and working to resolve it ASAP. Given the latest release, the team wants to be super diligent in the testing and release process, so it will likely be patched in the next 2-3 weeks. If it can happen sooner, we will let you know; that is certainly the goal.

  • matan Member

    I'd cast my vote for making this a top-priority item. It is making development and use of FireCloud pipelines much more cumbersome: it is virtually impossible to know whether a job is stuck or actually making progress. For example, if a job has run out of memory, it can look to a "supervising" script as though it is making progress when in fact nothing is happening under the hood. This is generating needless costs at this juncture.

  • Tiffany_at_Broad Cambridge, MA Member, Administrator, Broadie, Moderator admin

    You've been heard -- thank you, @matan. We will let you know as soon as it gets released.

  • francois_a Member, Broadie ✭✭

    Are there any updates on restoring this functionality?

  • Ruchi Member, Broadie, Dev admin
    edited September 2018

    Hey @francois_a, @matan

    This behavior was patched and manually tested before being released to production a few weeks back. The change should have ensured that stdout/stderr logs are flushed every 5 minutes -- as long as the tool is writing to stdout/stderr. Can you please share an example workflow where this behavior is missing?
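
    One thing worth double-checking on the tool side: the periodic flush can only pick up output that has actually left the process's stdio buffers. If the tool inside the task is, say, a Python script (a hypothetical example, not FireCloud code), a long-running loop can keep everything buffered until the process exits:

        import time

        for i in range(1000):
            time.sleep(1)  # stand-in for a unit of real work
            # flush=True pushes the line out of Python's stdout buffer immediately,
            # so the periodic log copy has something to pick up.
            print(f"processed chunk {i}", flush=True)

        # Alternatively, run the script with `python -u` or set PYTHONUNBUFFERED=1
        # so writes to stdout are not buffered at all.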

  • francois_a Member, Broadie ✭✭

    I'm still not seeing stdout/stderr for any task while they're running.

    Here's an example workflow ID: 38e8892f-ea8d-4a72-8809-dedae6e101b0 (submission: 1eaaefa6-41a9-4d8d-bd2e-d1dd69fb8c63).

  • francois_a Member, Broadie ✭✭

    More specifically, I can see the logs being written to the bucket, but in the FireCloud UI I get the following error:

    {"status":404,"response":{"req":{"method":"HEAD","url":"https://fc-3b49c5ad-a193-4e03-87db-1b523bc64921.storage.googleapis.com/1eaaefa6-41a9-4d8d-bd2e-d1dd69fb8c63%2Frseqc_tin_workflow%2Ffadb6155-05fe-4926-9fff-c97627298cb3%2Fcall-rseqc_tin%2Fattempt-3%2Fstdout","headers":{"user-agent":"node-superagent/3.8.3","authorization":"Bearer ya29.c.EloeBjjeMYJ8gK5Bx9zOc_nN7IlaHJJKl5F_vDApKenviItjo_OHCRk2NOkrpukDsXID79V2Yp4uCfCYf19NhNftgaUGTO51rTlM66rCeoY0nlOOhxIdc9Dpl34"}},"header":{"x-guploader-uploadid":"AEnB2UoEpESvE6XYrTqxxwgnuLw-wr7_YuLDkia_veDyRZHQLNEULoDCFWqbMYtGvlki1-lJ9Z7PR2CK_0-UwBKOh8gRCsAlwA","content-type":"application/xml; charset=UTF-8","content-length":"326","date":"Thu, 20 Sep 2018 00:23:38 GMT","expires":"Thu, 20 Sep 2018 00:23:38 GMT","cache-control":"private, max-age=0","server":"UploadServer","alt-svc":"quic=\":443\"; ma=2592000; v=\"44,43,39,35\"","connection":"close"},"status":404}}
    
  • agraubert Member, Broadie

    It looks like, while the job is running, stdout/stderr are written to the bucket as files named {task-name}-std{out/err}.log, but the job monitor page only checks for std{out/err}. Since the logs work fine after the job finishes, maybe the files get copied to the expected location on completion?
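
    A quick way to check which of the two names actually exists while a job is running is to list everything under the call's folder. A rough sketch with the google-cloud-storage Python client, using the bucket and prefix from the 404 response above:

        from google.cloud import storage

        BUCKET = "fc-3b49c5ad-a193-4e03-87db-1b523bc64921"
        PREFIX = ("1eaaefa6-41a9-4d8d-bd2e-d1dd69fb8c63/rseqc_tin_workflow/"
                  "fadb6155-05fe-4926-9fff-c97627298cb3/call-rseqc_tin/attempt-3/")

        client = storage.Client()
        # List every object under the running call's folder to see whether the logs
        # show up as {task-name}-std{out/err}.log, plain stdout/stderr, or both.
        for blob in client.bucket(BUCKET).list_blobs(prefix=PREFIX):
            print(f"{blob.name}  ({blob.size} bytes, updated {blob.updated})")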

  • Ruchi Member, Broadie, Dev admin

    Hey @agraubert -- would you mind giving the name of the workspace that workflow belongs to? I might have missed this, but has this workspace already been shared with [email protected]? Thanks!
