Setup logger #1217
Conversation
Looks okay to me. Awaiting feedback from @LisaHopcroft before merging
LisaHopcroft
left a comment
Have left some comments on this review for you to consider. Nothing is wrong as such, but I've made some suggestions for changes.
I appreciate that this is the first round of adding logger messages...are you intending to add them to other functions?
I'd also add them to run_sdl.R - this might be more for @lizihao-anu to consider when he's making changes to that script.
Thanks @LisaHopcroft for reviewing. Tobi is going to look into the changes and tag me for review when this is ready.
Hi @Jennit07 I have reviewed the comments by Lisa and made the updates. These include:
Thanks @OluwatobiOni, I'll have another look at reviewing the changes.
Hi @OluwatobiOni, great work on this. I have one suggested change about the formatting of the logger functions.
Another, larger change I am going to suggest may take some time to develop, but I agree with Lisa's comment that we could add logs to more functions to help with debugging. An example of this can be seen in create_episode_file, where we currently use CLI messages to check which stage of processing we are at. E.g.
cli::cli_alert_info("Create episode file function started at {Sys.time()}")
I think it would be good to take the opportunity, while refactoring and setting up logs, to change each cli message into a log message. We also do not have any cli messages in the read_XX and process_XX functions; it would be a good opportunity to add these there as well, to show which stage of processing we are at. This could also be put into a function, like you have done with the lists.
e.g. something like:
reading {year} {dataset} data from denodo
processing {year} {dataset} data
run tests on {year} {dataset} data
For the CLI messages, I think we could also put these into a function, with all the messages stored in a list and called at each stage. That way, if we need to update any of them, it can be done in one place.
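The message-list idea could be sketched roughly like this (the `slf_messages` and `log_stage` names are hypothetical, and base R string substitution stands in for the glue-style interpolation that logger supports — this is a sketch of the pattern, not the package's implementation):

```r
# Sketch only: store all stage messages in one list so that wording can be
# updated in a single place. In the package this would likely call
# logger::log_info() instead of message().
slf_messages <- list(
  read    = "Reading {year} {dataset} data from Denodo",
  process = "Processing {year} {dataset} data",
  test    = "Running tests on {year} {dataset} data"
)

log_stage <- function(stage, year, dataset, messages = slf_messages) {
  msg <- messages[[stage]]
  # Base R substitution keeps the sketch dependency-free
  msg <- gsub("{year}", year, msg, fixed = TRUE)
  msg <- gsub("{dataset}", dataset, msg, fixed = TRUE)
  message(msg)
  invisible(msg)
}

log_stage("read", "1920", "Acute")
#> Reading 1920 Acute data from Denodo
```

Each read_XX/process_XX function would then call the helper at its entry point, so the stage messages stay consistent and live in one place.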
Happy to help with this. If you have any questions, please let me know! Thanks
Just wanted to mention that, for NSS convenience, please use squash merge. Squash merge combines all commits into one, which helps keep a clean commit history and makes merge conflicts easier to resolve.
(force-pushed 78fdf91 to 41ef4c2)
Hi @Jennit07, the changes have been made:
Thanks @OluwatobiOni, I'll have another look!
Hey @OluwatobiOni could you add
Jennit07
left a comment
Hi @OluwatobiOni, really great work! It looks good and is going to be very helpful as we progress with the automation. A big improvement I noticed is the accuracy of the timestamps with logger, which will be a bonus when running the files! I also note that in some of the read logger messages you have already thought ahead and noted that the data is read from Denodo, which I thought was really good and is going to save us time. I have a few suggestions on the wording of some of the messages: using the full name for each dataset, and adding logger to the DESCRIPTION file. But overall I think this is ready to go once those changes are made. Thanks for all your hard work on this!
Tagging @lizihao-anu in case he is interested in how logger is going to work going forward
Hi @Jennit07, I have made the requested changes. Thanks!
Update: Thanks @OluwatobiOni for all your hard work on this. This is ready to be approved and merged into development. However, it also requires #1257 to be merged into development first; that PR fixes the file paths, which are not working correctly just now. Once #1257 has merged into development, we can merge this one too, pending confirmation from NSS. Thanks
Jennit07
left a comment
Approved - waiting to be merged into development until confirmation from NSS
Hi @LisaHopcroft, could you please confirm you are happy with the setup of logger? Many thanks
Thanks @Jennit07, this looks OK to me, though it will need to be tested. We now have a Snowpipe that should consume these log files once they are copied to the correct S3 bucket, so I will approve this PR and we can progress to testing.
Jennit07
left a comment
Looks good and ready to merge
Initial draft of SLF logs
Closes #1220