Writing a streaming S3 object archiving tool that collects old objects (in my case they are near-zero in size, but each still occupies a 4KB block). The data comes to about 10GB daily, so the whole process has to be streamed rather than buffered, to avoid holding that much in RAM.
These are audit data, like external system requests/responses kept for possible investigations, and packing them into archives saves a lot of space. Initially written in Python, now practicing with Rust. The container image is only 2.2MB :)
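Roughly, the streaming loop looks like this. This is only a minimal sketch under my assumptions: it uses the aws-sdk-s3, aws-config, tar, and tokio crates, the bucket/prefix/output names are placeholders, and the filtering of "old" objects (by date prefix or LastModified) isn't shown. Accessor shapes also vary a bit between SDK versions.

```rust
use std::fs::File;

use aws_sdk_s3::Client;
use tar::{Builder, Header};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Placeholder bucket/prefix; the real tool would pick these per day.
    let bucket = "my-audit-bucket";
    let prefix = "audit/2024-01-01/";

    let config = aws_config::load_defaults(aws_config::BehaviorVersion::latest()).await;
    let client = Client::new(&config);

    // Write a plain tar file to disk; a gzip/zstd encoder could be wrapped around it.
    let mut archive = Builder::new(File::create("audit-2024-01-01.tar")?);

    // Paginate the listing so the full key set is never held in memory at once.
    let mut pages = client
        .list_objects_v2()
        .bucket(bucket)
        .prefix(prefix)
        .into_paginator()
        .send();

    while let Some(page) = pages.next().await {
        let page = page?;
        for object in page.contents() {
            let Some(key) = object.key() else { continue };

            // Each object is tiny, so buffering one body at a time is cheap;
            // only the overall run is streamed, never the whole 10GB.
            let body = client
                .get_object()
                .bucket(bucket)
                .key(key)
                .send()
                .await?
                .body
                .collect()
                .await?
                .into_bytes();

            // Append the object to the tar archive under its S3 key.
            let mut header = Header::new_gnu();
            header.set_size(body.len() as u64);
            header.set_mode(0o644);
            header.set_cksum();
            archive.append_data(&mut header, key, &body[..])?;
        }
    }

    archive.finish()?;
    Ok(())
}
```

The point of the design is that memory stays flat: the paginator yields one page of keys at a time, and each tiny body is appended and dropped before the next one is fetched, so the space win comes from packing thousands of sub-4KB objects into one archive.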