
Writing a streaming S3 object archiving tool that collects all old objects (in my case they are near-zero in size, but each occupies a 4KB block). The data comes to about 10GB daily, so I have to stream the whole process rather than hold that much in RAM.

These are audit data, like external system requests/responses kept for possible investigations. Archiving them this way saves a lot of space. Initially written in Python, now practicing with Rust. The container image is only 2.2MB :)
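A minimal sketch of the streaming part, assuming a recent aws-sdk-s3 (1.x) with aws-config and tokio; the bucket name, the local "archive.bin" output, and plain concatenation of object bodies are placeholders, not the actual tool's archive format:

    // Assumed Cargo deps: aws-config = "1", aws-sdk-s3 = "1",
    // tokio = { version = "1", features = ["full"] }
    use tokio::io::AsyncWriteExt;

    #[tokio::main]
    async fn main() -> Result<(), Box<dyn std::error::Error>> {
        // "audit-bucket" and "archive.bin" are placeholder names.
        let bucket = "audit-bucket";

        let config = aws_config::defaults(aws_config::BehaviorVersion::latest())
            .load()
            .await;
        let client = aws_sdk_s3::Client::new(&config);

        // One local output file; the real archive format / re-upload step is not shown.
        let mut archive = tokio::fs::File::create("archive.bin").await?;

        // Single listing page for brevity; continuation-token handling is omitted.
        let listing = client.list_objects_v2().bucket(bucket).send().await?;

        for obj in listing.contents() {
            let Some(key) = obj.key() else { continue };

            // GetObject hands back a ByteStream; turning it into an AsyncRead
            // lets tokio::io::copy move it chunk by chunk instead of buffering
            // whole objects (or the whole 10GB/day) in memory.
            let resp = client.get_object().bucket(bucket).key(key).send().await?;
            let mut body = resp.body.into_async_read();
            tokio::io::copy(&mut body, &mut archive).await?;
        }

        archive.flush().await?;
        Ok(())
    }

Copying each body straight from the ByteStream into the output keeps peak memory at roughly one buffer's worth, regardless of how many tiny objects the daily 10GB is spread across.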



