Sizes and Storage
There are 1,440 minutes in a day, and we capture at 1-minute intervals, so 1,440 images per day. We can also compile the images into an mp4 video.
- SD 576p: 0.1 MB/image (compressed 95%, gaussian blur): ~150 MB/day: 4 MB video
- SD 576p: 0.2 MB/image: ~300 MB/day: 6 MB video
- HD 720p: 0.3 MB/image: ~450 MB/day: 10 MB video
- XHD 1200p: 0.4 MB/image: ~600 MB/day: 38 MB video
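The per-day figures above are just per-image size × 1,440 images. A quick sketch of that arithmetic (sizes taken from the table; the "576p-hq" label for the 0.2 MB variant is mine):

```python
# Rough storage arithmetic for 1-minute intervals: 1440 images/day.
MINUTES_PER_DAY = 24 * 60

# Per-image sizes (MB) from the table above.
sizes_mb = {"576p": 0.1, "576p-hq": 0.2, "720p": 0.3, "1200p": 0.4}

for name, mb in sizes_mb.items():
    per_day_mb = mb * MINUTES_PER_DAY       # MB per day of still images
    per_100_days_gb = per_day_mb * 100 / 1000  # GB over a 100-day timelapse
    print(f"{name}: {per_day_mb:.0f} MB/day, {per_100_days_gb:.1f} GB / 100 days")
```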
So for a long timelapse (say 100 days) at 1200p, we're looking at ~60GB of images. S3 Glacier is $4/TB/month, normal S3 is $23/TB/month, and AWS EBS (i.e. normal EC2 storage) is $80/TB/month (!). Hence we can't keep storing all images indefinitely. Even small images (150MB/day x 100 days) come to 15GB.
Storing just videos and ‘decompiling’ images on demand? Well, 100 days @ 10MB/day (720p video) = 1 GB.
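Putting the two options side by side at the storage prices quoted above (back-of-envelope only; prices in $/TB/month as listed):

```python
# Monthly cost of 100 days of 1200p images (~60 GB) vs. 720p videos (~1 GB)
# at the per-TB prices quoted above.
prices_per_tb = {"S3 Glacier": 4, "S3 Standard": 23, "EBS": 80}

images_tb = 0.4 * 1440 * 100 / 1e6   # MB -> TB: ~0.058 TB of stills
videos_tb = 10 * 100 / 1e6           # 10 MB/day of 720p video: 0.001 TB

for tier, per_tb in prices_per_tb.items():
    print(f"{tier}: images ${images_tb * per_tb:.2f}/mo, "
          f"videos ${videos_tb * per_tb:.2f}/mo")
```

Either way the absolute dollar amounts are small at this scale; the real pressure is local disk on the capture device and the ~60x footprint difference.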
Code: This would require tmv-video-compile to accept a list of video files (instead of image files). It could then build an (in-memory) list of the images available, run the diagonal/etc filters, and produce a list of the image files required. Using code from tmv-video-decompile, it would extract each required image from its video file and write it to disk, then run ffmpeg on these as normal, deleting (or caching) the extracted images once the video is complete. This is a fair bit of work and a lot of complexity compared to just a list of files 🙁
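The decompile-then-compile flow above could be sketched roughly as below. This is illustrative only: the function names are hypothetical (not the real tmv-video-compile/tmv-video-decompile API), and it assumes each frame of interest can be addressed by its frame index within a compiled video.

```python
# Hypothetical sketch of "on-demand decompile": pull only the frames the
# filters asked for back out of compiled videos, ready for the normal
# ffmpeg compile step. Names are illustrative, not the real tmv API.
import subprocess
from pathlib import Path

def extract_frame(video: Path, frame_index: int, out_png: Path) -> None:
    """Extract a single frame from a video with ffmpeg's select filter."""
    subprocess.run(
        ["ffmpeg", "-y", "-i", str(video),
         "-vf", f"select=eq(n\\,{frame_index})",
         "-vframes", "1", str(out_png)],
        check=True,
    )

def compile_from_videos(wanted: list[tuple[Path, int]],
                        workdir: Path) -> list[Path]:
    """Extract each (video, frame_index) pair to disk; return image paths."""
    workdir.mkdir(parents=True, exist_ok=True)
    images = []
    for video, frame_index in wanted:
        out = workdir / f"{video.stem}-{frame_index:05d}.png"
        extract_frame(video, frame_index, out)
        images.append(out)
    # Feed these to ffmpeg as normal, then delete or cache them.
    return images
```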
Another option is to downsample images by deleting some older ones. For example, images older than 7 days could be thinned out to a 1-hour interval.
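A minimal sketch of that pruning rule, assuming timestamps are embedded in the filenames (the `%Y-%m-%dT%H-%M` naming scheme here is an assumption, not the real tmv layout):

```python
# Keep every image from the last `keep_days` days; for anything older,
# keep only images on the hour (i.e. thin 1-minute data to 1-hour data).
from datetime import datetime, timedelta
from pathlib import Path

def images_to_prune(image_dir: Path, now: datetime,
                    keep_days: int = 7) -> list[Path]:
    """Return the images that the downsampling rule says to delete."""
    cutoff = now - timedelta(days=keep_days)
    doomed = []
    for img in sorted(image_dir.glob("*.jpg")):
        # Assumed filename format, e.g. 2024-01-01T10-30.jpg
        stamp = datetime.strptime(img.stem, "%Y-%m-%dT%H-%M")
        if stamp < cutoff and stamp.minute != 0:
            doomed.append(img)
    return doomed
```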