Hey OP (assuming you're the author), you might be interested in this similar experiment I did about four years ago, same dataset, same target, similar goal!
Similar sequence of investigations, but using regular Postgres rather than Timescale. With my setup I got another ~3x speedup over COPY by copying binary data directly (assuming your data is already in memory).
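To make "copying binary data directly" concrete: the write itself is just a COPY ... FROM STDIN WITH (FORMAT BINARY). A rough sketch with psycopg2 (the readings table and its columns are made up, and payload stands in for whatever file-like object already holds the binary-serialised rows):

    import psycopg2

    def copy_binary(conn, payload):
        """Stream rows already serialised in Postgres' binary COPY format into the table."""
        with conn.cursor() as cur:
            cur.copy_expert(
                "COPY readings (sensor_id, temperature) FROM STDIN WITH (FORMAT BINARY)",
                payload,
            )
        conn.commit()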
Wish I'd seen this before I started, haha! I left a footnote about why I didn't try binary COPY (basically someone else found its performance disappointing), but it sounds like I should give it a try.
Yeah, I imagine it depends on where the data is coming from and what exactly it looks like (number of fields, dtypes...?). What I did was source data -> NumPy structured array [0] -> Postgres binary [1]. Bit of a pain getting it into the required shape, but if you follow the links the code should get you going (sorry, no type hints!).
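To give a flavour of the structured-array -> Postgres-binary step, here's a from-memory sketch rather than my actual code: it only handles fixed-width int4/float8 columns (NULLs, text, timestamps etc. need extra handling), and the table/field names are made up.

    import io
    import struct

    import numpy as np

    def structured_array_to_pgcopy(arr: np.ndarray) -> io.BytesIO:
        """Serialise a NumPy structured array into Postgres' binary COPY format.

        Only fixed-width int32/float64 fields are handled; each NumPy dtype must
        match the corresponding Postgres column type (int4/float8) exactly.
        """
        buf = io.BytesIO()
        buf.write(b"PGCOPY\n\xff\r\n\x00")        # 11-byte signature
        buf.write(struct.pack("!ii", 0, 0))       # flags + header-extension length

        names = arr.dtype.names
        for row in arr:
            buf.write(struct.pack("!h", len(names)))                    # field count
            for name in names:
                if arr.dtype[name] == np.int32:
                    buf.write(struct.pack("!ii", 4, int(row[name])))    # length + int4
                elif arr.dtype[name] == np.float64:
                    buf.write(struct.pack("!id", 8, float(row[name])))  # length + float8
                else:
                    raise NotImplementedError(f"no handler for dtype {arr.dtype[name]}")
        buf.write(struct.pack("!h", -1))          # end-of-data trailer
        buf.seek(0)
        return buf

    # e.g. for a table like: CREATE TABLE readings (sensor_id int4, temperature float8)
    records = np.array(
        [(1, 20.5), (2, 21.3)],
        dtype=[("sensor_id", np.int32), ("temperature", np.float64)],
    )
    payload = structured_array_to_pgcopy(records)

The resulting buffer can then be handed to the COPY ... WITH (FORMAT BINARY) call sketched above.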
Had a read through parts 1 and 2; thank you for the engaging reads! Love how you've formatted your posts with the margin notes too. And thanks for providing the function to write NumPy structured arrays to Postgres binary; I couldn't figure that out before.
https://rdrn.me/optimising-sql/