Working on a personal site/blog. I'm setting up a page for some stats from Trakt and looking at ways to automate ingestion of my Trakt history. Right now I have to manually download reports and run a task to load the data into my database.
The first, most obvious way I could think of was via an API, but I didn't see any options for that in there. I tried to craft a cURL command from the reports button's request, but that just downloaded the page data.
Is there a supported way to do this through either of the paths I mentioned?
I'd be hoping for a reload every 24 hours if possible, but I would abide by any request to pull less often, to be a polite consumer of your compute resources — hopefully no less frequently than once a week. While in active development, though, I'd need to hit the API (or whatever) a good bit until I could verify all was well.
I have other, more convoluted ways to handle it in an automated or mostly automated fashion, but I'd rather not if possible.
For instance, I could use my weekly email backup dump, but I haven't even looked at those files to know what sort of parsing would be needed. I just know I could break my email provider's ToS and use an IMAP library if absolutely necessary. Otherwise, I'll have to manually download reports and SCP (or FTP) them to my app server, into a directory that a cron job runs my load task against. Still less than ideal. RSS was another idea, but that's an even bigger mystery.
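Roughly what I had in mind for that cron fallback — a script the cron entry would run against the drop directory. The directory names and CSV layout here are pure placeholders, since I don't know the real report format:

```python
# Sketch of the cron-driven fallback: a cron entry like
#   0 4 * * * python load_reports.py
# runs this against a directory where reports have been SCP'd.
# Paths and CSV columns are assumptions, not the real report schema.
import csv
from pathlib import Path

DROP_DIR = Path("reports/incoming")   # hypothetical drop directory
DONE_DIR = Path("reports/processed")  # processed files get moved here

def load_report(path: Path) -> list[dict]:
    """Parse one report CSV into row dicts (stand-in for the DB load)."""
    with path.open(newline="") as f:
        return list(csv.DictReader(f))

def run_once() -> int:
    """Process every new report in the drop dir; return rows loaded."""
    DONE_DIR.mkdir(parents=True, exist_ok=True)
    total = 0
    for report in sorted(DROP_DIR.glob("*.csv")):
        rows = load_report(report)
        # ... insert `rows` into the database here ...
        total += len(rows)
        report.rename(DONE_DIR / report.name)  # don't reprocess next run
    return total
```

Moving files out of the drop directory after loading keeps the job idempotent, so running it more often than new reports arrive is harmless.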
I’m at your mercy, @justin.
Thanks!