recorded_files API

Want to write your own code to work with an HDHomeRun or with the HDHomeRun DVR? We are happy to help with concepts, APIs, and best practices.
demonrik
Posts: 1236
Joined: Mon May 04, 2015 10:03 am
Device ID: 10736454, 1073A35A, 1075C377

recorded_files API

Post by demonrik »

With the recording engine change in 20200521 that makes the recorded_files API default to DisplayGroupID=root, a bunch of things I was using it for no longer work :(

So first Q first...
Any chance you are going to document the different options for the recorded_files API at https://github.com/Silicondust/documentation/wiki ?
(or even update some of the others which have changed since 2016, like POST vs GET for delete, etc ;))

Now - nothing is insurmountable..
I can get the list of series, then process each one for its list of episodes, then sort based on my needs (rough sketch below).
I'm just trying to decide if it's worth the effort..
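
For reference, this is roughly what I mean (PHP, since that's what DVRUI is built on). The engine address is just my own from above, and the recorded_files.json path plus the EpisodesURL / StartTime fields are what I see in the engine's output today rather than anything documented, so treat them as assumptions:

[code]
<?php
// Sketch only: rebuild the old "everything, newest first" view client-side.
// Engine address and field names (EpisodesURL, StartTime) are assumptions
// based on my own engine's JSON, not documented behaviour.
$engine = 'http://192.168.0.21:59090';

$series = json_decode(
    file_get_contents("$engine/recorded_files.json?DisplayGroupID=root"),
    true
);

$episodes = [];
foreach ($series ?? [] as $s) {
    if (empty($s['EpisodesURL'])) {
        continue;
    }
    $eps = json_decode(file_get_contents($s['EpisodesURL']), true);
    if (is_array($eps)) {
        $episodes = array_merge($episodes, $eps);
    }
}

// Newest first, then take however many the dashboard wants to show.
usort($episodes, function ($a, $b) {
    return ($b['StartTime'] ?? 0) <=> ($a['StartTime'] ?? 0);
});
$latest = array_slice($episodes, 0, 10);
[/code]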

The main reason I keep updating the NAS packages and the HDHR_DVRUI is that they're relatively simple.
Now this change is kinda forcing me to have a database behind the scenes, which defeats the purpose.. There is already a well-invested one running on my NAS - the record engine.
Honestly, it would be better if you could augment the recorded_files API to do the following:
- provide a latest recorded episodes list (ideally we can specify how many we want (within some reasonable limits)), e.g. recorded_files?DisplayGroupID=latest&count=10
- provide ability to sort the output by date, alphabetical, or unwatched (if possible), e.g. recorded_files?DisplayGroupID=root&sort=date
- I'm sure I could think of lots of creative uses beyond that.

As I said - it just makes more sense to have the record engine do this than for me to have to create another database, update a snapshot, maintain caching and synchronization, etc.
(and since I'm in bitching mode (sorry :( ).. TBH it would be good to make the record engine's HTTP interface (http://192.168.0.21:59090/recorded_files.html) better and save me having to do any of this anyway)

But if you could at a minimum just provide a recorded_files API that works like it did before, that would be good enough for now.

signcarver
Expert
Posts: 8920
Joined: Wed Jan 24, 2007 1:04 am
Device ID: 131B34B7 13231F92 1070A18E 1073ED6F 15300C36

Re: recorded_files API

Post by signcarver »

The new "engine" has been around for a couple of months (on android, but only reported about a month ago when having issues with other software) so you had that long to "fix" your software :D... though I strongly agree they should not have done so since their own software was using "root" already for compatibility with older engines.

Though you probably already did, you may wish to look at viewtopic.php?f=125&t=73508, and at what I believe nick had on his TODO list regarding UpdateID.

By the way, I do like your suggestions about the engine providing such options.

demonrik
Posts: 1236
Joined: Mon May 04, 2015 10:03 am
Device ID: 10736454, 1073A35A, 1075C377

Re: recorded_files API

Post by demonrik »

You know.. I did see that, but I was in the middle of some critical work thing, so it never clicked that it was more than Android :(

A fix is not the thing here.. it's potentially a big refactor. The upcoming page was where we first added some housekeeping, because it meant potentially hundreds of queries each time someone clicked a tab, and rather than hammer SD's servers we thought it nice to cache.

Now I will have to implement that for everything, and the hack done for upcoming is just that - a hack - so I need to decide whether I propagate the hack, redesign around a proper DB backend, or just abandon it.

And honestly.. I am leaning very much toward the latter.


Edit: I don't mean to come off bitchy above.. it's just that I am really busy with the day job, and thus disappointed that I likely don't have the time/motivation to update anytime soon. And I likely wouldn't have even if SD had put this up in lights months ago.

djp952
Posts: 1209
Joined: Wed Oct 01, 2008 8:46 pm
Device ID: 131EB7F7;131ED0E0
Location: Elkridge, MD USA

Re: recorded_files API

Post by djp952 »

If you don't mind me chiming in, while it definitely hurt a bit to refactor everything to deal with this for my application, I do like the addition of the UpdateID. The old recorded_files.json was getting to be a lot of data to parse for many users, myself included, and reparsing 1000+ recordings was time-consuming. With the UpdateID I was able to greatly improve performance. Of course, my application already has its own database where all this stuff was processed, so "hurt a bit" is of course a relative term :)

A couple of gotchas to watch out for if you refactor:

- A change in "Resume" for a recording doesn't trigger a change in the series' UpdateID. If you need to watch "Resume" you'll still need to get all of them (perhaps only for now; I asked about this here in the dev forum a while back, and maybe it can be adjusted).

- Be careful to watch for duplicate seriesid values coming back from DisplayGroupID=root. This burned me twice. The first is that there is/was a bug in RECORD where, over time, you may get fully duplicated entries coming back from this query. It's fixed easily enough with a DISTINCT-type selection over the JSON nodes (rough sketch after this list), but be advised that it can happen. The second is a bit more nefarious: watch out for duplicate seriesid values coming back on otherwise different nodes. If you run into this, there will be no way to get the recordings for all of them since the EpisodesURL will be the same; the mistake I made was creating a composite key that included seriesid, so *boom*. The latter case has only been reported twice now, and one of them was because the user had manually imported recordings into the system using a tool of some kind to generate the metadata. The generated metadata didn't include a seriesid, so RECORD made them all "UNKNOWN", and DisplayGroupID=root sent three entries with that same seriesid across and nuked my query :)
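
For what it's worth, the DISTINCT-type selection is nothing clever. In PHP terms (just to match demonrik's stack - my own app isn't PHP), it's roughly the sketch below; the SeriesID field name/casing is an assumption about what your engine returns:

[code]
<?php
// Keep only the first entry seen per seriesid, dropping the fully duplicated
// nodes that can come back from DisplayGroupID=root. The 'SeriesID' key is an
// assumption - use whatever name/casing your engine actually emits.
function dedupe_by_seriesid(array $series): array
{
    $seen = [];
    $result = [];
    foreach ($series as $entry) {
        $id = $entry['SeriesID'] ?? null;
        if ($id === null || isset($seen[$id])) {
            continue; // duplicate (or no id at all) - skip it
        }
        $seen[$id] = true;
        $result[] = $entry;
    }
    return $result;
}
[/code]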

Good luck with whatever you opt for; if there is anything I can do to assist, please let me know. I too am still very disappointed with this choice since it wasn't necessary - as has been stated, DisplayGroupID=root was already there, and it makes no sense to me why they would opt to break the existing API by duplicating something it already did (and still does). It's a head-scratcher!

nickk
Silicondust
Posts: 15908
Joined: Tue Jan 13, 2004 9:39 am

Re: recorded_files API

Post by nickk »

Fixed the issue where the UpdateID wasn't updated when the Resume position was set... it will be in the next release.

Nick

demonrik
Posts: 1236
Joined: Mon May 04, 2015 10:03 am
Device ID: 10736454, 1073A35A, 1075C377

Re: recorded_files API

Post by demonrik »

djp952 wrote:
Tue May 26, 2020 7:48 pm
If you don't mind me chiming in,
Always welcome :)
while it definitely hurt a bit to refactor everything to deal with this for my application, I do like the addition of the UpdateID. The old recorded_files.json was getting to be a lot of data to parse for many users, myself included, and reparsing 1000+ recordings was time-consuming. With the UpdateID I was able to greatly improve performance. Of course, my application already has its own database where all this stuff was processed, so "hurt a bit" is of course a relative term :)
That database is the key.

For DVRUI - I will do the update..
It will be a significant change, as I am going to change the whole workflow and abandon the NAS packages, and the standalone install will no longer be supported.

For the NAS DVR Managers
I'm going to peel it way back and abandon the UI part. Installing a DB is just going to blow up the support issues, so I can streamline the whole thing by doing just an installer.
I already have a docker instance which updates the engine every time it's started.. maybe I can build the packages around that and leave the conf file somewhere easily found so people just have to edit it themselves..
I too am still very disappointed with this choice since it wasn't necessary - as has been stated, DisplayGroupID=root was already there, and it makes no sense to me why they would opt to break the existing API by duplicating something it already did (and still does). It's a head-scratcher!
Exactly - but it's SD's API, and we'll just have to live with it :(

nickk
Silicondust
Posts: 15908
Joined: Tue Jan 13, 2004 9:39 am

Re: recorded_files API

Post by nickk »

Hi demonrik,

Can you please summarize the usage model you are using for recorded_files?

The expectation is that apps will display top-level-recorded (root) information, and when the user selects a series the app will display episode information.

You shouldn't be databasing anything - the idea of the API model is to avoid secondary databases.

Nick

demonrik
Posts: 1236
Joined: Mon May 04, 2015 10:03 am
Device ID: 10736454, 1073A35A, 1075C377

Re: recorded_files API

Post by demonrik »

nickk wrote:
Sat May 30, 2020 2:34 pm
Hi demonrik,

Can you please summarize the usage model you are using for recorded_files?

The expectation is that apps will display top-level-recorded (root) information, and when the user selects a series the app will display episode information.

You shouldn't be databasing anything - the idea of the API model is to avoid secondary databases.

Nick
Hi Nick
First, consider that PHP isn't persistent. Every time the browser refreshes the page, to create the data I can either let the browser just use the cached HTML or actually fetch the data in case something has changed.. so I do the latter. This means that every time someone looks at the recordings page I have to rebuild the information. To get around this in DVRUI, we created a simple caching mechanism which serializes the JSON to a text file on the server and then reads that file instead of issuing the request to your servers.. This was to avoid the situation on our upcoming page, where we trawl through each rule asking for episodes - we didn't want to bombard your servers.
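
The cache itself is nothing fancy - roughly the sketch below. The temp-file location, keying the cache file off the URL, and the 300-second lifetime are just arbitrary choices DVRUI happens to make:

[code]
<?php
// File-backed cache around a JSON fetch, so a page refresh reuses the last
// response instead of hitting SD's servers again. Cache path and lifetime
// are arbitrary; adjust to taste.
function fetch_json_cached(string $url, int $max_age = 300): ?array
{
    $cache = sys_get_temp_dir() . '/dvrui_' . md5($url) . '.json';

    if (is_file($cache) && (time() - filemtime($cache)) < $max_age) {
        return json_decode(file_get_contents($cache), true);
    }

    $body = file_get_contents($url);
    if ($body === false) {
        return null; // fetch failed; caller decides what to do
    }
    file_put_contents($cache, $body);
    return json_decode($body, true);
}
[/code]
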
I can use this caching in other places in DVRUI; however, the initial dashboard load is going to get slower because I now have to do the same thing for the 'recent recorded' list. I need to load all the recordings, so it's: get the list of series with DisplayGroupID=root, remove duplicates, then issue a request for each series' list of episodes to build the full dataset, and then sort on time. So depending on the number of series, we have a bunch of new JSON requests to issue. The good news is that it's local network traffic, so I'm not sure I will need caching here, but I am concerned as the number of unique items in the root increases.
UpdateID might help me - I need to do some experiments to see.. I've not seen much info, just that it changes if something has changed. If it were a timestamp that would be useful, but so far that doesn't seem to be the case.. At the moment I suspect I'll need to differentiate between the series JSON and the others, and then read the cached version AND, on timeout, do a JSON request and compare to see if anything actually changed. I will have to get my head around it.
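
If UpdateID really is just an opaque per-series value that changes whenever that series' recordings change (an assumption on my part until I experiment), then the check could be as simple as comparing against whatever we cached last time and only re-fetching the episode lists that moved:

[code]
<?php
// Sketch: given the DisplayGroupID=root entries and a map of
// SeriesID => last-seen UpdateID, return only the series whose episode
// lists need to be re-fetched. Field names are assumptions.
function series_needing_refresh(array $root_entries, array $known): array
{
    $stale = [];
    foreach ($root_entries as $entry) {
        $id = $entry['SeriesID'] ?? null;
        if ($id === null) {
            continue;
        }
        $update = $entry['UpdateID'] ?? null;
        if (!isset($known[$id]) || $known[$id] !== $update) {
            $stale[] = $entry; // new series, or its UpdateID changed
        }
    }
    return $stale;
}
[/code]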

For the DVR Manager I didn't have to do anything - recorded_files simply gave me everything sorted from latest to oldest, so all I needed to do was format the output HTML. The goal was to keep it simple and not put overly complicated code in there (all of that moved out to DVRUI). Now I'll have to walk the series, build the dataset, and sort by time.
It's not hard to implement, it's just time..
And considering most of the support issues with the NAS DVR Managers are PHP/web permissions and config, I'm just loath to invest that time and would rather strip it out and reduce the support surface to just the installer.
Honestly - it would be so much easier if you could manage the conf file through the web interface you have built into the record engine; then you wouldn't need my simple UI :)
