How does HTML-based HATEOAS apply in applications which also want to expose an external API?

Jan 31, 2025 - 01:51

I recently read through Hypermedia Systems, and found its arguments incredibly compelling. The book brought a lot of clarity and structure to ideas and frustrations that have been bouncing around in my own head for quite some time, and I would like to apply the principles discussed in that book (and most likely the HTMX library as well) in my next project.

However, the book and the HTMX library focus on the internal API of the system. In the real world, applications also need external APIs in order to enable interoperability with other applications. Indeed, I'm sure that is what sparked the whole "let's do everything with JavaScript and a JSON API" craze that we find ourselves in today. After all, if you have to implement a JSON API anyway, why not use that for the front-end as well? I don't want to implement all the same basic CRUD logic twice, nor do I want two entirely separate APIs that implement essentially the same logic, but merely present the data in different ways.

Does one merely expect external consumers to work with the more "human-oriented" HTML? This seems like a bad idea to me--it would turn any API consumer into essentially a web scraper, and those are not exactly known for their resiliency and robustness.

It occurred to me that it could be possible to have both APIs live at the same URLs, distinguishing the mode of operation by way of the HTTP Accept header. The same logic would be executed either way, but the results would be returned in a different format.
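
To make the idea concrete, here is a rough sketch of what I have in mind, in plain Python with no framework. Everything here is hypothetical (the `render_contact` handler, the `/contacts/...` routes, and the naive `best_match` parser are all made up for illustration, and a real implementation would honor `q`-values when parsing `Accept`):

```python
import json

def best_match(accept_header: str) -> str:
    """Naive Accept parsing: the first listed type we support wins.
    A real implementation would sort by q-value per RFC 9110."""
    for part in accept_header.split(","):
        mime = part.split(";")[0].strip()
        if mime in ("text/html", "application/json"):
            return mime
    return "application/json"

def render_contact(contact: dict, accept_header: str) -> str:
    """One handler, one piece of logic, two representations of the
    same resource chosen by content negotiation."""
    if best_match(accept_header) == "text/html":
        # Hypermedia representation: data plus controls (an edit link).
        return (
            f"<div><h1>{contact['name']}</h1>"
            f"<a href=\"/contacts/{contact['id']}/edit\">Edit</a></div>"
        )
    # Plain data representation for external API consumers.
    return json.dumps(contact)
```

The appeal, as I see it, is that the domain logic runs once and only the final serialization step branches.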

Of course, the format the data is presented in isn't the only difference between a normal "REST" API and a hypermedia-driven REST API, merely the most noticeable and apparent one. One difference in particular that stands out to me is that in a JSON API, a GET request to a resource just, well, gets that data. However, in a hypermedia-driven approach, we need to understand why the user is fetching the resource in order to supply the appropriate hypermedia controls. Do we intend to edit the object (and thus need to display it as an HTML form) or are we merely reading the data (in which case we would need some basic informational markup and possibly an "edit" hyperlink)?
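
One way I could imagine resolving this (again a rough sketch, with made-up function names and routes) is to treat the two intents as two distinct resources, e.g. `/contacts/1` for reading and `/contacts/1/edit` for editing, with both handlers sharing the same underlying data and differing only in which representation and controls they emit:

```python
def view_contact_html(contact: dict) -> str:
    """Read-only representation: informational markup plus an
    'edit' hypermedia control pointing at the editing resource."""
    return (
        f"<div><p>Name: {contact['name']}</p>"
        f"<a href=\"/contacts/{contact['id']}/edit\">Edit</a></div>"
    )

def edit_contact_html(contact: dict) -> str:
    """Editing representation: the same data, rendered as a form
    whose submission updates the underlying resource."""
    return (
        f"<form method=\"post\" action=\"/contacts/{contact['id']}\">"
        f"<input name=\"name\" value=\"{contact['name']}\">"
        f"<button>Save</button></form>"
    )
```

But notice that in this layout, only the HTML side needs the `/edit` resource at all; the JSON API has no equivalent, which is part of what makes me unsure the two APIs can share one set of URLs cleanly.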

Thus, my questions are:

  • In a well-architected application which exposes both an external API and an HTML-based internal API that abides by HATEOAS principles, do the APIs live together, or separately?
  • If separately, what is the best way to keep logic between the two from being highly repetitive?
  • If together, how would the issues I mentioned above typically be resolved?