A more advanced app, and what I would personally like to see, is an example and architectural discussion of design patterns for handling server requests that require more time or resources than is appropriate for a single-threaded server (e.g. requests that hit a database server). From a high-level perspective, my current thinking is:
I guess you mean any protocol that does not have an async library, or a resource-intensive task such as image processing? If there is a specific task or protocol you'd like to see, please do mention it. Even if it does not fit in the chat application architecture, we can always use it as an idea for another Servertech app.
- Handle fast requests in the main single-threaded Boost.Asio event loop (assuming they don't access resources locked by the next bullet point).
- Handle longer requests by delegating them to a separate thread pool that triggers a callback when done, without blocking the single-threaded event loop.
- The threads in the separate thread pool do "traditional locking" via mutexes, read/write locks, etc.
Are there more modern approaches/techniques?
I think this goes the right way, but I'd try to encapsulate it in a class that exposes an async interface. So let's say your troublesome call is `get_db_customer`, a third-party, synchronous function that may block for a long time. I'd go for something like:
```cpp
class db_client
{
    // Configure this with the number of threads you want
    boost::asio::thread_pool pool_{4};
public:
    customer get_customer(boost::asio::yield_context yield)
    {
        // A channel is a structure to communicate between coroutines.
        // concurrent_channel is thread-safe, so pool threads can signal the coroutine
        boost::asio::experimental::concurrent_channel<void(boost::system::error_code, customer)>
            channel{yield.get_executor(), 1};

        // Run the blocking call in the pool and push the result into the channel
        boost::asio::post(pool_, [&channel] {
            channel.async_send(boost::system::error_code(), get_db_customer(),
                               boost::asio::detached);
        });

        // Suspend the coroutine (not the event loop) until the result arrives
        return channel.async_receive(yield);
    }
};
```