Hello all,

I am using boost::async() to create futures from tasks, in order to wrap behavior that may or may not be synchronous. This behavior is implemented behind an interface and ranges from small, fast operations to long-running, blocking ones. Everything returns a future as the result type, so I can hide the ugly truth.

For the smaller operations I'd like to use launch::deferred to avoid threading altogether and simply execute the task when the user calls get() on the future:

    boost::future<bool> ret = boost::async(boost::launch::deferred, [this]() -> bool {
        // do some small, CPU-only work
        return result;
    });

But when I get the result:

    const bool result = ret.get();

...the application crashes. Debugging this led me to boost/thread/future.hpp:9373:

    } else if (underlying_cast<int>(policy) & int(launch::deferred)) {
        std::terminate();
        //BOOST_THREAD_FUTURE<R> ret;
        //return ::boost::move(ret);
        //  return boost::detail::make_future_deferred_shared_state<Rp>(
        //      BF(
        //          thread_detail::decay_copy(boost::forward<F>(f))
        //      )
        //  );
    } else {

To me this looks like the application will always std::terminate() when launch::deferred is chosen. Why is that? When I use launch::any it seems to work, but why does the library offer a parameter that always yields unwanted results? What is the reason behind it? Can I somehow get launch::deferred to behave as documented (no threading, just executing the task synchronously when get() is called)?

Platform is x64 Windows 10, MSVC 14.1.

Cheers,
Stephan
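P.S. For comparison, here is the behavior I am after, expressed with the standard library's std::async and std::launch::deferred (which does defer execution to the get() call, in the calling thread, with no new thread spawned). This is just a minimal sketch to illustrate the semantics I expected from the Boost variant:

    #include <cassert>
    #include <future>
    #include <iostream>
    #include <thread>

    int main() {
        const std::thread::id caller_id = std::this_thread::get_id();
        std::thread::id run_id;

        // With std::launch::deferred the lambda is not started eagerly;
        // it runs synchronously inside the thread that calls get().
        std::future<bool> ret = std::async(std::launch::deferred, [&]() -> bool {
            run_id = std::this_thread::get_id();
            return true; // stand-in for the real small, CPU-only result
        });

        const bool result = ret.get();
        assert(result);
        assert(run_id == caller_id); // ran in the calling thread, no threading involved
        std::cout << "deferred task executed in caller thread\n";
        return 0;
    }

This compiles and runs fine with MSVC 14.1, which is what makes the std::terminate() in the Boost code path so surprising to me.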