C++ Async & Futures Quiz

C++

40 in-depth questions covering C++ async programming with std::async, std::future, std::promise, task-based concurrency, and custom thread pool patterns — with code examples throughout to solidify understanding.

40 Questions
~80 minutes
1

Question 1

What is std::async in C++?
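
For reference, a minimal sketch of the idea (the lambda and values here are purely illustrative):

cpp
#include <future>
#include <iostream>

int main() {
    // Run the callable asynchronously; the future represents its eventual result
    std::future<int> f = std::async([] { return 6 * 7; });

    std::cout << f.get() << '\n'; // Blocks if needed, then prints 42
}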

A
A function that runs a callable object asynchronously and returns a future representing the result
B
A function that runs synchronously
C
A function that creates threads
D
A function that blocks execution
2

Question 2

What is the difference between std::launch::async and std::launch::deferred?

cpp
auto future1 = std::async(std::launch::async, expensive_function);
// Runs immediately in new thread

auto future2 = std::async(std::launch::deferred, expensive_function);
// Defers execution until future.get() is called

future1.get(); // May block if not ready
future2.get(); // Executes synchronously here
A
async launches function immediately in separate thread, deferred delays execution until result is requested causing synchronous execution
B
They are identical launch policies
C
deferred launches immediately
D
async defers execution
3

Question 3

What is std::future and how does it work?

cpp
#include <future>

auto future = std::async(std::launch::async, []() {
    std::this_thread::sleep_for(std::chrono::seconds(1));
    return 42;
});

// Do other work...

int result = future.get(); // Blocks until result is ready
A
A class that represents a deferred result from asynchronous operation, providing get() method to retrieve result and wait() methods for synchronization
B
A class that executes functions synchronously
C
A class that creates threads
D
A class that blocks all execution
4

Question 4

What is the difference between future.get() and future.wait()?

cpp
std::future<int> future = std::async(expensive_calculation);

// wait() - just waits, doesn't get result
future.wait(); // Blocks until computation finishes
// future is still valid, can call get() later

// get() - waits and retrieves result
int result = future.get(); // Blocks and gets result
// future is now invalid, cannot use again
A
get() waits for completion and retrieves result invalidating future, wait() only waits without retrieving result leaving future valid
B
They are identical methods
C
wait() retrieves the result
D
get() only waits without retrieving
5

Question 5

What is std::promise and how does it relate to std::future?

cpp
#include <future>

std::promise<int> promise;
std::future<int> future = promise.get_future();

void producer() {
    int result = compute_value();
    promise.set_value(result); // Fulfill promise
}

void consumer() {
    int value = future.get(); // Get fulfilled value
}
A
promise is producer that sets result or exception, future is consumer that retrieves result, connected through get_future() method
B
They are identical classes
C
promise consumes results
D
future produces results
6

Question 6

What is future.wait_for() and when would you use it?

cpp
std::future<int> future = std::async(long_running_task);

auto status = future.wait_for(std::chrono::milliseconds(100));

if (status == std::future_status::ready) {
    int result = future.get(); // Result is ready
} else if (status == std::future_status::timeout) {
    // Timeout occurred, do something else
    continue_processing();
} else {
    // Future was deferred
}
A
Method that waits for future result with timeout, returning status indicating if result is ready, timeout occurred, or operation was deferred
B
Method that waits indefinitely
C
Method that never waits
D
Method that cancels the operation
7

Question 7

What is task-based concurrency?
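
A small sketch of the task-based style, splitting a sum into two tasks and leaving scheduling decisions to the runtime (the data and split point are illustrative):

cpp
#include <future>
#include <numeric>
#include <vector>

int main() {
    std::vector<int> data(1000000, 1);
    auto mid = data.begin() + data.size() / 2;

    // Describe the work as tasks; the runtime decides when and where they run
    auto lower = std::async([&] { return std::accumulate(data.begin(), mid, 0); });
    auto upper = std::async([&] { return std::accumulate(mid, data.end(), 0); });

    int total = lower.get() + upper.get(); // 1000000
    (void)total;
}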

A
Programming model where work is organized as tasks that can be executed asynchronously, with runtime deciding thread assignment and execution timing
B
Programming with explicit threads
C
Programming that blocks execution
D
Programming without concurrency
8

Question 8

What is the difference between std::async with default launch policy and explicit std::launch::async?

cpp
// Default policy (implementation defined)
auto f1 = std::async(compute);

// Explicit async
auto f2 = std::async(std::launch::async, compute);

// Explicit deferred
auto f3 = std::async(std::launch::deferred, compute);

// Both policies
auto f4 = std::async(std::launch::async | std::launch::deferred, compute);
A
Default policy allows implementation to choose between async and deferred execution, explicit policies force specific execution behavior
B
They are identical
C
Explicit policy allows implementation choice
D
Default policy forces async execution
9

Question 9

What is future chaining or continuation?

cpp
auto future1 = std::async([]() { return 42; });

auto future2 = std::async([f = std::move(future1)]() mutable {
    int value = f.get(); // Get first result
    return value * 2;     // Process it
});

// Or using then() concept (not in standard yet)
// auto future3 = future1.then([](int x) { return x * 2; });
A
Pattern where completion of one async operation triggers another, creating chain of dependent computations that execute sequentially while allowing concurrency between chains
B
Running futures in parallel
C
Cancelling futures
D
Blocking all futures
10

Question 10

What is the purpose of std::shared_future?

cpp
#include <future>

std::promise<int> promise;
std::shared_future<int> shared_future = promise.get_future();

// Multiple consumers can copy the same shared_future
std::shared_future<int> future1 = shared_future;
std::shared_future<int> future2 = shared_future;

promise.set_value(42);

int result1 = future1.get(); // OK
int result2 = future2.get(); // Also OK - shared_future allows multiple gets
A
Future type that allows multiple consumers to access the same async result, unlike std::future which can only be consumed once
B
Future that cannot be shared
C
Future that blocks all access
D
Future that destroys results
11

Question 11

What is a thread pool and why use it with async programming?

cpp
class ThreadPool {
    std::vector<std::thread> workers;
    std::queue<std::function<void()>> tasks;
    std::mutex queue_mutex;
    std::condition_variable cv;
    bool stop = false;
    
public:
    explicit ThreadPool(size_t n) {
        for (size_t i = 0; i < n; ++i)
            workers.emplace_back([this] {
                for (;;) {
                    std::function<void()> task;
                    {
                        std::unique_lock<std::mutex> lock(queue_mutex);
                        cv.wait(lock, [this] { return stop || !tasks.empty(); });
                        if (stop && tasks.empty()) return;
                        task = std::move(tasks.front());
                        tasks.pop();
                    }
                    task(); // Run the task outside the lock
                }
            });
    }
    
    template<class F> void enqueue(F&& f) {
        std::unique_lock<std::mutex> lock(queue_mutex);
        tasks.emplace(std::forward<F>(f));
        cv.notify_one();
    }
    
    ~ThreadPool() {
        {
            std::unique_lock<std::mutex> lock(queue_mutex);
            stop = true; // Set under the lock so waiting workers see it
        }
        cv.notify_all();
        for (auto& worker : workers) worker.join();
    }
};
A
Pool of reusable threads that execute queued tasks, avoiding thread creation overhead when launching many small async operations
B
Single thread for all operations
C
Pool that creates new threads for each task
D
Pool that destroys threads immediately
12

Question 12

What is the difference between packaged_task and promise?

cpp
std::packaged_task<int()> task([]() { return 42; });
std::future<int> future = task.get_future();

std::thread t(std::move(task)); // Execute in thread
t.join();

int result = future.get(); // Get result

// vs promise:
std::promise<int> promise;
std::future<int> future2 = promise.get_future();
promise.set_value(42); // Manual fulfillment
A
packaged_task wraps callable object and provides future for its result, promise allows manual result/exception setting without wrapping callable
B
They are identical classes
C
packaged_task allows manual setting
D
promise wraps callables
13

Question 13

What is async exception handling?

cpp
auto future = std::async([]() {
    try {
        risky_operation();
        return 42;
    } catch (const std::exception& e) {
        throw; // Exception propagates to future
    }
});

try {
    int result = future.get(); // May throw
} catch (const std::exception& e) {
    handle_error(e);
}
A
Exceptions thrown in async functions are captured and rethrown when calling future.get(), enabling proper error propagation across thread boundaries
B
Exceptions in async functions are ignored
C
Exceptions prevent async execution
D
Exceptions are handled automatically
14

Question 14

What is work stealing in thread pools?
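
A simplified, mutex-based sketch of the idea; production pools typically use lock-free deques, and `WorkStealingPool` and its members are illustrative names:

cpp
#include <atomic>
#include <deque>
#include <functional>
#include <mutex>
#include <thread>
#include <vector>

class WorkStealingPool {
    struct Worker {
        std::deque<std::function<void()>> queue;
        std::mutex mutex;
    };
    std::vector<Worker> workers;
    std::vector<std::thread> threads;
    std::atomic<bool> done{false};

    bool try_pop(std::size_t self, std::function<void()>& task) {
        {   // Prefer our own queue (take from the front)
            std::lock_guard<std::mutex> lock(workers[self].mutex);
            if (!workers[self].queue.empty()) {
                task = std::move(workers[self].queue.front());
                workers[self].queue.pop_front();
                return true;
            }
        }
        // Otherwise steal from the back of another worker's queue
        for (std::size_t i = 0; i < workers.size(); ++i) {
            if (i == self) continue;
            std::lock_guard<std::mutex> lock(workers[i].mutex);
            if (!workers[i].queue.empty()) {
                task = std::move(workers[i].queue.back());
                workers[i].queue.pop_back();
                return true;
            }
        }
        return false;
    }

public:
    explicit WorkStealingPool(std::size_t n) : workers(n) {
        for (std::size_t i = 0; i < n; ++i)
            threads.emplace_back([this, i] {
                std::function<void()> task;
                while (!done) {
                    if (try_pop(i, task)) task();
                    else std::this_thread::yield(); // Nothing to run or steal right now
                }
            });
    }

    void submit(std::size_t worker, std::function<void()> task) {
        std::lock_guard<std::mutex> lock(workers[worker].mutex);
        workers[worker].queue.push_back(std::move(task));
    }

    ~WorkStealingPool() {
        done = true; // Note: tasks still queued at shutdown are dropped in this sketch
        for (auto& t : threads) t.join();
    }
};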

A
Optimization where idle threads steal tasks from busy threads' queues, improving load balancing and reducing contention on central task queue
B
Threads taking work from other threads without permission
C
Threads refusing to do work
D
Threads creating more work
15

Question 15

What is the difference between fire-and-forget tasks and result-returning tasks?

cpp
// Fire-and-forget: no result needed
// (Caveat: the discarded future returned by std::async blocks in its
//  destructor, so this statement actually waits; a detached thread or a
//  thread pool is the usual tool for true fire-and-forget work.)
std::async(std::launch::async, []() {
    log_message("Background task");
    // No return value
});

// Result-returning: need the result
auto future = std::async(std::launch::async, []() {
    return compute_expensive_result();
});

int result = future.get(); // Wait for result
A
Fire-and-forget tasks execute asynchronously without waiting for completion, result-returning tasks provide futures for retrieving computation results
B
They are identical task types
C
Fire-and-forget tasks return results
D
Result-returning tasks don't wait
16

Question 16

What is async cleanup and resource management?

cpp
class AsyncResource {
    std::future<void> cleanup_future;
    
public:
    AsyncResource() {
        // Start async cleanup of previous instance
        cleanup_future = std::async(std::launch::async, []() {
            cleanup_old_resources();
        });
    }
    
    ~AsyncResource() {
        if (cleanup_future.valid()) {
            cleanup_future.wait(); // Ensure cleanup completes
        }
    }
};
A
Using async operations for resource cleanup while ensuring proper synchronization in destructors to prevent resource leaks during shutdown
B
Ignoring resource cleanup
C
Blocking all cleanup operations
D
Creating resource leaks
17

Question 17

What is the difference between synchronous and asynchronous programming models?
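
A sketch of the contrast; `slow_compute` and `do_other_work` are placeholder helpers:

cpp
#include <chrono>
#include <future>
#include <iostream>
#include <thread>

// Placeholder helpers for the illustration
int slow_compute() {
    std::this_thread::sleep_for(std::chrono::milliseconds(200));
    return 42;
}
void do_other_work() { std::cout << "doing other work\n"; }

void synchronous_caller() {
    int result = slow_compute(); // Blocks here until the value is ready
    std::cout << result << '\n';
}

void asynchronous_caller() {
    auto future = std::async(std::launch::async, slow_compute);
    do_other_work();                   // Keeps running while slow_compute executes
    std::cout << future.get() << '\n'; // Synchronize only when the value is needed
}

int main() {
    synchronous_caller();
    asynchronous_caller();
}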

A
Synchronous blocks execution until operation completes, asynchronous allows continuation of execution while operation runs in background with result retrieval through futures
B
They are identical programming models
C
Asynchronous blocks execution
D
Synchronous allows background execution
18

Question 18

What is future unwrapping or nested futures?

cpp
auto nested_future = std::async([]() {
    // Return another future
    return std::async(std::launch::async, []() {
        return 42;
    });
});

// Without unwrapping: future<future<int>>
// With unwrapping (conceptual): future<int>

// Manual unwrapping (futures are move-only, so unwrap in place):
auto inner_future = nested_future.get(); // Get the inner future
int result = inner_future.get();         // Get the final result
A
Situation where async function returns another future, requiring manual unwrapping to access final result through chained get() calls
B
Futures that cannot be accessed
C
Futures that destroy results
D
Futures that block indefinitely
19

Question 19

What is async cancellation and how is it implemented?
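
A minimal sketch of cooperative cancellation using an atomic flag (C++20's std::jthread and std::stop_token offer a standard alternative); the task and timings are illustrative:

cpp
#include <atomic>
#include <chrono>
#include <future>
#include <thread>

std::atomic<bool> cancel_requested{false}; // Shared cancellation flag

void long_task() {
    for (int step = 0; step < 1000; ++step) {
        if (cancel_requested.load()) return; // Check the flag between work units
        std::this_thread::sleep_for(std::chrono::milliseconds(10)); // One unit of work
    }
}

int main() {
    auto f = std::async(std::launch::async, long_task);
    std::this_thread::sleep_for(std::chrono::milliseconds(50));
    cancel_requested = true; // Request cancellation
    f.wait();                // Task notices the flag and returns early
}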

A
Cooperative cancellation where async operations periodically check cancellation flags, allowing graceful termination rather than forceful stopping
B
Forceful termination of async operations
C
Prevention of async operation cancellation
D
Automatic cancellation without checking
20

Question 20

What is the difference between std::async and std::thread for async programming?

cpp
auto future = std::async(std::launch::async, func, arg1, arg2);
// Higher-level: manages thread, provides future, exception handling

std::thread t(func, arg1, arg2);
t.join(); // Lower-level: manual thread management, no built-in result handling
A
async provides higher-level abstraction with automatic thread management and result retrieval through futures, thread requires manual lifecycle management
B
They are identical for async programming
C
thread provides higher-level abstraction
D
async requires manual thread management
21

Question 21

What is async composition or combining multiple futures?

cpp
auto f1 = std::async([]() { return 10; });
auto f2 = std::async([]() { return 20; });

// Manual composition
auto combined = std::async([f1 = std::move(f1), f2 = std::move(f2)]() mutable {
    return f1.get() + f2.get(); // Wait for both and combine
});

int result = combined.get(); // 30
A
Pattern of combining multiple async operations into single operation that waits for all dependencies and produces combined result
B
Running futures sequentially
C
Cancelling futures
D
Ignoring future results
22

Question 22

What is the purpose of std::future_status?
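
A small sketch showing where each enumerator comes from:

cpp
#include <chrono>
#include <future>

int main() {
    auto lazy  = std::async(std::launch::deferred, [] { return 1; });
    auto eager = std::async(std::launch::async,    [] { return 2; });

    // wait_for() reports one of three states
    auto s1 = lazy.wait_for(std::chrono::seconds(0));  // std::future_status::deferred
    auto s2 = eager.wait_for(std::chrono::seconds(1)); // ready, or timeout if still running
    (void)s1; (void)s2;
}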

A
Enumeration returned by future wait functions indicating whether async operation completed, timed out, or was deferred
B
Status of thread execution
C
Status of program termination
D
Status of memory allocation
23

Question 23

What is async deadlock and how to avoid it?

cpp
// Potential deadlock
std::mutex mtx;

auto future = std::async(std::launch::async, [&]() {
    std::lock_guard<std::mutex> lock(mtx);
    return compute();
});

// In same thread:
std::lock_guard<std::mutex> lock(mtx); // Deadlock!
int result = future.get(); // Waits for async function that needs same lock
A
Deadlock occurring when async operation and waiting code compete for same resources, avoided by ensuring async operations don't depend on locks held by waiting thread
B
Deadlock that cannot be avoided
C
Deadlock caused by too many threads
D
Deadlock that is automatically resolved
24

Question 24

What is the difference between eager and lazy async execution?

A
Eager execution starts immediately when async is called, lazy execution defers until result is requested causing synchronous execution at get() time
B
They are identical execution strategies
C
Lazy execution starts immediately
D
Eager execution defers until requested
25

Question 25

What is async result caching or memoization?

cpp
std::map<int, std::shared_future<int>> cache;

std::shared_future<int> get_cached_result(int key) {
    auto it = cache.find(key);
    if (it != cache.end()) {
        return it->second; // Return cached shared_future
    }
    
    // Start async computation and share its result
    // (shared_future is copyable, so it can live in the cache and be returned)
    std::shared_future<int> future = std::async(std::launch::async, [key]() {
        return expensive_compute(key);
    }).share();
    
    cache[key] = future; // Cache the shared_future
    return future;
}
A
Caching async operation results to avoid redundant computations, where multiple requests for same operation share single future
B
Ignoring cached results
C
Computing results multiple times
D
Destroying cached results
26

Question 26

What is the difference between promise.set_value() and promise.set_exception()?

cpp
std::promise<int> promise;
std::future<int> future = promise.get_future();

// Normal completion
promise.set_value(42);

// Exception occurred
try {
    risky_operation();
    promise.set_value(42);
} catch (const std::exception& e) {
    promise.set_exception(std::current_exception()); // Store exception
}

// Consumer sees the exception
try {
    int result = future.get(); // Throws stored exception
} catch (const std::exception& e) {
    handle_error(e);
}
A
set_value() fulfills promise with normal result, set_exception() fulfills promise with exception that will be rethrown at future.get()
B
They are identical methods
C
set_exception() sets normal values
D
set_value() sets exceptions
27

Question 27

What is async fan-out and fan-in pattern?

cpp
// Fan-out: start multiple async operations
auto f1 = std::async([&]() { return process_data(data1); });
auto f2 = std::async([&]() { return process_data(data2); });
auto f3 = std::async([&]() { return process_data(data3); });

// Fan-in: combine results
auto combined = std::async([f1 = std::move(f1), f2 = std::move(f2), f3 = std::move(f3)]() mutable {
    return combine_results(f1.get(), f2.get(), f3.get());
});
A
Fan-out distributes work across multiple async operations, fan-in combines results from multiple async operations into single result
B
Running operations sequentially
C
Cancelling operations
D
Ignoring operation results
28

Question 28

What is the purpose of std::launch::deferred policy?
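
A minimal sketch: the deferred task only runs if its result is actually requested:

cpp
#include <future>
#include <iostream>

int main() {
    bool result_needed = false; // Imagine this is decided at runtime

    auto lazy = std::async(std::launch::deferred, [] {
        std::cout << "computing...\n"; // Never printed if get()/wait() is never called
        return 42;
    });

    if (result_needed) {
        std::cout << lazy.get() << '\n'; // Runs the task here, synchronously
    }
}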

A
Defers async execution until future.get() is called, allowing lazy evaluation and avoiding unnecessary computation if result is never requested
B
Forces immediate execution
C
Prevents execution
D
Cancels execution
29

Question 29

What is async barrier or synchronization point?

cpp
std::vector<std::future<void>> futures;

// Start multiple async operations
for (auto& data : dataset) {
    futures.push_back(std::async(std::launch::async, 
        [data]() { process_item(data); }));
}

// Barrier: wait for all to complete
for (auto& future : futures) {
    future.wait(); // Or future.get() if result needed
}

// All operations completed, continue with next phase
A
Synchronization point where execution waits for multiple async operations to complete before proceeding, ensuring all parallel work finishes together
B
Point where operations are cancelled
C
Point where operations start
D
Point where operations are ignored
30

Question 30

What is the difference between std::async and thread pool enqueue?

cpp
// std::async - creates thread per call (potentially)
auto f1 = std::async(func);

// Thread pool - reuses threads
thread_pool.enqueue(func); // No future returned

// Thread pool with future (promise held in a shared_ptr so the task
// stays copyable for std::function-based queues)
auto promise = std::make_shared<std::promise<void>>();
auto future = promise->get_future();
thread_pool.enqueue([promise]() {
    func();
    promise->set_value(); // Signal completion
});
A
async may create new threads for each call, thread pool reuses fixed number of threads avoiding creation overhead but requires manual future creation
B
They are identical approaches
C
thread pool creates new threads
D
async reuses threads
31

Question 31

What is async timeout handling?

cpp
auto future = std::async(expensive_operation);

auto status = future.wait_for(std::chrono::seconds(5));

if (status == std::future_status::ready) {
    return future.get(); // Success
} else {
    // Timeout - handle it differently (note: the future's destructor will
    // still block until the std::async task actually finishes)
    throw std::runtime_error("Operation timed out");
    // Or: return a default value, or retry
}
A
Using wait_for() to limit async operation duration, allowing timeout handling and preventing indefinite blocking on slow operations
B
Making operations run indefinitely
C
Cancelling operations immediately
D
Ignoring operation duration
32

Question 32

What is the difference between shared_future and regular future?
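
A short sketch of the ownership difference:

cpp
#include <future>

int main() {
    std::future<int> f = std::async([] { return 7; });

    // std::future is move-only and single-use:
    // std::future<int> copy = f;          // Would not compile

    std::shared_future<int> sf = f.share(); // f becomes invalid here

    std::shared_future<int> a = sf;         // Copies are allowed
    std::shared_future<int> b = sf;
    int x = a.get();                        // get() works on every copy
    int y = b.get();
    (void)x; (void)y;
}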

A
shared_future can be copied and accessed by multiple consumers, regular future can only be moved and consumed once
B
They are identical future types
C
regular future can be shared
D
shared_future can only be consumed once
33

Question 33

What is async batching or request coalescing?

cpp
class BatchProcessor {
    static constexpr std::size_t BATCH_SIZE = 64; // Illustrative batch size
    std::vector<Request> pending_requests;
    std::mutex mutex;
    std::condition_variable cv;
    
    void process_batch() {
        std::vector<Request> batch;
        {
            std::unique_lock<std::mutex> lock(mutex);
            cv.wait(lock, [this]{ return pending_requests.size() >= BATCH_SIZE; });
            batch = std::move(pending_requests);
            pending_requests.clear();
        }
        
        // Process entire batch efficiently
        process_batch_efficiently(batch);
    }
};
A
Grouping multiple small async requests into larger batches for more efficient processing, reducing overhead of individual operations
B
Processing requests individually
C
Cancelling all requests
D
Ignoring request efficiency
34

Question 34

What is the purpose of packaged_task?
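
A small sketch: the wrapped callable can be queued and executed later in whatever context is convenient, while its future still delivers the result:

cpp
#include <future>
#include <queue>

int main() {
    std::packaged_task<int(int)> task([](int x) { return x * x; });
    std::future<int> result = task.get_future();

    // Hand the task to some execution context; here it is simply queued
    std::queue<std::packaged_task<int(int)>> work;
    work.push(std::move(task));

    work.front()(9); // Run the task; the future becomes ready
    work.pop();

    int value = result.get(); // 81
    (void)value;
}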

A
Wraps callable object to provide future interface for its result, enabling execution in different contexts while maintaining async result access
B
Creates synchronous tasks
C
Prevents task execution
D
Destroys task results
35

Question 35

What is async pipeline or dataflow programming?

cpp
auto stage1_future = std::async([]() { return read_data(); });

auto stage2_future = std::async([f1 = std::move(stage1_future)]() mutable {
    auto data = f1.get(); // Get stage 1 result
    return process_data(data); // Stage 2 processing
});

auto stage3_future = std::async([f2 = std::move(stage2_future)]() mutable {
    auto processed = f2.get(); // Get stage 2 result
    return save_result(processed); // Stage 3 saving
});
A
Chaining async operations where each stage depends on previous stage result, creating processing pipeline with potential parallelism between independent stages
B
Running all operations simultaneously
C
Running operations sequentially without dependencies
D
Cancelling pipeline operations
36

Question 36

What is the difference between async and deferred futures?

A
Async future starts execution immediately in separate thread, deferred future delays execution until get() causing synchronous execution
B
They are identical future types
C
Deferred future starts immediately
D
Async future delays execution
37

Question 37

What is async load balancing?

cpp
class LoadBalancer {
    std::vector<std::shared_ptr<ThreadPool>> pools;
    std::atomic<size_t> next_pool{0};
    
public:
    std::future<void> submit_task(std::function<void()> task) {
        size_t pool_index = next_pool.fetch_add(1) % pools.size();
        return pools[pool_index]->enqueue_with_future(task);
    }
};
A
Distributing async tasks across multiple thread pools or workers to balance load and prevent any single pool from becoming bottleneck
B
Concentrating all tasks on single pool
C
Preventing task distribution
D
Creating unbalanced loads
38

Question 38

What is the difference between future.valid() and future.wait_for()?

cpp
std::future<int> future;

if (future.valid()) {
    // Future contains async operation
    auto status = future.wait_for(std::chrono::seconds(1));
    if (status == std::future_status::ready) {
        int result = future.get();
    }
} else {
    // Future is empty/default constructed
}
A
valid() checks if future contains async operation, wait_for() waits with timeout on valid future to check completion status
B
They are identical checks
C
wait_for() checks if future is valid
D
valid() waits for completion
39

Question 39

What is async circuit breaker pattern?

cpp
class CircuitBreaker {
    static constexpr int THRESHOLD = 5; // Failures allowed before the circuit opens
    std::atomic<int> failure_count{0};
    std::atomic<bool> open{false};
    
public:
    std::future<Result> call_async(std::function<Result()> func) {
        if (open.load()) {
            return std::async([]() { 
                return Result::failure("Circuit open"); 
            });
        }
        
        return std::async([this, func = std::move(func)]() {
            try {
                auto result = func();
                failure_count = 0; // Reset on success
                return result;
            } catch (...) {
                failure_count++;
                if (failure_count > THRESHOLD) {
                    open = true; // Open circuit
                }
                throw;
            }
        });
    }
};
A
Pattern that monitors async operation failures and opens circuit to prevent cascading failures, failing fast when service is unhealthy
B
Pattern that allows all failures to continue
C
Pattern that ignores failures
D
Pattern that creates more failures
40

Question 40

What are the fundamental principles for effective async programming in C++?

A
Choose appropriate launch policies, handle exceptions properly, avoid async deadlocks, use futures for result retrieval, consider thread pool for frequent operations, and understand execution guarantees
B
Never use async programming
C
Use async for all operations
D
Ignore async result handling
