FutureTask can also be used as a latch. (FutureTask implements the Future semantics and represents an abstract computation that produces a result.) The computation represented by a FutureTask is implemented with a Callable, the result-bearing equivalent of Runnable, and it can be in one of three states: waiting to run, running, and completed. "Completed" covers all the ways a computation can finish, including normal completion, cancellation, and termination by an exception. Once a FutureTask reaches the completed state, it stays in that state forever.
The behavior of Future.get depends on the state of the task. If the task has completed, get returns the result immediately; otherwise it blocks until the task reaches the completed state and then either returns the result or throws an exception. FutureTask conveys the result from the thread that executes the computation to the thread(s) retrieving it, and the FutureTask specification guarantees that this hand-off constitutes a safe publication of the result.
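For instance, a FutureTask can start an expensive load early and let a later caller block on get until the data is ready, which is exactly the latch-like behavior described above. The following is a minimal sketch of that idea; the Preloader class name, loadProductInfo, and the sleep duration are invented for this illustration.

import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.FutureTask;

public class Preloader {
    // The FutureTask wraps the expensive computation (a Callable).
    private final FutureTask<String> future = new FutureTask<String>(new Callable<String>() {
        @Override
        public String call() throws Exception {
            return loadProductInfo(); // hypothetical slow lookup
        }
    });
    private final Thread thread = new Thread(future);

    public void start() {
        thread.start(); // kick off the computation early
    }

    public String get() throws InterruptedException, ExecutionException {
        return future.get(); // returns immediately if already done, otherwise blocks like a latch
    }

    private String loadProductInfo() throws InterruptedException {
        Thread.sleep(1000); // simulate an expensive load
        return "product info";
    }
}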
Let's see what FutureTask can do through an example that we will grow step by step.
In the code listing below, the Computable<A, V> interface declares a function with input of type A and result of type V. ExpensiveFunction, which implements Computable, takes a long time to compute its result. We will build a Computable wrapper that remembers the results of previous computations and encapsulates the caching process.
Code [Initialize Cache]:
public interface Computable<A, V> {
    V compute(A arg) throws InterruptedException;
}
import java.math.BigInteger;

public class ExpensiveFunction implements Computable<String, BigInteger> {
    @Override
    public BigInteger compute(String arg) throws InterruptedException {
        // Stand-in for a computation that takes a long time to finish
        return new BigInteger(arg);
    }
}
import java.util.HashMap;
import java.util.Map;

public class Memoizer1<A, V> implements Computable<A, V> {
    private final Map<A, V> cache = new HashMap<A, V>();
    private final Computable<A, V> c;

    public Memoizer1(Computable<A, V> c) {
        this.c = c;
    }

    @Override
    public synchronized V compute(A arg) throws InterruptedException {
        V result = cache.get(arg);
        if (result == null) {
            result = c.compute(arg);
            cache.put(arg, result);
        }
        return result;
    }
}
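For reference, a wrapped function would be used like this; the Memoizer1Demo class and the argument "12345" are just an arbitrary example, not part of the original listings.

import java.math.BigInteger;

public class Memoizer1Demo {
    public static void main(String[] args) throws InterruptedException {
        Computable<String, BigInteger> cached =
                new Memoizer1<String, BigInteger>(new ExpensiveFunction());
        BigInteger first = cached.compute("12345");  // computed by ExpensiveFunction and stored
        BigInteger second = cached.compute("12345"); // served from the cache, no recomputation
        System.out.println(first.equals(second));    // prints true
    }
}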
Using synchronized ensures that no two threads access the HashMap at the same time, but it also gives Memoizer1 very poor concurrency: only one thread at a time can execute compute, so callers are serialized even when they ask for different values, and one long-running computation blocks every other caller.
Let's replace the HashMap with a ConcurrentHashMap and drop the synchronization:
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class Memoizer2<A, V> implements Computable<A, V> {
    private final Map<A, V> cache = new ConcurrentHashMap<A, V>();
    private final Computable<A, V> c;

    public Memoizer2(Computable<A, V> c) {
        this.c = c;
    }

    @Override
    public V compute(A arg) throws InterruptedException {
        // ConcurrentHashMap is thread-safe, so compute no longer needs to be synchronized
        V result = cache.get(arg);
        if (result == null) {
            result = c.compute(arg);
            cache.put(arg, result);
        }
        return result;
    }
}
The problem with Memoizer2 is that the get-then-put sequence is not atomic: two threads that look up the same missing key at the same time will both run the computation, which is exactly what we want to avoid when the computation is expensive.
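That window is easy to observe by wrapping a slow Computable in a counter and hitting Memoizer2 with several threads that ask for the same key. The CountingFunction class, the thread count, and the sleep duration below are made up for this demonstration; with Memoizer2 the counter frequently ends up greater than 1, because every thread that finds the cache empty starts its own computation.

import java.math.BigInteger;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.atomic.AtomicInteger;

public class Memoizer2Demo {
    // Counts how many times the underlying computation actually runs.
    static class CountingFunction implements Computable<String, BigInteger> {
        final AtomicInteger calls = new AtomicInteger();
        @Override
        public BigInteger compute(String arg) throws InterruptedException {
            calls.incrementAndGet();
            Thread.sleep(500); // simulate an expensive computation
            return new BigInteger(arg);
        }
    }

    public static void main(String[] args) throws InterruptedException {
        CountingFunction f = new CountingFunction();
        final Computable<String, BigInteger> cached = new Memoizer2<String, BigInteger>(f);
        final CountDownLatch start = new CountDownLatch(1);

        Runnable task = new Runnable() {
            @Override
            public void run() {
                try {
                    start.await();           // release all threads at roughly the same time
                    cached.compute("12345"); // same key for every thread
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
        };
        Thread[] threads = new Thread[4];
        for (int i = 0; i < threads.length; i++) {
            threads[i] = new Thread(task);
            threads[i].start();
        }
        start.countDown();
        for (Thread t : threads) {
            t.join();
        }
        System.out.println("compute() ran " + f.calls.get() + " time(s)"); // often > 1 with Memoizer2
    }
}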
Now let's use FutureTask to solve this problem: instead of caching the finished value, we cache a Future representing the computation, so a thread that arrives while another is still computing can simply wait for the same result.
import java.util.Map;
import java.util.concurrent.Callable;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Future;
import java.util.concurrent.FutureTask;

public class Memoizer3<A, V> implements Computable<A, V> {
    private final Map<A, Future<V>> cache = new ConcurrentHashMap<A, Future<V>>();
    private final Computable<A, V> c;

    public Memoizer3(Computable<A, V> c) {
        this.c = c;
    }

    @Override
    public V compute(final A arg) throws InterruptedException {
        Future<V> f = cache.get(arg);
        if (f == null) {
            Callable<V> eval = new Callable<V>() {
                @Override
                public V call() throws Exception {
                    return c.compute(arg);
                }
            };
            FutureTask<V> ft = new FutureTask<V>(eval);
            f = ft;
            cache.put(arg, ft);
            ft.run(); // c.compute is called here, on the calling thread
        }
        try {
            return f.get(); // blocks until the computation (possibly started by another thread) completes
        } catch (ExecutionException e) {
            throw new RuntimeException(e.getCause()); // rethrow instead of silently returning null
        }
    }
}
This code still has a problem: the compound "put-if-absent" action on the backing Map is not atomic, so two threads can still race through the null check and both start the computation (the window is much smaller than in Memoizer2, but it is still there). We can close it with ConcurrentMap's atomic putIfAbsent method, which gives us the final version, Memoizer.
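As a reminder of putIfAbsent's contract: it atomically inserts the mapping only if the key is absent, and it returns the previous value (or null if there was none), so a caller can tell whether it won the race. A tiny self-contained sketch (the class name and keys are invented here):

import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

public class PutIfAbsentDemo {
    public static void main(String[] args) {
        ConcurrentMap<String, String> m = new ConcurrentHashMap<String, String>();
        System.out.println(m.putIfAbsent("key", "first"));  // null: "first" was stored, this caller won the race
        System.out.println(m.putIfAbsent("key", "second")); // "first": the map is unchanged
    }
}

With that in hand, the final version of the cache looks like this: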
import java.util.concurrent.Callable;
import java.util.concurrent.CancellationException;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Future;
import java.util.concurrent.FutureTask;

public class Memoizer<A, V> implements Computable<A, V> {
    private final ConcurrentMap<A, Future<V>> cache = new ConcurrentHashMap<A, Future<V>>();
    private final Computable<A, V> c;

    public Memoizer(Computable<A, V> c) {
        this.c = c;
    }

    @Override
    public V compute(final A arg) throws InterruptedException {
        while (true) {
            Future<V> f = cache.get(arg);
            if (f == null) {
                Callable<V> eval = new Callable<V>() {
                    @Override
                    public V call() throws Exception {
                        return c.compute(arg);
                    }
                };
                FutureTask<V> ft = new FutureTask<V>(eval);
                f = cache.putIfAbsent(arg, ft); // atomic: only one thread's task ends up in the cache
                if (f == null) {
                    f = ft;
                    ft.run(); // we won the race, so we run the computation
                }
            }
            try {
                return f.get();
            } catch (CancellationException e) {
                cache.remove(arg, f); // the cached task was cancelled; remove it and retry
            } catch (ExecutionException e) {
                throw new RuntimeException(e.getCause()); // rethrow instead of silently returning null
            }
        }
    }
}
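A quick way to exercise the final version is to share one instance across a thread pool. The MemoizerDemo class, the pool size, and the key below are invented for this demonstration; all threads asking for the same key should observe a single underlying computation.

import java.math.BigInteger;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class MemoizerDemo {
    public static void main(String[] args) throws InterruptedException {
        final Computable<String, BigInteger> cached =
                new Memoizer<String, BigInteger>(new ExpensiveFunction());
        ExecutorService pool = Executors.newFixedThreadPool(4);
        for (int i = 0; i < 4; i++) {
            pool.execute(new Runnable() {
                @Override
                public void run() {
                    try {
                        // Every thread asks for the same key; the underlying compute runs only once.
                        System.out.println(cached.compute("12345"));
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
    }
}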