
Concurrency concepts

Lock Striping in Java Concurrency


If you have only one lock for a whole data structure such as an array or a map, and you synchronize on it in a multi-threaded environment, then effectively only one thread can manipulate the structure at any given time, because there is only a single lock. All the other threads have to wait to acquire the monitor.

If you have to visualize a Hashtable or a synchronized HashMap, it can be depicted as in the following image.

If there are 6 threads, only one can get the lock and enter the synchronized collection at a time, even if the keys these threads want to read or manipulate are in different buckets. As in the image, if two threads want to access keys in bucket 0, two threads want to access keys in bucket 1 and two threads want to access keys in bucket n-3, still only one thread gets access at any given time.

Single lock for the whole collection

As expected, this seriously degrades performance. Now imagine that there are separate locks for separate buckets; then thread contention is no longer at the level of the whole data structure but at the bucket level. That is the concept of lock striping: having separate locks for portions of a data structure, where each lock guards a variable-sized set of independent objects.
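The same idea can be sketched by hand. The StripedMap class below is a hypothetical, minimal illustration (not how ConcurrentHashMap is actually implemented): it keeps a small array of locks and chooses a lock based on the key's bucket, so threads whose keys fall under different stripes do not contend with each other.

import java.util.concurrent.locks.ReentrantLock;

// A minimal sketch of lock striping: N_LOCKS locks guard the buckets,
// lock i guarding every bucket whose index is congruent to i modulo N_LOCKS.
public class StripedMap<K, V> {
    private static final int N_LOCKS = 16;
    private final Node<K, V>[] buckets;
    private final ReentrantLock[] locks;

    private static class Node<K, V> {
        final K key;
        V value;
        Node<K, V> next;
        Node(K key, V value, Node<K, V> next) { this.key = key; this.value = value; this.next = next; }
    }

    @SuppressWarnings("unchecked")
    public StripedMap(int numBuckets) {
        buckets = (Node<K, V>[]) new Node[numBuckets];
        locks = new ReentrantLock[N_LOCKS];
        for (int i = 0; i < N_LOCKS; i++) {
            locks[i] = new ReentrantLock();
        }
    }

    private int bucketFor(Object key) {
        return (key.hashCode() & 0x7fffffff) % buckets.length;
    }

    public V get(K key) {
        int b = bucketFor(key);
        ReentrantLock lock = locks[b % N_LOCKS]; // lock only the stripe for this bucket
        lock.lock();
        try {
            for (Node<K, V> n = buckets[b]; n != null; n = n.next) {
                if (n.key.equals(key)) {
                    return n.value;
                }
            }
            return null;
        } finally {
            lock.unlock();
        }
    }

    public void put(K key, V value) {
        int b = bucketFor(key);
        ReentrantLock lock = locks[b % N_LOCKS];
        lock.lock();
        try {
            for (Node<K, V> n = buckets[b]; n != null; n = n.next) {
                if (n.key.equals(key)) {
                    n.value = value;
                    return;
                }
            }
            buckets[b] = new Node<>(key, value, buckets[b]); // insert at head of bucket
        } finally {
            lock.unlock();
        }
    }
}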

That is how ConcurrentHashMap in Java provides synchronization. Prior to Java 8, ConcurrentHashMap was divided into 16 segments by default, each segment holding its own subset of buckets and guarded by its own lock, so there were 16 locks too. Threads accessing keys that fall in separate segments can therefore work simultaneously. (From Java 8 onwards the implementation synchronizes on individual bins instead, taking the same idea further.)

If you have to visualize it, the following image gives an idea of how lock striping in a ConcurrentHashMap looks.

Here two threads want to access keys in bucket 0, so one of them can enter; two threads want to access keys in bucket 1, so one of them can enter; the same goes for bucket n-3. So, with lock striping, 3 out of the 6 threads can work on the data structure at the same time.
Lock Striping in ConcurrentHashMap
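As a small usage sketch, several threads can safely update a ConcurrentHashMap at the same time without any external synchronization; the class name, thread count and key ranges below are arbitrary choices for illustration.

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class ConcurrentMapDemo {
    public static void main(String[] args) throws InterruptedException {
        Map<Integer, Integer> map = new ConcurrentHashMap<>();

        // Each thread writes its own range of keys; because the keys land in
        // different parts of the table, the threads mostly do not contend
        Runnable writer1 = () -> { for (int i = 0; i < 1000; i++) map.put(i, i); };
        Runnable writer2 = () -> { for (int i = 1000; i < 2000; i++) map.put(i, i); };

        Thread t1 = new Thread(writer1);
        Thread t2 = new Thread(writer2);
        t1.start();
        t2.start();
        t1.join();
        t2.join();

        System.out.println("Map size: " + map.size()); // prints 2000
    }
}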

Drawback of lock striping

One of the downsides of lock striping, as mentioned in "Java Concurrency in Practice", is that locking the collection for exclusive access is more difficult and costly than with a single lock, since all the stripe locks have to be acquired.


Non-blocking algorithms in Java


In a multi-threaded application, if you use a lock or synchronization, only one thread at any given time can acquire the monitor and enter the critical section; all other threads wait for the lock to become free.

In the same way, if a data structure has to be used in a multi-threaded environment, it has to use some concurrent algorithm; if that algorithm allows only one thread in at any given time and blocks all the others, it is a blocking algorithm. For example, synchronized wrappers around ArrayList or HashMap, and implementations of the BlockingQueue interface like ArrayBlockingQueue or LinkedBlockingQueue, use that kind of lock-based algorithm and thus run the risk of blocking threads (possibly forever).

If a thread holding the lock is waiting for some resource like I/O or delayed due to some other fault then other
waiting threads will not make any progress.

As an example, suppose you are using an ArrayBlockingQueue, which is a bounded blocking queue, with a capacity of 10. In that case, if the queue is full and another thread calls put() to add a value, that thread is blocked until some other thread calls take() to remove a value.
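A minimal sketch of that blocking behaviour is shown below; the capacity of 2, the element counts and the consumer's sleep are arbitrary values chosen to make the producer block quickly.

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class BlockingPutDemo {
    public static void main(String[] args) throws InterruptedException {
        // Bounded queue with capacity 2 so it fills up quickly
        BlockingQueue<Integer> queue = new ArrayBlockingQueue<>(2);

        Thread producer = new Thread(() -> {
            try {
                for (int i = 0; i < 5; i++) {
                    queue.put(i); // blocks here once the queue is full
                    System.out.println("Put " + i);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }, "Producer");

        Thread consumer = new Thread(() -> {
            try {
                for (int i = 0; i < 5; i++) {
                    Thread.sleep(500);                           // slow consumer
                    System.out.println("Took " + queue.take()); // frees space, unblocking the producer
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }, "Consumer");

        producer.start();
        consumer.start();
        producer.join();
        consumer.join();
    }
}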

Non-blocking algorithm

To prevent the problems stated above, classes and data structures based on non-blocking algorithms were introduced in Java starting with Java 5. Some examples are the atomic classes such as AtomicInteger and AtomicLong, and concurrent collections like ConcurrentLinkedQueue.

An algorithm is called non-blocking if it doesn't block threads in such a way that only one thread has access to the data structure while all the others wait. Likewise, the failure of one thread in a non-blocking algorithm doesn't mean failure or suspension of other threads.
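As a quick illustration, ConcurrentLinkedQueue lets several threads add and remove elements without ever blocking on a lock: offer() always returns immediately and poll() simply returns null when the queue is empty. The thread and element counts below are arbitrary.

import java.util.concurrent.ConcurrentLinkedQueue;

public class NonBlockingQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        ConcurrentLinkedQueue<Integer> queue = new ConcurrentLinkedQueue<>();

        // Two producers add elements concurrently; offer() never blocks
        Runnable producer = () -> {
            for (int i = 0; i < 1000; i++) {
                queue.offer(i);
            }
        };
        Thread t1 = new Thread(producer);
        Thread t2 = new Thread(producer);
        t1.start();
        t2.start();
        t1.join();
        t2.join();

        // poll() also never blocks; it returns null once the queue is empty
        int count = 0;
        while (queue.poll() != null) {
            count++;
        }
        System.out.println("Elements drained: " + count); // prints 2000
    }
}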

Compare-And-Swap

Implementation of non-blocking data structures in Java like atomic variables or ConcurrentLinkedQueue use an
atomic read-modify-write kind of instruction based on compare-and-swap.

Reference - https://en.wikipedia.org/wiki/Compare-and-swap

According to the description in "Java Concurrency in Practice" by Brian Goetz, CAS has three operands:
1. A memory location M on which to operate
2. The expected old value OV
3. The new value NV

CAS compares the expected old value OV to the value stored at the memory location M; only if they match does CAS update the memory location M to the new value NV, otherwise it does nothing. In either case, it returns the value currently in M. The variant of compare-and-swap called compare-and-set instead returns a boolean indicating whether the operation succeeded. In Java, classes like AtomicInteger provide a compareAndSet() method.

When multiple threads attempt to update the same variable simultaneously using CAS, one of those threads wins
and updates the variable's value, and the rest lose. But the losers are not punished by suspension, as they could be if
they failed to acquire a lock; instead, they are told that they didn't win the race this time but can try again.

Because a thread that loses a CAS is not blocked, it can decide whether it wants to try again, take some other recovery action, or do nothing.

As an example, suppose the value currently stored at memory location M is 5. One thread calls CAS with an expected old value of 5 and a new value of 6. At the same time another thread tries to change the value at M, also passing 5 as the old value and 7 as the new value.

In that case the first thread succeeds in changing the value stored at M to 6, whereas the other thread's CAS reports failure, because the value at M is now 6 and no longer matches its expected old value of 5.

So the point is that threads are not blocked; they may fail to get the desired result and may have to call CAS in a loop until they succeed, which costs more CPU cycles, but no thread is blocked or stalled just because another thread has acquired a lock and is not releasing it.
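A minimal sketch of such a CAS retry loop, using the compareAndSet() method of AtomicInteger, is shown below. The incrementAndGet helper is written out only to show the loop; in real code AtomicInteger.getAndIncrement() already does this internally.

import java.util.concurrent.atomic.AtomicInteger;

public class CasIncrementDemo {

    // Increment the counter using a CAS retry loop instead of a lock
    static int incrementAndGet(AtomicInteger counter) {
        while (true) {
            int oldValue = counter.get();  // read the current value
            int newValue = oldValue + 1;   // compute the desired new value
            // compareAndSet succeeds only if the value is still oldValue;
            // if another thread changed it in the meantime, we simply retry
            if (counter.compareAndSet(oldValue, newValue)) {
                return newValue;
            }
        }
    }

    public static void main(String[] args) throws InterruptedException {
        AtomicInteger counter = new AtomicInteger(0);
        Runnable task = () -> {
            for (int i = 0; i < 10_000; i++) {
                incrementAndGet(counter);
            }
        };
        Thread t1 = new Thread(task);
        Thread t2 = new Thread(task);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        System.out.println("Final count: " + counter.get()); // prints 20000
    }
}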


Busy spinning in multi-threading


Busy spinning, or busy waiting, in a multi-threaded environment is a technique where a thread loops continuously, waiting for another thread to complete its task and signal it to start.

// spinningFlag should be a volatile field so that the update made by the
// other thread is guaranteed to become visible to this loop
while(spinningFlag){
    System.out.println("Waiting busy spinning");
}
// Reaching here means spinningFlag is false, now the thread can start

Impact of Busy Spinning on performance

Busy spinning is wasteful of CPU cycles, as the thread just keeps running in a loop until the condition in the loop is satisfied. The main thing to note here is that the thread doesn't relinquish the CPU, as it would if wait(), sleep() or yield() were used, where the thread gives up the CPU.

Busy spinning may give some advantage on multi-core processors. If a thread relinquishes the CPU, the CPU cache holding that thread's state and data is also lost, and if the thread later resumes on another CPU it has to warm up the cache again.

Since a busy-spinning thread doesn't relinquish the CPU, the time spent rebuilding the cache is saved.

But overall it is not a good strategy and should be avoided, at least in application programs.

Example code

Let's write a Producer-Consumer Java program using busy spinning. Here the producer thread puts 5 elements in a list, while the consumer thread busy-waits until all 5 elements have been added by the producer thread.

import java.util.ArrayList;
import java.util.List;

public class BusySpinDemo {

    public static void main(String[] args) {
        ProdThread pt = new ProdThread();
        Thread t1 = new Thread(pt, "Producer");
        // passing the producer runnable to the consumer so it can see the shared list and flag
        Thread t2 = new Thread(new ConThread(pt), "Consumer");
        t1.start();
        t2.start();
    }
}

// Producer thread
class ProdThread implements Runnable {
    List<Integer> sharedListObj;
    // volatile so the consumer is guaranteed to see the flag change
    // (and the list elements added before it)
    volatile boolean flag;

    ProdThread() {
        System.out.println("Constructor ProdThread");
        this.sharedListObj = new ArrayList<Integer>();
        this.flag = true;
    }

    @Override
    public void run() {
        System.out.println(" ProdThread run");
        for (int i = 0; i < 5; i++) {
            System.out.println("Adding to queue - " + Thread.currentThread().getName() + " " + i);
            sharedListObj.add(i);
        }
        // signal the consumer that all elements have been added
        flag = false;
    }
}

// Consumer thread
class ConThread implements Runnable {
    ProdThread pt;

    ConThread(ProdThread pt) {
        System.out.println("Constructor ConThread");
        this.pt = pt;
    }

    @Override
    public void run() {
        // Busy spinning loop - keeps burning CPU until the producer clears the flag
        while (this.pt.flag) {
            System.out.println("Waiting busy spinning");
        }
        System.out.println("Consumer starting");
        for (Integer i : this.pt.sharedListObj) {
            System.out.println("" + i);
        }
    }
}

Output

Constructor ProdThread
Constructor ConThread
ProdThread run
Waiting busy spinning
Waiting busy spinning
Waiting busy spinning
Waiting busy spinning
Waiting busy spinning
Adding to queue - Producer 0
Waiting busy spinning
Waiting busy spinning
Adding to queue - Producer 1
Waiting busy spinning
Waiting busy spinning
Waiting busy spinning
Waiting busy spinning
Adding to queue - Producer 2
Waiting busy spinning
Waiting busy spinning
Adding to queue - Producer 3
Waiting busy spinning
Adding to queue - Producer 4
Waiting busy spinning
Consumer starting
0
1
2
3
4

Note that the output is curtailed here; many of the "Waiting busy spinning" lines have been deleted.

Refer to the post on how to write a Producer-Consumer Java program using wait-notify for the lock-based alternative.

Blocking methods in Java Concurrency

There are methods in Java that need to complete their assigned task and cannot simply yield control to other threads. In that case they have to block the current thread when the condition needed to fulfil their task is not yet satisfied.

A very relevant example of a blocking method, which most of you will have encountered, is the read() method of the InputStream class. This method blocks until input data is available, the end of the stream is detected, or an exception is thrown.

Then there is the accept() method of the ServerSocket class. This method listens for a connection to be made to this socket and accepts it; it blocks until a connection is made.

Since this post is more about blocking methods from the Java multi-threading perspective, let's look at some examples from there.

wait() method - Blocks the current thread until either another thread invokes the notify() or notifyAll() method on this object, or a specified amount of time has elapsed.
sleep() method - Causes the currently executing thread to sleep (temporarily cease execution) for the specified number of milliseconds.
join() method - Blocks the current thread until the thread on which join() was called finishes (a small sketch follows below).
BlockingQueue and BlockingDeque interfaces - Starting with Java 5 and the introduction of the java.util.concurrent package, blocking data structures implementing these two interfaces have been added. Some examples are ArrayBlockingQueue, LinkedBlockingQueue and LinkedBlockingDeque.

These data structures provide put() and take() methods:

put(E e) - Inserts the specified element into this queue, waiting if necessary for space to become available.
take() - Retrieves and removes the head of this queue, waiting if necessary until an element becomes available.
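As a small illustration of one of the blocking methods listed above, the sketch below uses join() to block the main thread until a worker thread finishes; the sleep duration is just a placeholder for real work.

public class JoinBlockingDemo {
    public static void main(String[] args) throws InterruptedException {
        Thread worker = new Thread(() -> {
            try {
                Thread.sleep(2000); // simulate some work
                System.out.println("Worker finished");
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }, "Worker");

        worker.start();
        System.out.println("Main thread now blocks on join()");
        worker.join(); // main thread is blocked here until the worker terminates
        System.out.println("Main thread resumes after worker is done");
    }
}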

Drawback of blocking methods

Though blocking threads is essential for enforcing some order of execution or controlling access to a shared object, it may also lead to the suspension of multiple threads, possibly waiting forever if not handled properly, which poses a serious threat to scalability and performance.

As an example, if a thread holding the lock is waiting for some resource like I/O, or is delayed due to some other fault, then the other waiting threads make no progress.

Non-Blocking Data Structures

Java 5 added many data structures to the concurrency package that use non-blocking algorithms, such as the atomic variables (e.g. AtomicInteger) and ConcurrentLinkedQueue, or that reduce the probability of blocking by using techniques like lock striping, as used in ConcurrentHashMap.

Non-blocking I/O

Java NIO has a non-blocking mode in which an I/O operation never blocks; it may transfer fewer bytes than were requested, or possibly no bytes at all, from the Channel.
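A minimal sketch of a non-blocking read from a SocketChannel is shown below; the host name and port are placeholder values, and a real program would typically drive such channels with a Selector instead of polling.

import java.io.IOException;
import java.net.InetSocketAddress;
import java.nio.ByteBuffer;
import java.nio.channels.SocketChannel;

public class NonBlockingReadDemo {
    public static void main(String[] args) throws IOException {
        SocketChannel channel = SocketChannel.open();
        channel.configureBlocking(false); // switch the channel to non-blocking mode
        channel.connect(new InetSocketAddress("example.com", 80)); // placeholder host

        // connect() returns immediately in non-blocking mode; poll until the connection completes
        while (!channel.finishConnect()) {
            // the thread could do other useful work here instead of blocking
        }

        ByteBuffer buffer = ByteBuffer.allocate(1024);
        int read = channel.read(buffer); // may return 0 if no data is available yet
        System.out.println("Bytes read in this call: " + read);
        channel.close();
    }
}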
