Threads and Java Memory Model Explained


Transcript of Threads and Java Memory Model Explained

Page 1: Threads and Java Memory Model Explained

THREADS AND JAVA MEMORY MODEL EXPLAINED

LUIZ TESTON, WWW.FRACTA.CC

Page 2: Threads and Java Memory Model Explained

SOME QUOTES I HEARD IN MY CAREER

Page 3: Threads and Java Memory Model Explained

“DEADLOCK ON 300 THREADS. CAN ANYBODY HELP ME?”

Soft real time developer

Page 4: Threads and Java Memory Model Explained

“IN PARALLEL IT IS WORSE.” // GLOBAL LOCK ON A HUGE GRAPH

Myself struggling to fix a performance issue

Page 5: Threads and Java Memory Model Explained

“LET’S NOT USE THREADS, IT ALWAYS GIVES US TROUBLE.”

Architect with 15 years of experience

Page 6: Threads and Java Memory Model Explained

“MY CODE WORKS.” // NO SYNCHRONISATION, ONLY THREADS

Lead Programmer

Page 7: Threads and Java Memory Model Explained

DOING MANY THINGS AT ONCE? A FEW THINGS YOU SHOULD KNOW…

Page 8: Threads and Java Memory Model Explained

VOCABULARY

Page 9: Threads and Java Memory Model Explained

DEFINITION ON YOUR FAVOURITE SEARCH ENGINE

Page 10: Threads and Java Memory Model Explained

DEFINITION ON YOUR FAVOURITE SEARCH ENGINE

Page 11: Threads and Java Memory Model Explained

DEFINITION ON YOUR FAVOURITE SEARCH ENGINE

Page 12: Threads and Java Memory Model Explained

DEFINITION ON YOUR FAVOURITE SEARCH ENGINE

Page 13: Threads and Java Memory Model Explained

PARALLEL != CONCURRENT

Page 14: Threads and Java Memory Model Explained

DEFINITION ON YOUR FAVOURITE SEARCH ENGINE

Page 15: Threads and Java Memory Model Explained

PARALLEL TASKS DON'T COMPETE FOR RESOURCES; CONCURRENT TASKS MAY.

Page 16: Threads and Java Memory Model Explained

DEFINITION ON YOUR FAVOURITE SEARCH ENGINE


Page 17: Threads and Java Memory Model Explained

DEFINITION ON YOUR FAVOURITE SEARCH ENGINE

Page 18: Threads and Java Memory Model Explained

DEFINITION ON YOUR FAVOURITE SEARCH ENGINE

Page 19: Threads and Java Memory Model Explained

PARALLELISM WON’T IMPROVE LATENCY.

Page 20: Threads and Java Memory Model Explained

PARALLELISM MAY IMPROVE THROUGHPUT.

Page 21: Threads and Java Memory Model Explained

JSR 133 JAVA MEMORY MODEL

Pages 22-30: Threads and Java Memory Model Explained (no transcript text)
Page 31: Threads and Java Memory Model Explained

RACE CONDITION: THE CLASSIC EXAMPLE

Page 32: Threads and Java Memory Model Explained

RACE CONDITION

▸ definition: a shared resource may be used “at the same time” by different threads, leaving it in an invalid state.

▸ motivation: it arises whenever concurrent or parallel processing is needed.

▸ how to avoid: use some mechanism to ensure a resource is used by only one thread at a time, or share nothing at all.

Page 33: Threads and Java Memory Model Explained

thread 1 thread 2

VAR=0

Page 34: Threads and Java Memory Model Explained

thread 1 thread 2

VAR=0

VAR++ VAR++

Page 35: Threads and Java Memory Model Explained

thread 1 thread 2

VAR=0

VAR++ VAR++

VAR=1

Clearly not the expected result. There is code in production that has been running with these errors for years without anyone realising it.

VAR WAS NOT SYNCHRONISED PROPERLY
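
A minimal Java sketch of the race shown above (class and field names are mine, assuming nothing beyond the standard library): two threads each increment a shared counter a million times, and the printed total is usually below two million because concurrent increments are lost.

public class RaceConditionDemo {
    // Shared, unsynchronised counter: plays the role of VAR in the diagrams.
    static int counter = 0;

    public static void main(String[] args) throws InterruptedException {
        Runnable increment = () -> {
            for (int i = 0; i < 1_000_000; i++) {
                counter++; // read-modify-write, not atomic
            }
        };
        Thread t1 = new Thread(increment);
        Thread t2 = new Thread(increment);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        // Usually prints less than 2000000 because concurrent increments get lost.
        System.out.println("counter = " + counter);
    }
}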

Page 36: Threads and Java Memory Model Explained

AVOIDING OR FIXING THIS RACE CONDITION

▸ let the database deal with it (just kidding, but sadly it seems to be the standard way of doing it).

▸ correct synchronisation by using locks.

▸ usage of concurrent classes, such as AtomicLong (see the sketch after this list).

▸ one counter per thread (summing them still requires synchronisation).

▸ share nothing.

▸ any other suggestion?
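
As an example of the concurrent-classes option above, a brief sketch using java.util.concurrent.atomic.AtomicLong (names are illustrative): incrementAndGet() performs the read-modify-write atomically, so no updates are lost.

import java.util.concurrent.atomic.AtomicLong;

public class AtomicCounterDemo {
    static final AtomicLong counter = new AtomicLong();

    public static void main(String[] args) throws InterruptedException {
        Runnable increment = () -> {
            for (int i = 0; i < 1_000_000; i++) {
                counter.incrementAndGet(); // atomic read-modify-write
            }
        };
        Thread t1 = new Thread(increment);
        Thread t2 = new Thread(increment);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        System.out.println("counter = " + counter.get()); // always 2000000
    }
}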

Page 37: Threads and Java Memory Model Explained

thread 1 thread 2

VAR=0

LOCK

Page 38: Threads and Java Memory Model Explained

thread 1 thread 2

VAR=0

LOCK

LOCK

VAR++ WAITING LOCK

Page 39: Threads and Java Memory Model Explained

thread 1 thread 2

VAR=0

LOCK

VAR++

LOCK

VAR++

Page 40: Threads and Java Memory Model Explained

thread 1 thread 2

VAR=0

LOCK

VAR++

VAR++

VAR=2

The result was as expected, but there was a time penalty for performing both operations. To minimise it, avoid sharing in the first place.

VAR WAS PROPERLY SYNCHRONISED
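
The same scheme the diagrams walk through, sketched with a synchronized block (the lock object and class names are mine; a java.util.concurrent.locks.ReentrantLock would serve the same purpose here).

public class SynchronizedCounterDemo {
    static int counter = 0;
    static final Object lock = new Object();

    public static void main(String[] args) throws InterruptedException {
        Runnable increment = () -> {
            for (int i = 0; i < 1_000_000; i++) {
                synchronized (lock) { // acquire the lock
                    counter++;        // safe read-modify-write
                }                     // release the lock
            }
        };
        Thread t1 = new Thread(increment);
        Thread t2 = new Thread(increment);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        // Always 2000000, at the cost of contention on the lock.
        System.out.println("counter = " + counter);
    }
}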

Page 41: Threads and Java Memory Model Explained

READS ALSO NEED SYNCHRONISATION // A COMMON MISTAKE IS TO // SYNCHRONISE ONLY THE WRITES.
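
A brief sketch of why reads matter too (illustrative names, standard library only): the reader thread below relies on seeing another thread's write; without volatile, or a synchronized read, it might never observe the update and could loop forever.

public class VisibilityDemo {
    // Without 'volatile' the reader thread may never see the update below
    // and could spin forever; volatile makes the write visible to the read.
    static volatile boolean running = true;

    public static void main(String[] args) throws InterruptedException {
        Thread reader = new Thread(() -> {
            while (running) {
                // busy-wait until the writer's update becomes visible
            }
            System.out.println("reader saw the update and stopped");
        });
        reader.start();

        Thread.sleep(100); // give the reader time to start looping
        running = false;   // the write the reader must be able to observe
        reader.join();
    }
}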

Pages 42-46: Threads and Java Memory Model Explained (no transcript text)
Page 47: Threads and Java Memory Model Explained

LESSONS

▸ Synchronise properly. High-level APIs are harder to misuse; java.util.concurrent excels at that (see the sketch after this list).

▸ The optimal number of threads is usually twice the number of cores: Runtime.getRuntime().availableProcessors() * 2;

▸ Measure and stress-test. Synchronisation issues are not easy to spot, since the behaviour varies with the machine, the operating system, etc. They usually don’t show up while debugging.
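
A short sketch of those lessons combined (class and method names are mine): a fixed pool sized at availableProcessors() * 2 running independent tasks through java.util.concurrent instead of hand-managed threads.

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ThreadPoolSizingDemo {
    public static void main(String[] args) throws Exception {
        // Rule of thumb from the slide: twice the number of available cores.
        int threads = Runtime.getRuntime().availableProcessors() * 2;
        ExecutorService pool = Executors.newFixedThreadPool(threads);

        // Independent tasks with no shared mutable state between them.
        List<Callable<Long>> tasks = new ArrayList<>();
        for (int i = 0; i < threads; i++) {
            final int seed = i;
            tasks.add(() -> someWork(seed));
        }

        long total = 0;
        for (Future<Long> result : pool.invokeAll(tasks)) {
            total += result.get(); // results are combined in the submitting thread
        }
        pool.shutdown();
        System.out.println("total = " + total);
    }

    private static long someWork(int seed) {
        long sum = 0;
        for (int i = 0; i < 1_000_000; i++) {
            sum += (seed + i) % 7;
        }
        return sum;
    }
}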

Page 48: Threads and Java Memory Model Explained

DEADLOCK: SOMETIMES IT NEVER ENDS

Page 49: Threads and Java Memory Model Explained

DEADLOCKS

▸ what it is: threads each holding a lock while waiting for a lock another thread holds.

▸ motivation: a single global lock leads to global contention and slow code, so fine-grained locks are used instead; the real problem is several threads taking more than one fine-grained lock at a time in an unpredictable order.

▸ how to avoid: ensure the same locking order everywhere (see the sketch below), or review the synchronisation strategy (functional approach, atomic classes, high-level APIs, concurrent collections, share nothing, etc.).
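
A minimal sketch of the same-locking-order advice (lock and method names are illustrative): every thread that needs both locks always takes A before B, so the circular wait shown in the coming diagrams cannot form.

public class LockOrderingDemo {
    static final Object lockA = new Object();
    static final Object lockB = new Object();

    // Every caller that needs both locks goes through this method,
    // so the acquisition order is always A before B.
    static void useBothResources(String who) {
        synchronized (lockA) {
            synchronized (lockB) {
                System.out.println(who + " holds A and B");
            }
        }
    }

    public static void main(String[] args) throws InterruptedException {
        Thread t1 = new Thread(() -> useBothResources("thread 1"));
        Thread t2 = new Thread(() -> useBothResources("thread 2"));
        t1.start();
        t2.start();
        t1.join();
        t2.join(); // always terminates: a circular wait cannot form
    }
}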

Page 50: Threads and Java Memory Model Explained


Two threads have access to resources protected by two distinct locks: A and B.

Green means available, yellow means waiting and red means locked.

Two scenarios are going to be presented: the threads acquiring the locks in the same order, and in a different order.

Page 51: Threads and Java Memory Model Explained

The first thread acquires lock A.

Page 52: Threads and Java Memory Model Explained


The second thread tries to acquire the same lock. Since it is in use, it will wait until lock A is available.

Page 53: Threads and Java Memory Model Explained


Meanwhile the first thread acquires lock B. The second thread is still waiting for lock A.

Page 54: Threads and Java Memory Model Explained

The first thread releases lock B. The second thread is still waiting for lock A.

Page 55: Threads and Java Memory Model Explained

Then lock A is finally released, and the second thread is able to use it.

Page 56: Threads and Java Memory Model Explained

It acquires lock A.

Page 57: Threads and Java Memory Model Explained


Then it acquires lock B.

Page 58: Threads and Java Memory Model Explained

Lock B is released.

Page 59: Threads and Java Memory Model Explained

Then lock A is released. No synchronisation problems happened and no locked resources were harmed in this execution. There was some contention, but it was temporary.

EVERYTHING WAS FINE.

Page 60: Threads and Java Memory Model Explained

NOW SOMETHING DIFFERENT

Page 61: Threads and Java Memory Model Explained

The first thread acquires lock A.

Page 62: Threads and Java Memory Model Explained

And the second thread acquires lock B.

Page 63: Threads and Java Memory Model Explained


The first thread tries to acquire lock B. Since it is busy, it will wait for it.

Page 64: Threads and Java Memory Model Explained


And the second thread tries to acquire lock A. Since it is busy, it will wait for it.

Page 65: Threads and Java Memory Model Explained


What did the different order of lock acquisition cause?

Keep in mind that locks can be acquired internally by APIs, by the synchronized keyword, or by doing IO. It is almost impossible to keep track of all the locks in a huge application stack.

THE DEADLOCK IS SET.
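
A brief, illustrative sketch of this second scenario (names are mine, not from the slides): each thread takes one lock and then tries to take the other, so with this timing both end up waiting forever.

public class DeadlockDemo {
    static final Object lockA = new Object();
    static final Object lockB = new Object();

    public static void main(String[] args) {
        Thread t1 = new Thread(() -> {
            synchronized (lockA) {         // thread 1 acquires A
                pause(100);                // give thread 2 time to grab B
                synchronized (lockB) {     // ...then waits for B
                    System.out.println("thread 1 got A then B");
                }
            }
        });
        Thread t2 = new Thread(() -> {
            synchronized (lockB) {         // thread 2 acquires B
                pause(100);                // give thread 1 time to grab A
                synchronized (lockA) {     // ...then waits for A: circular wait
                    System.out.println("thread 2 got B then A");
                }
            }
        });
        t1.start();
        t2.start();
        // With the pauses above this program almost always hangs forever.
    }

    static void pause(long millis) {
        try {
            Thread.sleep(millis);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}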

Pages 66-67: Threads and Java Memory Model Explained (no transcript text)
Page 68: Threads and Java Memory Model Explained

LESSONS

▸ If sharing data between threads, synchronise properly, and measure and stress-test (same as before).

▸ Keep in mind that some deadlocks stay latent and may only happen in unusual situations (such as an unusually high peak load).

▸ The best approach is to minimise shared data, having isolated threads working independently.

▸ There are frameworks that suit this better than using threads manually; consider those, such as Akka, the Disruptor, etc.

Page 69: Threads and Java Memory Model Explained

QUESTIONS? THANKS FOR YOUR TIME!

▸ https://www.cs.umd.edu/~pugh/java/memoryModel/jsr-133-faq.html

▸ http://docs.oracle.com/javase/specs/

▸ photos: Dani Teston