Parallel Programming in .NET
Kevin Luty
Agenda
- History of Parallelism
- Benefits of Parallel Programming and Designs
- What to Consider
- Defining Types of Parallelism
- Design Patterns and Practices
- Tools
- Supporting Libraries
History of Parallelism
Hardware
- 1960s-1970s: parallel hardware appears in supercomputers
- Early 1980s: a supercomputer built with 64 8086/8087 microprocessors
- Late 1980s: cluster computing power
- Moore's Law and Amdahl's Law
- Most recently: more cores rather than higher clock speeds
History of Parallelism
Software
- One processor meant sequential programs
- Few APIs promoted or made use of parallel programming
- 1990s: no standards for parallel programming
- By 2000: Message Passing Interface (MPI), POSIX threads (pthreads), Open Multiprocessing (OpenMP)
Benefits of Parallel Programming
- Task Parallel Library (TPL) for .NET
- Takes advantage of hardware capabilities
- Easy to write (e.g. PLINQ)
- You can use all the cores!
- Timing
- Tools available for debugging
- Cost effective
What to Consider
- Define the problem: What needs to be done? What data is being modified? What is the current state?
- Cost
- Synchronous vs. asynchronous execution
- The output of this analysis: which pattern fits best
Defining Types of Parallelism
- Parallelism: programming with multiple threads, where the threads are expected to execute at the same time on multiple processors. Its goal is to increase throughput.
Defining Types of Parallelism
- Data parallelism: there is a lot of data, and the same operation must be performed on each piece of data
- Task parallelism: there are many different operations that can run simultaneously
Parallel Loops
- Perform the same independent operation for each element
- The most common problem is not noticing dependencies
- How to notice dependencies: shared variables, using properties of a shared object (see the sketch below)
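A minimal sketch of a parallel loop, assuming an illustrative array named values (not from the slides): each iteration writes only its own element, so iterations stay independent; the commented-out variant shows the shared-variable pitfall mentioned above.

    using System;
    using System.Threading.Tasks;

    class ParallelLoopSketch
    {
        static void Main()
        {
            var values = new double[1000000];

            // Each iteration touches only its own element, so iterations are independent.
            Parallel.For(0, values.Length, i =>
            {
                values[i] = Math.Sqrt(i);
            });

            // Pitfall: a shared variable creates a hidden dependency (a data race).
            // double sum = 0;
            // Parallel.For(0, values.Length, i => { sum += values[i]; });  // unsafe
        }
    }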
Parallel Loops
Helpful features of the TPL:
- .Break and .Stop
- CancellationToken
- MaxDegreeOfParallelism
- Exception handling via AggregateException (see the sketch below)
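A sketch of those loop controls, with made-up work and limits for illustration: ParallelOptions carries MaxDegreeOfParallelism and a CancellationToken, ParallelLoopState.Break ends the loop early, and exceptions thrown inside iterations surface as an AggregateException.

    using System;
    using System.Threading;
    using System.Threading.Tasks;

    class LoopControlSketch
    {
        static void Main()
        {
            var cts = new CancellationTokenSource();
            var options = new ParallelOptions
            {
                MaxDegreeOfParallelism = 2,    // cap the number of concurrent workers
                CancellationToken = cts.Token  // lets callers cancel the whole loop
            };

            try
            {
                Parallel.For(0, 100, options, (i, state) =>
                {
                    if (i == 42)
                        state.Break();         // finish lower iterations, skip higher ones
                });
            }
            catch (OperationCanceledException)
            {
                // raised if cts.Cancel() is called while the loop is running
            }
            catch (AggregateException ex)
            {
                // exceptions thrown inside iterations are collected here
                foreach (var inner in ex.InnerExceptions)
                    Console.WriteLine(inner.Message);
            }
        }
    }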
Parallel Aggregation
- Builds on parallel loops
- Makes use of unshared, local variables
- Multiple inputs, single output (see the sketch below)
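A sketch of the aggregation pattern using the thread-local overload of Parallel.For; the data array and the final lock are illustrative. Each task folds its share of the inputs into an unshared local sum, and the shared total is touched only once per task.

    using System;
    using System.Linq;
    using System.Threading.Tasks;

    class AggregationSketch
    {
        static void Main()
        {
            double[] data = Enumerable.Range(1, 1000000).Select(i => (double)i).ToArray();
            double total = 0;
            object gate = new object();

            Parallel.For(0, data.Length,
                () => 0.0,                                        // unshared, per-task local sum
                (i, state, localSum) => localSum + data[i],       // fold each element locally
                localSum => { lock (gate) total += localSum; });  // merge once per task into the single output

            Console.WriteLine(total);   // same value as data.Sum(), computed in parallel
        }
    }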
Parallel Tasks
- Task parallelism, also known as the fork/join pattern
- Uses the System.Threading.Tasks namespace: TaskFactory, Invoke, Wait/WaitAny/WaitAll, StartNew (see the sketch below)
- Handling exceptions
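A sketch of the fork/join idea with placeholder work items: Parallel.Invoke forks and joins in one call, while TaskFactory.StartNew plus Task.WaitAll (or WaitAny) does the same with explicit tasks.

    using System;
    using System.Threading.Tasks;

    class ForkJoinSketch
    {
        static void Main()
        {
            // Fork/join in a single call: each delegate may run on a different core.
            Parallel.Invoke(
                () => Console.WriteLine("left half"),
                () => Console.WriteLine("right half"));

            // The same idea with explicit tasks from the default TaskFactory.
            Task a = Task.Factory.StartNew(() => Console.WriteLine("task A"));
            Task b = Task.Factory.StartNew(() => Console.WriteLine("task B"));

            Task.WaitAll(a, b);     // join: block until both have finished
            // Task.WaitAny(a, b);  // or resume as soon as either one finishes
        }
    }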
Parallel Tasks
Handling exceptions:
- Exceptions are deferred until the Task is observed, then surface as an AggregateException
- CancellationTokenSource: tasks can also be cancelled from outside the Task (see the sketch below)
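A sketch of deferred exceptions and external cancellation; the failing task body is invented for illustration. The exception thrown inside the task is observed only at Wait, wrapped in an AggregateException, and the CancellationTokenSource can be signalled from any thread outside the task.

    using System;
    using System.Threading;
    using System.Threading.Tasks;

    class TaskExceptionSketch
    {
        static void Main()
        {
            var cts = new CancellationTokenSource();

            Task work = Task.Factory.StartNew(() =>
            {
                cts.Token.ThrowIfCancellationRequested();   // cooperative cancellation point
                throw new InvalidOperationException("boom");
            }, cts.Token);

            // cts.Cancel() could be called here, from outside the task.

            try
            {
                work.Wait();   // the exception only surfaces when the task is joined
            }
            catch (AggregateException ex)
            {
                foreach (var inner in ex.InnerExceptions)
                    Console.WriteLine(inner.GetType().Name);   // InvalidOperationException
            }
        }
    }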
Pipelines
- Uses BlockingCollection<T> and CompleteAdding
- Most problems in this design come from starvation or blocking (see the sketch below)
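A two-stage pipeline sketch with invented stage logic: a bounded BlockingCollection<T> links producer and consumer, CompleteAdding tells the consumer no more items will arrive, and the bounded capacity is where blocking (or starvation) shows up when one stage outpaces the other.

    using System;
    using System.Collections.Concurrent;
    using System.Threading.Tasks;

    class PipelineSketch
    {
        static void Main()
        {
            var buffer = new BlockingCollection<int>(boundedCapacity: 100);

            Task producer = Task.Factory.StartNew(() =>
            {
                for (int i = 0; i < 1000; i++)
                    buffer.Add(i);            // blocks when the buffer is full
                buffer.CompleteAdding();      // signals the consumer that the stream has ended
            });

            Task consumer = Task.Factory.StartNew(() =>
            {
                foreach (int item in buffer.GetConsumingEnumerable())   // blocks while the buffer is empty
                    Console.WriteLine(item * item);
            });

            Task.WaitAll(producer, consumer);
        }
    }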
Dynamic Task Parallelism
- Tasks are added continuously as the work unfolds
- Complete small tasks, then larger tasks
- Typical uses: binary trees and sorting (see the sketch below)
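A sketch of dynamic task parallelism over a binary tree, with a made-up Node type: new tasks are spawned as nodes are discovered, so the amount of parallel work grows with the data. A real implementation would usually stop forking below some depth rather than create one task per node.

    using System;
    using System.Threading.Tasks;

    class Node
    {
        public int Value;
        public Node Left, Right;
    }

    class DynamicTaskSketch
    {
        // Walk the tree, creating a new task for one subtree at each node.
        static int SumTree(Node node)
        {
            if (node == null) return 0;

            Task<int> left = Task.Factory.StartNew(() => SumTree(node.Left));
            int right = SumTree(node.Right);            // recurse on the current thread

            return node.Value + left.Result + right;    // reading Result waits for the child task
        }

        static void Main()
        {
            var root = new Node
            {
                Value = 1,
                Left = new Node { Value = 2 },
                Right = new Node { Value = 3, Left = new Node { Value = 4 } }
            };
            Console.WriteLine(SumTree(root));   // 10
        }
    }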
Tools
- .NET Performance Profiler (Red Gate)
- JustTrace (Telerik)
- GlowCode (Electric Software)
- Performance Profiler (Visual Studio 2010 Ultimate): Concurrency Visualizer, CPU performance, memory management
Supporting Libraries for .NET
- Task Parallel Library (TPL)
- PLINQ (Parallel Language Integrated Query): easy to learn (see the sketch below)
- Rx (Reactive Extensions)
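A small PLINQ sketch over invented data: AsParallel() turns an ordinary LINQ query into a parallel one, and the runtime partitions the source across the available cores.

    using System;
    using System.Linq;

    class PlinqSketch
    {
        static void Main()
        {
            long evenSquareSum = Enumerable.Range(1, 1000000)
                                           .AsParallel()
                                           .Where(n => n % 2 == 0)
                                           .Select(n => (long)n * n)
                                           .Sum();

            Console.WriteLine(evenSquareSum);
        }
    }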
References
- Campbell, Colin, et al. Parallel Programming with Microsoft .NET: Design Patterns for Decomposition and Coordination on Multicore Architectures. Microsoft, 2010. Print.
- Data Parallelism (n.d.). In Wikipedia. Retrieved from http://en.wikipedia.org/wiki/Data_parallelism
- Hillar, Gaston C. Professional Parallel Programming with C#: Master Parallel Extensions with .NET 4. Indiana: Wiley, 2011. Print.
- Mattson, T. G., Sanders, B. A., and Massingill, B. L. Patterns for Parallel Programming. Addison-Wesley, 2004.
- Moore's Law (n.d.). In Wikipedia. Retrieved from http://en.wikipedia.org/wiki/Moore%27s_law
- Rx Extensions (n.d.). In Microsoft. Retrieved from http://msdn.microsoft.com/en-us/data/gg577609.aspx