Speaker: Ivan Towlson
Take advantage of newer processor architectures: .NET Framework 4 has built-in support for this.

Like LINQ, but faster. When querying a large dataset, the traditional approach requires a large number of threading considerations.
- ParExtSamples
- Works for any IEnumerable<T>; optimizations for other types (T[], IList<T>)

Writing a PLINQ query (see the sketch below)
- .AsParallel()
- ParallelEnumerable (does the actual work)

Partitioning
- Operators are replicated across partitions for (ideally) complete isolation
- Operator Fusion - avoids excessive merging and partitioning steps

Partitioning Algorithms
- Chunk - potential for a lot of locking when doing small tasks; progressively hands out more work to each thread to cater for both small and large jobs
- Range - for IList<T>; equally divides work among threads
- Stripe - for IList<T>; elements handed out round-robin to each thread; less wasted effort in the case of an exception
- Hash - for IEnumerable<T>; elements assigned to a partition based on hash code
- Custom - Partitioner

Merging
- Pipelined - each result becomes available as it is completed
- Stop-and-go - for OrderBy, ToArray, ToList, etc.; higher latency and more memory usage
- Inverted - no merging needed; ForAll extension method, so no buffering required

Parallelism Blockers
- Ordering not guaranteed (consider OrderBy)
- Exceptions (stop other threads?) - System.AggregateException
- Thread affinity - e.g. web form controls
- Operations with < 1.0 speedup - overhead can make the entire operation slower
- Side effects and mutability are serious issues - avoid shared state and side effects

Task Parallel Library (TPL) (see the sketch below)
- Parallel.For(0, N, i => { work(i); });
- Parallel.ForEach
- Regions - Parallel.Invoke();

Task Concepts
- FromAsync

Coordination Data Structures (see the sketch below)
- Thread-safe scalable collections
- Task and data management
- Synchronisation types - SpinLock and SpinWait for very quick waiting periods
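
A minimal PLINQ sketch of the ideas above - .AsParallel(), a stop-and-go merge via ToList(), the ForAll inverted merge, and System.AggregateException handling. The data source and the IsPrime helper are made up for illustration, not from the session.

    using System;
    using System.Linq;

    class PlinqSketch
    {
        // Hypothetical work function, used only to give the query something to do.
        static bool IsPrime(int n)
        {
            if (n < 2) return false;
            for (int d = 2; d * d <= n; d++)
                if (n % d == 0) return false;
            return true;
        }

        static void Main()
        {
            var numbers = Enumerable.Range(1, 1000000);

            try
            {
                // .AsParallel() moves the query onto ParallelEnumerable, which
                // partitions the source and runs the operators on each partition.
                // ToList() forces a stop-and-go merge: all results are buffered.
                var primes = numbers.AsParallel()
                                    .Where(IsPrime)
                                    .ToList();
                Console.WriteLine(primes.Count);

                // ForAll runs the final action on each partition's own thread,
                // so no merge step or buffering is required (inverted merging).
                // Note: output order is not guaranteed.
                numbers.AsParallel()
                       .Where(IsPrime)
                       .ForAll(p => Console.WriteLine(p));
            }
            catch (AggregateException ex)
            {
                // Exceptions thrown on any partition are collected and rethrown
                // wrapped in a single System.AggregateException.
                foreach (var inner in ex.InnerExceptions)
                    Console.WriteLine(inner.Message);
            }
        }
    }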
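
A short TPL sketch for the Parallel.For / Parallel.ForEach / Parallel.Invoke items above; the Work method and the loop bound N are placeholders, not from the session.

    using System;
    using System.Threading;
    using System.Threading.Tasks;

    class TplSketch
    {
        // Placeholder for the work(i) body in the notes.
        static void Work(int i)
        {
            Console.WriteLine("item " + i + " on thread " +
                              Thread.CurrentThread.ManagedThreadId);
        }

        static void Main()
        {
            const int N = 10;

            // Parallel.For: iterations are handed out across threads by the TPL.
            Parallel.For(0, N, i => { Work(i); });

            // Parallel.ForEach: the same idea for any IEnumerable<T>.
            Parallel.ForEach(new[] { "a", "b", "c" }, s => Console.WriteLine(s));

            // Regions: Parallel.Invoke runs independent actions concurrently
            // and returns only when all of them have completed.
            Parallel.Invoke(
                () => Work(100),
                () => Work(200),
                () => Work(300));
        }
    }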
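
A sketch of the coordination data structures mentioned at the end: a thread-safe, scalable collection plus SpinLock for a very short critical section. The specific collection (ConcurrentQueue<T>) and the summing example are my own choices for illustration.

    using System;
    using System.Collections.Concurrent;
    using System.Threading;
    using System.Threading.Tasks;

    class CoordinationSketch
    {
        static void Main()
        {
            // Thread-safe, scalable collection: many iterations can enqueue
            // concurrently without the caller writing any locking code.
            var results = new ConcurrentQueue<int>();
            Parallel.For(0, 100, i => results.Enqueue(i * i));
            Console.WriteLine(results.Count); // 100

            // SpinLock: a lightweight lock intended for very short critical
            // sections, where spinning briefly is cheaper than blocking a thread.
            var spinLock = new SpinLock();
            int total = 0;

            Parallel.For(0, 100, i =>
            {
                bool lockTaken = false;
                try
                {
                    spinLock.Enter(ref lockTaken);
                    total += i;                 // very quick work under the lock
                }
                finally
                {
                    if (lockTaken) spinLock.Exit();
                }
            });

            Console.WriteLine(total); // 4950
        }
    }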