Asynchronous Programming

Quick microbenchmarks in Visual Studio with Code Snippets - Parallel Programming with .NET

Parallelism is all about performance. After all, in the majority of cases, introducing parallelism into code adds some level of complexity, and the primary reason we're OK with that additional complexity is that we get great performance enhancements as a result. As such, as we develop our parallel runtimes and libraries to help others parallelize their code, we pay a lot of attention to what things cost, ensuring that our libraries have as little overhead as possible so that their users get the most benefit out of their parallelization.

Given that, while coding I often find myself wondering what a particular operation costs.
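
The original post packages this idea as a Visual Studio code snippet; as a rough sketch of the kind of timing scaffold it is describing (the iteration count and the Operation method below are placeholders of mine, not the post's code), a simple Stopwatch loop looks like this:

using System;
using System.Diagnostics;

class QuickBenchmark
{
    static void Main()
    {
        const int iterations = 10000000;

        // Warm up first so JIT compilation doesn't skew the measurement.
        for (int i = 0; i < 1000; i++) Operation();

        var sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
        {
            Operation();
        }
        sw.Stop();

        Console.WriteLine("{0} iterations took {1} ms ({2:F1} ns per call)",
            iterations,
            sw.ElapsedMilliseconds,
            sw.Elapsed.TotalMilliseconds * 1000000.0 / iterations);
    }

    // Hypothetical operation under test; substitute whatever you want to measure.
    static void Operation()
    {
        object o = new object();
        GC.KeepAlive(o);
    }
}

Looping many times and reporting a per-call average keeps timer resolution and per-iteration noise from dominating the number you actually care about.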

Tasks

Nito Asynchronous Library - Download: Version 1.4

Concurrent Affairs: Data-Parallel Patterns and PLINQ

Multicore processors are now ubiquitous on mainstream desktop computers, but applications that use their full potential are still difficult to write. Multicore parallelism is certainly feasible, however, and a number of popular applications have been retrofitted to provide a performance boost on multicore computers. Version 4 of the .NET Framework will deliver several tools that programmers can employ to make this task easier: a set of new coordination and synchronization primitives and data structures, the Task Parallel Library and Parallel LINQ (PLINQ). This article focuses on the last item in this list, PLINQ. PLINQ is an interesting tool that makes writing code that scales on multicore machines much easier, provided that your problem matches a data-parallel pattern. PLINQ is a LINQ provider, so to program against it, you use the familiar LINQ model.
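
As a small illustration of that model (this sample is mine, not from the article; the prime-counting query is just a stand-in for any CPU-bound, data-parallel workload), opting in to PLINQ is a matter of adding AsParallel() to an ordinary LINQ query:

using System;
using System.Linq;

class PlinqSample
{
    static void Main()
    {
        // The same query without AsParallel() runs sequentially through
        // LINQ to Objects; with it, PLINQ partitions the range across cores.
        int primeCount = Enumerable.Range(2, 1000000)
                                   .AsParallel()
                                   .Count(IsPrime);

        Console.WriteLine("Found {0} primes", primeCount);
    }

    static bool IsPrime(int n)
    {
        for (int i = 2; i * i <= n; i++)
            if (n % i == 0) return false;
        return true;
    }
}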

Threading in C# - Part 5 - Parallel Programming

Threading in C#, Joseph Albahari. Last updated: 2011-4-27. Translations: Chinese | Czech | Persian | Russian | Japanese. Download PDF. Part 5: Parallel Programming. Acknowledgements: huge thanks to Stephen Toub, Jon Skeet and Mitch Wheat for their feedback, particularly Stephen Toub, whose input shaped the entire threading article and the concurrency chapters in C# 4.0 in a Nutshell.

Reporting Progress from Tasks

Parallel Programming in .NET Framework 4: Getting Started - C# Frequently Asked Questions

With this post I want to start a series devoted to the new parallel programming features in .NET Framework 4 and introduce you to the Task Parallel Library (TPL). I have to admit that I'm not an expert in multithreading or parallel computing. However, people often ask me about easy introductions and beginner's samples for new features, and I have an enormous advantage over most newbies in this area: I can ask the people who developed this library about what I'm doing wrong and what to do next. I have a simple goal this time. Now, let the journey begin. On my 3-GHz dual-core 64-bit computer with 4 GB of RAM the program takes about 18 seconds to run. Since I'm using a for loop, the Parallel.For method is the easiest way to add parallelism. With the parallel code in place, notice how little the code changed. Now it's time to take things one step further. The event handler for the sequential execution looks pretty simple. What happened?
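
As a rough sketch of the kind of change the post describes (the array size and the HeavyComputation method below are placeholders of mine, not the post's code), converting a sequential for loop to Parallel.For leaves the loop body untouched:

using System;
using System.Threading.Tasks;

class ParallelForSample
{
    static void Main()
    {
        double[] results = new double[10000];

        // Sequential version: each iteration is independent, CPU-bound work.
        for (int i = 0; i < results.Length; i++)
        {
            results[i] = HeavyComputation(i);
        }

        // Parallel version: only the loop construct changes; the body stays the same.
        Parallel.For(0, results.Length, i =>
        {
            results[i] = HeavyComputation(i);
        });

        Console.WriteLine("Done: {0} results", results.Length);
    }

    // Hypothetical stand-in for the post's per-iteration work.
    static double HeavyComputation(int n)
    {
        double value = n;
        for (int i = 0; i < 10000; i++)
            value = Math.Sqrt(value + i);
        return value;
    }
}

Because every iteration writes to a different element of results, there is no shared state to protect, which is what makes a loop like this an easy win for Parallel.For.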

Getting Good at Parallel (with NxtGenUG)

On Thursday last week I presented a session on .NET 4.0's Task Parallel Library (TPL) for the NxtGenUG Southampton user group. It's a long way for me (I live in the North West of England) but it's always worth the trip. I had a lot of fun, enjoyed the (challenging!) questions the audience threw at me, and enjoyed the post-session beers and discussion. Here are the slides and demos from the presentation: During the session there was a veritable stack of 'what ifs' we pondered.

.NET 4 Cancellation Framework - Parallel Programming with .NET

A very interesting addition to .NET 4 is a set of new types that specifically assist with building cancellation-aware applications and libraries. The new types enable rich scenarios for convenient and safe cancellation, and help simplify situations that used to be difficult, error-prone and non-composable. The details of the new types are described below, but let's begin with some of the motivating principles that the new types were designed to support: when any unit of work is commenced, it should have a consistent means to support early termination in response to a cancellation request; if some unit of work controls various other moving pieces, it should be able to conveniently chain cancellation requests to them; and blocking calls should be able to support cancellation. In many prevailing systems, cancellation has been a secondary feature that rarely gets treated in sufficient detail to enable all of the above principles in a comprehensive fashion.
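
As a brief sketch of the cooperative pattern these types enable (my example, not the article's; the loop and sleep durations are arbitrary), the work polls a CancellationToken while a CancellationTokenSource requests cancellation from the outside:

using System;
using System.Threading;
using System.Threading.Tasks;

class CancellationSample
{
    static void Main()
    {
        var cts = new CancellationTokenSource();
        CancellationToken token = cts.Token;

        Task work = Task.Factory.StartNew(() =>
        {
            for (int i = 0; i < 1000; i++)
            {
                // Cooperative cancellation: the work checks the token and
                // bails out as soon as a cancellation request is observed.
                token.ThrowIfCancellationRequested();
                Thread.Sleep(10); // stand-in for a unit of real work
            }
        }, token);

        Thread.Sleep(100);
        cts.Cancel(); // request cancellation; the loop notices on its next check

        try
        {
            work.Wait();
        }
        catch (AggregateException ae)
        {
            ae.Handle(e => e is OperationCanceledException);
            Console.WriteLine("Work was canceled.");
        }
    }
}

Passing the same token to Task.Factory.StartNew lets the task end up in the Canceled state, rather than Faulted, when that token's cancellation is the reason it stopped.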

TC Labs: TPL Dataflow

A minimal actor framework, part 2

Originally posted at 4/29/2011. Yesterday I introduced a very minimal actor "framework", and I noted that while it was very simple, it wasn't a very good one. The major problems in that implementation are: no consideration for errors, and no consideration for async operations. The first one seems obvious, but what about the second one: how can we not consider async operations in an actor framework? Well, the answer to that is quite simple: our actor framework assumed that we were always going to execute synchronously. As it happened, that is precisely what I had in mind for this code, so I wrote this:

public class Actor<TState>
{
    public TState State { get; set; }

    private readonly ConcurrentQueue<Func<TState, Task>> actions =
        new ConcurrentQueue<Func<TState, Task>>();
    private Task activeTask;

    public void Act(Func<TState, Task> action)
    {
        actions.Enqueue(action);
        lock(this)
        {
            if (activeTask !
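
Since the excerpt stops mid-method, here is a hedged sketch of one way such an actor could handle async actions; this is my reconstruction, not the post's implementation. Queued actions that return a Task are pumped one at a time, with an Interlocked counter standing in for the lock/activeTask check, and, like the post's version, still no consideration for errors:

using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

public class AsyncActor<TState>
{
    public TState State { get; set; }

    private readonly ConcurrentQueue<Func<TState, Task>> actions =
        new ConcurrentQueue<Func<TState, Task>>();
    private int pending; // actions enqueued but not yet completed

    public void Act(Func<TState, Task> action)
    {
        actions.Enqueue(action);

        // Only the caller that takes the count from 0 to 1 starts the pump;
        // everyone else just leaves their work in the queue.
        if (Interlocked.Increment(ref pending) == 1)
            ProcessNext();
    }

    private void ProcessNext()
    {
        Func<TState, Task> next;
        actions.TryDequeue(out next); // pending > 0 guarantees an item is available

        // Run the action, and only once its Task completes look for more work,
        // so actions execute strictly one at a time even when they are async.
        next(State).ContinueWith(_ =>
        {
            if (Interlocked.Decrement(ref pending) != 0)
                ProcessNext();
        });
    }
}

The counter-based pump avoids the race where one caller finishes draining the queue just as another enqueues new work: whoever increments the count from zero is responsible for restarting the pump.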

Thoughts?