[dba-Tech] Parallelism

Jim Lawrence accessd at shaw.ca
Wed May 19 09:54:47 CDT 2010

Hi All:

A couple of nights ago, I went to a series of lectures given by a fellow
named Tiberiu Covaci, the Swedish owner of a company called Multi-Core, who
is now touring as a trainer for Microsoft. (His next lecture stop will be
New Orleans.)

His training lecture was on parallelism. The first lecture covered the basic
concepts of computers and their new multi-core design, a design made
necessary when single-core computers hit a physical wall around 2005. He
then went on to describe developers' attempts to utilize these new
multi-core systems. It has been a slow process, as you can imagine.

Early attempts at threading and hyper-threading were far from successful.
First, a single thread can consume up to 200 cycles just to get
instantiated, and its deployment still ties up the central CPU; hundreds of
threads may need to be spawned in a pool, and then there is the issue of
synchronizing the process results. Even the best-developed multi-threaded
applications, like MS SQL 2005, could overload a computer when challenged by
a heavy-duty task. MS even went so far as to apply a system lock when
performing resource-heavy jobs... that means all other functionality on the
server would cease... no multi-tasking until the job was completed... and
even then a process would sometimes freeze up the server. (We have all heard
of that exact situation from a number of our DBA members.)

Our lecturer had written a couple of short programs to show how
multi-threading is implemented, and then demonstrated how these apps, when
overloaded, would crash or lock up the system. He used a Quicksort demo:
splitting the sort into multiple pieces, multi-threading it, and slowly
increasing the sort data showed roughly a 30 percent gain per thread and a
70 percent gain per core (the maximum gain). It also showed virtually no
performance loss as the data grew (10 to 100 million records), until the
process locked up the CPU. We watched the system's progress via the Task
Manager's CPU display.
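For those curious what that split-and-sort demo looks like in code, here is a minimal sketch in Java (an assumption on my part; the lecture used C#/.NET) of a divide-and-conquer quicksort where each partition is handed off so idle cores pick up the sub-sorts:

```java
import java.util.Arrays;
import java.util.Random;
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveAction;

// Divide-and-conquer quicksort: each partition spawns subtasks that the
// fork/join pool distributes across the available cores.
public class ParallelQuickSort extends RecursiveAction {
    private static final int THRESHOLD = 10_000; // sort small slices sequentially
    private final int[] a;
    private final int lo, hi; // hi is exclusive

    ParallelQuickSort(int[] a, int lo, int hi) {
        this.a = a; this.lo = lo; this.hi = hi;
    }

    @Override
    protected void compute() {
        if (hi - lo < THRESHOLD) {
            Arrays.sort(a, lo, hi);          // small slice: sort in place
            return;
        }
        int p = partition(a, lo, hi);
        // Fork both halves; the pool schedules them on idle cores.
        invokeAll(new ParallelQuickSort(a, lo, p + 1),
                  new ParallelQuickSort(a, p + 1, hi));
    }

    // Hoare partition with a middle pivot; returns j so that
    // [lo, j] holds values <= pivot and [j+1, hi) holds values >= pivot.
    private static int partition(int[] a, int lo, int hi) {
        int pivot = a[lo + (hi - lo) / 2], i = lo - 1, j = hi;
        while (true) {
            do { i++; } while (a[i] < pivot);
            do { j--; } while (a[j] > pivot);
            if (i >= j) return j;
            int t = a[i]; a[i] = a[j]; a[j] = t;
        }
    }

    static boolean isSorted(int[] a) {
        for (int i = 1; i < a.length; i++) if (a[i - 1] > a[i]) return false;
        return true;
    }

    public static void main(String[] args) {
        int[] data = new Random(42).ints(1_000_000).toArray();
        new ForkJoinPool().invoke(new ParallelQuickSort(data, 0, data.length));
        System.out.println(isSorted(data)); // prints "true"
    }
}
```

The THRESHOLD matters: as the lecture's crash demo suggested, spawning a task per tiny slice drowns the machine in scheduling overhead, so small slices fall back to a plain sequential sort.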

Lecture two: enter parallelism. This approach has been developed in the
latest versions of Java and the .NET Framework (4 or greater). He
demonstrated two new core features/objects, Parallel and Task. Using
parallelism takes a bit of work to decompose a problem into its components.
He used the example of a recipe, where a number of steps is required from
initial preparation, through cooking, to the table. The program, which ended
up calculating the total time, was initially written as a standard
sequential C# routine.
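The Parallel object's data-parallel loop (Parallel.For in .NET) has a close analogue in Java's parallel streams, which the first lecture also touched on. A minimal sketch (the sum-of-squares workload is my own illustration, not from the lecture):

```java
import java.util.stream.IntStream;

public class ParallelSum {
    public static void main(String[] args) {
        // Data-parallel loop: the runtime splits the range across cores,
        // analogous to .NET's Parallel.For splitting loop iterations.
        long sum = IntStream.rangeClosed(1, 1_000_000)
                            .parallel()
                            .mapToLong(i -> (long) i * i)
                            .sum();
        System.out.println(sum); // sum of squares from 1 to 1,000,000
    }
}
```

The point is the same one the lecturer made: you declare *what* can be split, and the framework decides how many workers to use, instead of you hand-managing a thread pool.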

The app was then recoded using parallel-running tasks. Each task can run
independently, but will wait for another process to complete if required, by
scheduling, monitoring a semaphore, an event, shared state, and/or wait
loops. We also covered how to handle synchronization and data sharing, and
how to avoid deadlocks. By using parallelism the meal was completed in half
the time... It showed how six people can do a job faster than one... now
that's obvious, but maybe not so obvious with a computer application.
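The recipe recode can be sketched with Java's CompletableFuture standing in for .NET's Task (the specific recipe steps and timings below are my own invention for illustration): independent prep steps run at once, and a dependent step waits only for the inputs it needs.

```java
import java.util.concurrent.CompletableFuture;

public class RecipeTasks {
    // Simulate a recipe step that takes the given number of milliseconds.
    static CompletableFuture<String> step(String name, long ms) {
        return CompletableFuture.supplyAsync(() -> {
            try { Thread.sleep(ms); } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
            return name;
        });
    }

    public static void main(String[] args) {
        long start = System.nanoTime();

        // Independent prep steps run concurrently on the common pool...
        CompletableFuture<String> chop   = step("chop vegetables", 300);
        CompletableFuture<String> boil   = step("boil water", 300);
        CompletableFuture<String> season = step("season meat", 300);

        // ...while dependent steps wait only for the inputs they need.
        CompletableFuture<String> cook  = boil.thenCombine(chop,
                (water, veg) -> "cook: " + water + " + " + veg);
        CompletableFuture<String> grill = season.thenApply(m -> "grill: " + m);

        // Serving the meal waits for everything to finish.
        CompletableFuture.allOf(cook, grill).join();

        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        // On a multi-core machine the three 300 ms prep steps overlap,
        // so the total should come in well under the 900 ms a purely
        // sequential version would need.
        System.out.println("done in ~" + elapsedMs + " ms");
    }
}
```

This is the same "six people in the kitchen" effect: the dependency graph, not the order the code is written in, decides what waits for what.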

The most telling observation came when viewing the Task Manager's CPU
display. No matter how heavy the demands imposed on the system, the CPU load
remained very low and flat, and each core showed an even distribution of
utilization.

I was totally impressed. Now I know how super-scale databases such as
Cassandra, built on Java parallelism, work. My suggestion to all of you
using MS SQL 2005 or earlier: "Bail out quick and get your hands on the
latest MS SQL... there will be no comparison in performance."

A couple of recommended books on the subject are:
"Patterns of Parallel Processing" and
"Concurrent Programming on Windows"

The latest .NET has all these features built in... just ready to use.

