Archive for July, 2008

Eigenvalues and Eigenvectors

July 21, 2008

When we look at the eigenvectors of a matrix, we actually get vectors that do not change direction when the matrix is applied… (of course, a flip by 180 degrees is allowed…). At the same time, the vector is scaled by some factor.

If B is the matrix and v is an eigenvector with eigenvalue k, then

B*v = k*v

If k < 0, the vector is flipped by 180 degrees. The eigenvector is scaled by |k|: shrunk if |k| < 1, stretched if |k| > 1.

If B is a real symmetric matrix, then it has a full set of linearly independent eigenvectors (in fact they can be chosen orthogonal).
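As a quick sanity check, here is a minimal MATLAB sketch (my own, not from anything above):

% a small symmetric matrix
B = [2 1; 1 2];
[V, D] = eig(B);    % columns of V are eigenvectors, diag(D) the eigenvalues
v = V(:,1);         % one eigenvector
k = D(1,1);         % its eigenvalue
disp(B*v - k*v)     % ~ zero: B only scales v, the direction is unchanged
disp(V'*V)          % ~ identity: for symmetric B the eigenvectors are orthogonal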

Non-Linear Optimization

July 17, 2008

As I was going through compressed sensing, I realised the importance of non-linear optimization… I remember asking "where do we use this method?" when my teachers taught me the conjugate gradient method…

I have almost forgotten a lot of the math I read about positive definite matrices and steepest descent methods… I need to brush up on a lot of it… ahem ahem… Here is a good paper/tutorial that covers the basics of optimization without much pain 🙂

The painless paper on the conjugate gradient method (Shewchuk's "An Introduction to the Conjugate Gradient Method Without the Agonizing Pain") is painless-conjugate-gradient
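For my own reference, here is a bare-bones linear conjugate gradient iteration for A*x = b with A symmetric positive definite, in MATLAB (a sketch along the lines of that tutorial, not code taken from it):

function x = cg(A, b, iters)
% minimal conjugate gradient for A*x = b, A symmetric positive definite
x = zeros(size(b));
r = b - A*x;                       % residual
d = r;                             % first search direction
for i = 1:iters
    Ad = A*d;
    alpha = (r'*r) / (d'*Ad);      % exact step length along d
    x = x + alpha*d;
    rnew = r - alpha*Ad;
    beta = (rnew'*rnew) / (r'*r);  % makes the next direction A-conjugate
    d = rnew + beta*d;
    r = rnew;
end
end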

Compressed Sensing + Parallel Imaging

July 10, 2008

I was getting to understand CS and why minimizing the l1 norm works while the l2 norm doesn't…

In my discussion with RV, he talked about using CS and parallel imaging (PI) together. The idea is to use the PI reconstruction as the initial estimate image, rather than the zero-filled, density-corrected image.
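Roughly, the change is only in the seed of the iteration; in pseudo-MATLAB (pi_recon and cs_recon are hypothetical placeholder functions, nothing here is from the abstract):

% y: undersampled k-space (zeros where not sampled), pdf: sampling density
m0_zf = ifft2(y ./ pdf);           % usual seed: zero-filled, density-corrected
m0_pi = pi_recon(y, coil_sens);    % proposed seed: a parallel-imaging recon
m = cs_recon(y, mask, m0_pi);      % the CS iteration starts from m0_pi instead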

When I went through the abstracts, I found a similar (I should say the same) idea in one of them…

The abstract is here: 01488 🙂

Compressed Sensing, Algorithmically

July 9, 2008

Let m be the final image we want to arrive at.

Let phi be the transform in which the image m is sparse.

Let y be the undersampled k-space.

Let FFTU be the Fourier transform operator restricted to the sampling pattern.

Then we want to minimize the l1 norm ||phi*m||_1

subject to the constraint ||FFTU*m - y||_2 < epsilon.

So sparsity is what matters here: when phi*m has only a few significant coefficients, minimizing the l1 norm pushes the solution toward exactly that sparse representation…
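In practice this is often solved in the unconstrained (Lagrangian) form. A minimal MATLAB sketch of that objective (lambda, and the choice of dct2 as the sparsifying transform phi, are my placeholders, not from the post):

% f(m) = ||FFTU*m - y||_2^2 + lambda * ||phi*m||_1
% y is the measured k-space, zero at unsampled locations; mask is the pattern
function f = cs_objective(m, y, mask, lambda)
    Fm = fft2(m) .* mask;                  % FFTU: FFT, then keep sampled points
    data = norm(Fm(:) - y(:))^2;           % data-consistency term
    pm = dct2(m);                          % phi: here a 2D DCT, could be a wavelet
    f = data + lambda * sum(abs(pm(:)));   % l1 penalty promotes sparsity
end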

Now why is incoherence needed? With an incoherent sampling pattern, the aliasing from undersampling spreads out like noise instead of showing up as visible, structured artifacts,

so we should ensure that the sampling pattern is incoherent…

The point spread function gives a good estimate of the coherence (i.e. incoherence) of a pattern…

The point spread function is the (inverse) Fourier transform of the filter, i.e. the sampling pattern we have.
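A quick way to look at this in MATLAB (my sketch, assuming a 2D binary sampling mask; only the magnitude matters, so fft2 vs ifft2 makes no difference here):

% mask: 2D binary k-space sampling pattern (1 = sampled)
psf = fftshift(ifft2(ifftshift(mask)));   % point spread function of the pattern
psf = abs(psf) / max(abs(psf(:)));        % normalize the main lobe to 1
sidelobe = max(psf(psf < 1));             % largest sidelobe ~ coherence
% random masks -> low, noise-like sidelobes (incoherent aliasing);
% regular undersampling -> tall isolated sidelobes (coherent aliasing)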

Probability Density Function

July 9, 2008

Integral representation of the probability distribution:

Generating a PDF based on the radial distance within the unit circle:

Create a linspace from -1 to 1 based on the number of points needed; the density falls off with the radius r as (1-r).^p.

If it is a uniform PDF, say 1 everywhere, then sum(pdf(:)) gives row*col.

Now we want to generate a PDF such that sum(pdf(:)) = row * col * sampling rate.

Use the distance from the centre to create the PDF.

The code is genpdf1
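A minimal sketch of this recipe in MATLAB (my reconstruction of the idea, not the actual genpdf1 code; the single normalization step stands in for the iterative search a real implementation might use):

% rows, cols: grid size; rate: target sampling rate; p: decay power
rows = 256; cols = 256; rate = 0.3; p = 4;
[X, Y] = meshgrid(linspace(-1, 1, cols), linspace(-1, 1, rows));
r = min(sqrt(X.^2 + Y.^2), 1);               % distance from centre, clipped to 1
pdf = (1 - r).^p;                            % density falls off with radius
pdf = pdf * (rows*cols*rate) / sum(pdf(:));  % so sum(pdf(:)) = rows*cols*rate
pdf = min(pdf, 1);                           % a probability cannot exceed 1
% (clipping lowers the sum a little; a real genPDF iterates to hit the target)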

Compressed Sensing

July 7, 2008

Today I spent some time trying to understand how compressed sensing works.

I got the core idea: exploit the sparsity of the image in a transform domain, which reduces the degrees of freedom, so the few significant sparse coefficients can be recovered from a relatively small set of sampled points.

Someone said that Nyquist was a pessimist, since his rate is only an upper bound on what is needed; now we are interested in the lower bound…

A decently good tutorial (though I couldn't make head or tail of UUP or RIP… need to spend more time):

http://www.ee.duke.edu/ssp07/Tutorials/ssp07-cs-tutorial.pdf

This blog seems to cover a lot on CS… I spent some time on the links provided:

http://igorcarron.googlepages.com/cs

At present I am reading an article called SparseMRI, where CS is used to optimize scan time on the Cartesian grid for angiogram images… sparsemri1

GPU vs CELL Processor

July 3, 2008

http://www.simbiosys.ca/blog/2008/05/03/the-fast-and-the-furious-compare-cellbe-gpu-and-fpga/

This is a good article comparing CPU/GPU/Cell/FPGA… it's interesting…

I'm not sure where my company is headed 🙂


Coalescing Memory

July 1, 2008

Coalescing, as per the dictionary, means uniting. One of the best things about computer science terminology is that it is, most of the time, based on the dictionary meaning. (Yup… as always there are exceptions, like "transparent".)

In NVIDIA's GPGPU (CUDA) world, what does coalescing actually mean? I found the following answer good:

Coalescing means that memory reads by consecutive threads in a warp are combined by the hardware into a few wide memory reads. The requirement is that the threads in the warp must be reading memory in order: for example, if you have a float array called data[] and you want to read many floats starting at offset n, then thread 0 in the warp must read data[n], thread 1 must read data[n+1], and so on. Those 32-bit reads, which are issued simultaneously, are merged into several 384-bit reads in order to use the memory bus efficiently. Coalescing is a warp-level activity, not a thread-level activity.