Introduction: Trifactorization in Ljubljana

After an exciting training week in Ostrava, Paras and I probably had the most convenient journey of all to our site, because Leon drove us to Ljubljana by car. On the way, we made a small detour to listen to a concert of a youth orchestra in which Leon’s daughter plays the clarinet, which was great. I was revelling in nostalgia because it reminded me of the times when I played in a youth orchestra myself. The scenery was also simply amazing, with a mountain ridge in the background of the stage. Late in the evening we arrived at our destination and moved into our beautiful dorm room in Dom Podiplomcev, where we even have a balcony!

On Sunday, still tired from the training week, we thought that we could finally catch up on some sleep. According to the plan, our first day at our new workplace would start at 9 am, which is actually late enough to sleep in. But this thought was immediately interrupted when Leon called us right before we set the alarm clock for the next day. He announced that early the next morning we would travel to the Italian border to visit the largest supercomputer of Slovenia, called Arctur. The good news definitely outweighed the bad. It was very interesting to see how they arranged the processors in a literally circular manner around the air-cooling system. Unfortunately, we were not allowed to take pictures.

In the afternoon we were introduced to the other students and employees who work in the same laboratory. The working atmosphere is pleasant and we had lots of discussions (not only) about HPC, which is nice. The job is very demanding, and therefore they offer coffee for free. Since the canteen of the Faculty of Mechanical Engineering is (for certain reasons) closed during the summer, we usually go to other canteens or student restaurants for lunch.

At the dragon bridge

Ljubljana is a very modern city which offers great hospitality and quality of life. You can find parks everywhere, and even the city centre is very green. In the first week, Paras and I registered for the public bike-sharing system, and ever since, we have gone to the university by bike every morning. On the weekend, my girlfriend Alexandra visited me, and so the three of us went to Ljubljana’s castle, which is a key landmark of the town. Inside the castle there’s a museum exhibition on Slovenian history.

Paras, me and Alexandra at the castle

Now, I’d like to say a few words about the project itself. My mentor Prof. Povh received a request from the area of biomedicine, where researchers try to investigate the relations between different biological objects containing various kinds of information. One can view these objects as networks. To mine (“extract”) all similarities between these networks, we use the so-called matrix tri-factorization, which can basically be reformulated as an optimization problem:

\min_{G,\, S_i} \sum_i \Vert R_i - G S_i G^T \Vert^2 \quad \text{s.t.} \quad S_i \in \mathbf{R}^{k \times k},\ S_i = S_i^T

A typical approach to solving this optimization problem is to derive the Karush-Kuhn-Tucker conditions and apply iterative methods to the first-order condition. Initially, we chose the fixed-point method and the projected gradient method as such iterative methods. During the first week, I implemented the C++ code, and in the second week I started using the supercomputer in order to improve the code further and to use the Intel MKL library.

Currently, I am testing the code on real data. To show you some results, the following figure displays the convergence behaviour. We use the RSE as a quality measure of the factorization and observe it throughout the iteration process. It is defined as:

RSE = \frac{\sum_i \Vert R_i - G S_i G^T \Vert^2}{\sum_i \Vert R_i \Vert^2}

Keep in mind that the RSE should decrease and, at the end of the iteration process, reach a small value, preferably as close to zero as possible. But this is only possible if we choose a large enough inner dimension k < n, where n is the size of the square matrices R_i.

So, let’s take some random sparse matrices R_i \in \mathbf{R}^{10 \times 10} which we want to decompose, and suppose k = 7. Then, for randomly chosen starting matrices G and S_i, we achieve the following result after 1000 iterations.

Trend of RSE for 1000 iterations


In my next blog post, I will go into more detail about the results. I will also compare the fixed-point method and the projected gradient method on real data.


