Hoi! It’s me, Yağmur Akarken

A photo of me in front of the famous canals in Amsterdam

Hoi from Amsterdam, I’m Yağmur. I’m 22 years old and a fourth-year computer engineering and mathematics student at Koç University in Turkey.

Before I get into the details of the project I will be working on this summer, I would like to share a few things about myself:

I am passionate about learning, and I don’t mean just computer science: gaining new experiences in all areas of my life is what interests me most. That’s why I always try to take up different hobbies and work in different fields related to computer engineering. Before PRACE SoHPC, I worked mostly on deep learning, and a few of my projects are as follows:

  • With the TDD project, we collect and document many resources from the Turkish NLP field, from datasets to natural language processing tools, and make them available as open source to everyone working in this field.
  • Another project I’m working on is Stylegan4Video, where we work on long-term video generation using StyleGAN models.

After diving so deeply into deep learning, I realized that much of the progress in the models we use actually comes from improvements in hardware and from optimizations in the code. That’s why, when my operating systems professor sent an e-mail about SoHPC, I applied to the program right away.

The Application Process

When making my application, I applied to a few projects that interested me and whose prerequisites I met. Maybe I wasn’t too excited at the time, but I was definitely excited while waiting for the results to be announced.

I remember that moment exactly: I woke up around 10 a.m. on May 11 and started my day by checking my emails (as a typical computer engineer :)). And there it was, the invitation e-mail for the program: I had been selected for Project 2217. The project aims to analyze operational logs from the Lisa cluster and the Snellius supercomputer with machine learning methods. I was very excited that the project would be face to face and that I would come to Amsterdam for it. Now we have left the first week behind.

First week of SoHPC

Four days of the first week were reserved for online training. During these four days, we got a brief introduction to parallel programming and learned about some of the software packages used, like OpenMP and MPI. On Thursday I flew to Amsterdam and settled in, so on Friday I had the opportunity to go to the SURF office and meet my mentors and the people working on the project. My mentors and co-workers are really cool, caring and helpful people; they even helped me find accommodation! On Friday I also had the chance to learn more details about the project, and I realized that I would have to work with a huge amount of data.

In our first meeting, we also did a little planning and decided that it would be good to start by understanding the data, for example with principal component analysis. In this project, the information for a single job or a single node can only be reached by relating data from the various systems to each other. So I will first try to collect the information found for a node or a job across the different systems and combine it into a single table in the database. Then we will concentrate on anomaly detection: decide which information from that table is needed for it and run experiments.
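To give an idea of what this could look like in practice, here is a minimal Python sketch, not the actual SURF pipeline: the column names, the toy data and the choice of an Isolation Forest as a baseline anomaly detector are my own illustrative assumptions.

# Minimal sketch (not the real pipeline): hypothetical column names and toy data.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.ensemble import IsolationForest
from sklearn.preprocessing import StandardScaler

# Hypothetical per-job accounting records (e.g. from the scheduler)
jobs = pd.DataFrame({
    "job_id": [1, 2, 3, 4],
    "node": ["r1n1", "r1n2", "r1n1", "r1n3"],
    "runtime_s": [3600, 120, 7200, 5400],
    "cpu_hours": [64.0, 0.5, 230.0, 96.0],
})

# Hypothetical per-node metrics (e.g. from a monitoring system)
nodes = pd.DataFrame({
    "node": ["r1n1", "r1n2", "r1n3"],
    "avg_load": [0.9, 0.1, 0.7],
    "mem_used_gb": [180.0, 12.0, 96.0],
})

# Relate the two systems through the shared node identifier and
# collect everything into a single table.
table = jobs.merge(nodes, on="node", how="left")

# Standardize the numeric features, then look at the principal components
# to see which directions explain most of the variance in the data.
features = table[["runtime_s", "cpu_hours", "avg_load", "mem_used_gb"]]
X = StandardScaler().fit_transform(features)
pca = PCA(n_components=2)
components = pca.fit_transform(X)
print("explained variance ratio:", pca.explained_variance_ratio_)

# One possible anomaly-detection baseline: an Isolation Forest that flags
# jobs whose feature combination looks unusual (-1 = anomaly, 1 = normal).
flags = IsolationForest(random_state=0).fit_predict(X)
print(table.assign(anomaly=flags))

The key idea is simply that once the job and node records share an identifier, they can be merged into one table, and all the later analysis steps can work from that single table.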

What will happen next?

Even in the first meeting, many things came up that I did not understand and that I need to concentrate on learning. But these are the things that excite me and make me feel like I’m facing a new challenge. By the end of this summer, I hope to learn more about parallel programming, working with large amounts of operational log data, high-performance computing and supercomputers, and of course to enjoy a trip around the Netherlands! See you in my next blog post!
