In this post, I take a trip down memory lane by looking back at my professional career so far.
Pre-2005 (Before University)
Even though I consider university to be the start of my career, I have decided to include the pre-university era as well. During that time, my main focus was to do well at school: primary school and then Mendel Grammar School. Not that I had no interaction with computers; quite the contrary. However, I would not say that I was the typical “I started dabbling with programming when I was 10” kind of kid. Indeed, I mainly played computer games, which were my biggest hobby at that time. If you understand Czech, you can read more about it here. Apart from gaming, my biggest programming-related experience was with Pascal and web development. Let me briefly describe both.
Pascal was my first programming language; I was exposed to it because it was the language our IT teacher selected for us to learn. Even though I was in an IT-focused class, I would say that the majority of students were not very keen on learning Pascal, so the classes and tasks were rather simple and focused just on the basics. To give you an idea, the pinnacle was a program for computing various mathematical formulas based on the provided input parameters. At that time, I was not that interested in learning more advanced topics, as I had other activities that I preferred. I also enjoyed web development more, as it was more practical and tangible than Pascal programs with a command-line interface.
When speaking about web development, you need to keep in mind that it was the early 2000s. Hence, what I did was mostly writing HTML sprinkled with some CSS and JavaScript; even jQuery did not exist at that time! Anyway, what I enjoyed about it was that I could immediately see the results, and those results were of practical use. I maintained one of the sites at the grammar school, and I also created the first and second websites for my father’s school, where he was the principal. I also created the website for my very first gaming clan. To get an idea of what those looked like, you can take a look here.
At the end of my studies at the grammar school, I learned the basics of PHP and MySQL, which were part of the most popular tech stack for developing web services at that time: LAMP (only that back then, I was like “what the heck is Linux?”). I was able to use a pre-configured Apache with PHP and MySQL to create simple websites that provided dynamically generated content, which I found quite amazing back then: instead of copying and pasting HTML code into multiple pages, I could generate it in one place via a function!
To conclude, before I enrolled at the university, I had some very basic experience with Pascal, and also some basic experience with HTML, CSS, JavaScript, PHP, and MySQL. I had a lot of experience with gaming, but I guess that career-wise, this was not that helpful :-).
2005/09 — 2010/08 (University: B.S. + M.S.)
The beginning of my software-engineering career. Basically, the very first year at the Faculty of Information Technology, Brno University of Technology (BUT FIT) sparked my joy in programming, even to the point that I completely stopped playing video games; even my beloved UT99 and UT2004, which I played at that time under the nick Zmaster, short for “Zemek master”. Looking back, the most influential lecturer was David Martinek, who taught Introduction to Programming in C. His use of Linux made me switch from Microsoft Windows to Linux, which I consider to be one of the most important choices I have made. Anyway, during the first year, I figured out that programming was something I might consider as a career.
During the early years, I helped create Fituška, a private forum for over 2,400 students at BUT FIT. Throughout the rest of my studies, I administered, moderated, and developed the forum, which was based on a customized version of phpBB (an open-source, PHP-based bulletin-board software). I was not only in charge of the forum but was also its biggest contributor. The forum contained many handcrafted, student-tailored features and improvements over vanilla phpBB. At the time of my studies, it was arguably the most widely used communication channel among students. I really enjoyed this experience, and it improved my knowledge of PHP, MySQL, HTML, and CSS.
To improve my skills with C++ (taught in the second year), in 2007, I joined an open-source project called Calitko, an extensible peer-to-peer software framework for resource discovery and acquisition, and worked on it for several months. It was an amazing experience, during which I gained a lot of practice in developing real-world software under the guidance of a skilled programmer (Peter Dimov). Apart from C++, I also sharpened my Python skills, a language that I had also learned during the early years of my studies at BUT FIT.
My bachelor’s and master’s theses were from the area of theoretical computer science, most notably focusing on formal language theory. In a nutshell, formal language theory is a branch of mathematics that rigorously formalizes languages, such as natural or programming languages, and the devices that define them. Both theses (Canonical Derivations in Programmed Grammars and On Erasing Rules in Regulated Grammars) were written under the supervision of prof. Alexander Meduna, whose lectures in the Formal Languages and Compilers class brought me into the magical world of theoretical computer science and awakened my interest in research. While working on the bachelor’s thesis, I learned LaTeX, which allowed me to typeset documents in a professional way. I really enjoyed this mathematically oriented part of computer science, and after obtaining the master’s degree, I decided to continue in the Ph.D. program.
Overall, I am really glad that I studied at BUT FIT as it gave me an important computer-science background and taught me the core topics underpinning modern computing. Over the course of five years, I completed a variety of school projects, and programmed in many languages, including C, C++, Python, PHP, Java, Haskell, Prolog, and shell scripting.
2010/09 — 2014/08 (University: Ph.D., AVG)
At the end of 2010, I not only started my Ph.D. studies at BUT FIT, but also started working as a software engineer at AVG. Let me first describe the studies, and then move to software engineering at AVG.
During my studies, apart from attending classes, I worked on my dissertation thesis, was a teaching assistant in several courses (such as Introduction to Programming, Formal Languages and Compilers, Principles of Programming Languages, and Graph Algorithms), supervised student theses, participated in multiple research grants, published research papers, competed in student competitions, attended conferences, gave talks, and generally became an integral part of BUT FIT. I highly value the time that I spent there as it allowed me to gain insight and experience in research, lecturing, and the overall functioning of a computer-science faculty. The topic of my dissertation thesis was one-sided random context grammars; I successfully defended the thesis in September 2014. My results were recognized with both a dean’s award and a rector’s award for an excellent dissertation thesis and Ph.D. study results.
At the very beginning of 2011, I was approached with an offer to join AVG as a software engineer working on a joint BUT FIT + AVG grant focused on the research and development of a retargetable decompiler used for platform-independent malware analysis of executable files. Essentially, our goal was to create a tool that would be able to take a Windows PE (.exe), ELF, or Mach-O file and reverse-engineer it into functionally equivalent source code in the C language. Apart from different file formats, the decompiler was supposed to support various architectures, such as x86 or ARM. I accepted the offer as decompilation (also known as reverse compilation) closely corresponded to my research area at that time, and I also wanted to gain more experience in programming, which I highly enjoyed. While working on the decompiler, my colleagues and I published research papers, supervised various student works, and, most importantly, gained a lot of experience with real-world software development. Indeed, we learned how to use an issue-tracking system and wiki (Redmine), Git, build automation (CMake), continuous integration, writing complex test suites (including performance tests), etc. In terms of programming languages, we mainly used C++ (the decompiler itself), Python (scripts and test suites), and Bash (auxiliary scripts). The decompiler itself was built on top of LLVM, which is what the Clang compiler uses under the hood.
To conclude this era: during my studies, I authored or co-authored 2 books (one of them published by Springer in New York), 2 book chapters, 16 international journal papers, 10 international conference papers, and various other material (see my publications page). I highly appreciate all the time that I spent at BUT FIT because without it, it would have been harder for me to decide whether to continue working as a researcher and lecturer or to move into software engineering. After much consideration, following the successful defense of my Ph.D. thesis, I decided to leave academia and focus on software engineering by continuing to work at AVG.
2014/09 — 2016/09 (AVG)
Even though I left BUT FIT after finishing my Ph.D. and continued working at AVG, I have kept in touch with the university by supervising student theses and giving invited talks, especially in the Principles of Programming Languages and Practical Aspects of Software Design classes.
At AVG, I continued working on the retargetable decompiler called RetDec, which was open-sourced later, in 2017. Back then, I was part of a small development team of around 4-5 members. My primary focus was on the back-end part of the decompiler, whose task was to convert the intermediate code (LLVM IR) into high-level C code. In 2015, I was promoted to Senior Developer.
Apart from developing the decompiler (C++, Python, Git, Bash, CMake), I also took on two new responsibilities. The first was to manage several bare-metal servers that we bought to run our internal infrastructure (issue-tracking system, integration tests, development servers, etc.). Having a lot of experience with Linux as a user, I selected Debian as the operating system. I learned how to run Linux on servers, how to set up the services that we needed (e.g. Apache, PHP, and MySQL), and how to monitor them (Icinga).
The second new responsibility was to create a public, online decompilation service that would allow anyone to use our retargetable decompiler RetDec to decompile their files (recall that this was around 2014; RetDec was open-sourced in 2017). I hosted the decompilation service on AWS, and it featured both a web interface (a few screenshots are here) and a RESTful API. Technology-wise, I ran virtual machines via EC2 to host both the front-end and back-end parts, used ELB for load balancing, Route 53 for DNS management, S3 for file storage, a self-managed RabbitMQ cluster as a message broker (SQS/SNS were unusable for us back then), LAMP (Linux + Apache + PHP + MySQL) for the front-end part, and Python + Celery for the back-end part that actually performed the decompilations. In my free time, I also created and published open-source API wrappers in various languages, namely Python, C++, Rust, and shell scripts, which allowed people to build applications on top of our decompilation service. I also created a RetDec disassembly syntax-highlighting plugin for Vim. Anyway, the service was shut down when RetDec was open-sourced.
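To give a rough idea of how a Celery-based decompilation back-end like this can be wired together, here is a minimal, hypothetical sketch (this is not the actual service code; the broker URL, task name, timeout, and decompiler command are all illustrative assumptions):

```python
# A minimal sketch of a Celery worker that performs decompilations.
# All names below (broker URL, task name, CLI invocation) are illustrative,
# not the actual RetDec service internals.
import subprocess

from celery import Celery

app = Celery("decompilation", broker="amqp://rabbitmq-host//")


@app.task(bind=True, max_retries=3)
def decompile_file(self, input_path: str, output_path: str) -> str:
    """Run the decompiler on the given file and return the path to the C output."""
    try:
        subprocess.run(
            ["retdec-decompiler", input_path, "-o", output_path],
            check=True,
            timeout=600,  # guard against inputs that take too long to decompile
        )
    except subprocess.SubprocessError as exc:
        # Retry transient failures; Celery gives up after max_retries.
        raise self.retry(exc=exc, countdown=30)
    return output_path
```

The front-end would then enqueue work with something like `decompile_file.delay(input_path, output_path)` and report the result back to the user once a worker finishes.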
In the fall of 2016, AVG was acquired by Avast, and I continued working there.
2016/10 — 2022/09 (Avast)
At Avast, my primary focus shifted from the development of RetDec to building services and tools used for (cyber) threat intelligence. I worked as an individual contributor, team lead, engineering manager, and technical lead. Next, let me describe some of the things that I worked on.
Initially, as a Senior Software Engineer II, I created an internal system for clustering files that were coming to our back-ends by their similarity, so that analysts could analyze whole groups of files instead of analyzing each of them separately. At the very beginning, the clustering was done daily via a batch job. However, this daily batch processing turned out to be insufficient, so we switched to near-real-time clustering, in which files got clustered as soon as they reached our back-ends. The system was able to cluster 1-2 million files per day, only a few seconds or minutes after receiving them. After that, we discovered that it would be beneficial to classify the clusters as well (“this is a cluster of malicious files stealing banking credentials”), even to the degree of being able to determine the malware family/strain or its variants. Apart from this system, I also worked on other related threat-intelligence services and tools.
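To illustrate the general idea behind near-real-time (incremental) clustering, here is a simplified sketch, not the actual system: each incoming file is compared with representatives of existing clusters and either joins the most similar cluster or starts a new one. The feature extraction, similarity metric, and threshold below are placeholders, as the real system used its own notion of file similarity:

```python
# A simplified, illustrative sketch of incremental (near-real-time) clustering:
# each incoming file either joins the most similar existing cluster or starts
# a new one. The similarity metric and threshold are placeholders.
from dataclasses import dataclass, field

SIMILARITY_THRESHOLD = 0.8  # placeholder value


@dataclass
class Cluster:
    representative: set[str]  # features of the cluster's first file
    members: list[str] = field(default_factory=list)


def similarity(a: set[str], b: set[str]) -> float:
    """Jaccard similarity of two feature sets (a stand-in for the real metric)."""
    return len(a & b) / len(a | b) if a or b else 0.0


def assign_to_cluster(file_id: str, features: set[str], clusters: list[Cluster]) -> Cluster:
    """Put the file into the most similar cluster, or create a new one for it."""
    best = max(clusters, key=lambda c: similarity(features, c.representative), default=None)
    if best is not None and similarity(features, best.representative) >= SIMILARITY_THRESHOLD:
        best.members.append(file_id)
        return best
    new_cluster = Cluster(representative=features, members=[file_id])
    clusters.append(new_cluster)
    return new_cluster
```

In practice, a linear scan over all clusters would not scale to 1-2 million files per day, so a real system needs some form of indexing (e.g., bucketing files by hashes of their features) to narrow down the candidate clusters before comparing.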
The primary technologies that we used were Linux, Python, C++, Git, MongoDB, PostgreSQL, Redis, and RabbitMQ. Initially, our systems ran on bare-metal servers, but we eventually moved to running them Dockerized in an internal Kubernetes cluster. In terms of development, we employed a DevOps approach (we built, ran, and monitored the systems by ourselves), used TeamCity as our continuous integration and deployment platform, and utilized Grafana and Kibana for visualization and monitoring.
At the end of 2017, I was promoted to Lead Software Engineer and became responsible for leading the development of the services and tools used for threat intelligence. Even though my job title remained the same throughout the rest of my time at Avast, I worked in a variety of roles, ranging from being the main developer, through being the team lead of a small team of 2-4 people, up to being an engineering manager with seven reports. After being a manager for a year or so, I realized that being a people manager was not something that energized me, and I worked with my manager to transition to a tech-lead role, which was a much better fit for me. Looking back, I am grateful for having had the opportunity to be a people manager as it allowed me to gain experience from the managerial viewpoint and see that a technical IC role suits me more. If you would like to know more about choosing a software-engineering career path, check out my viewpoint on this topic.
Apart from what I have written above, let me also mention some other areas that I worked in. I continued maintaining our infrastructure (up to a whopping 50 Linux servers!), including the management of a large MongoDB cluster containing 10 TB of data and development PostgreSQL/MongoDB/Redis/RabbitMQ servers. Another responsibility of mine was systems design, in which I helped other people and teams with the architecture of their services. When managing a team, I also became responsible for threat-related data management, in which we managed threat-related telemetry from our client base, prepared aggregations for us and other teams, provided threat reports for PR/sales/marketing, etc. For that, we utilized an internal, company-managed Hadoop cluster, did analytics via Hive, and used Azkaban to manage our Hadoop jobs. Last, but certainly not least, I led an on-call team for two years, including participating in weekly on-call rotations (we used VictorOps/Splunk).
At the end of 2022, Avast merged with NortonLifeLock and formed Gen™. I have been employed there since October 2022.
2022/10 — Now (Gen™)
At Gen™, my initial job title was Senior Principal Software Engineer, which was a staff+ individual-contributor position at the same level as an Engineering Manager. During the early months of the Avast/NortonLifeLock integration, my role and responsibilities were similar to the ones I had at Avast.
In the middle of 2023, I was promoted to Lead Software Engineer, which, at Gen™, is a staff+ individual-contributor position at the level of a Senior Engineering Manager. Archetype-wise, I am a mix of a technical lead and an architect in the threat-intelligence systems (TIS) part (26 people, three teams) of the Threat Labs department in the Technology and Innovation (CTO) organization, reporting to a Director of Malware Research. In the TIS part, we not only develop systems that are used for threat-intelligence purposes but also do threat-related data management and threat research. Profession-wise, we are a mix of software engineers, data analysts, and researchers.
I have been part of a core cloudification group leading the migration of the CTO organization into Google Cloud (GCP). My role is to help people and teams migrate their workloads into GCP, advise them on which technologies to use, help prepare the CTO landing zone, write documentation, create Terraform templates and modules, etc. As part of this initiative, I am also responsible for leading the migration of TIS services from an on-premises Kubernetes cluster or bare-metal infrastructure into GCP.
Alright, at the time of writing this, it is the end of December 2023, and this section concludes my career so far. Let’s see what 2024 has in store for me. Thank you for reading ;-).