Linux Applications Performance: Part I: Iterative Servers

This chapter is part of a series of articles on Linux application performance.

The iterative network server is one of the earliest programs you might have written if you ever took a course or read a book on network programming. This type of server is not very useful outside of learning network programming, or in the rare cases where traffic and concurrency are expected to be low. An iterative server, as its name suggests, serves one client after another, usually running in a single process. While a client is being served, any other connection requests that arrive are queued by the operating system; they wait until the server is done with the current client, at which point it picks up the next one. An iterative server thus exhibits no concurrency: it serves exactly one client at a time.
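The accept-serve-close cycle described above can be sketched in C as a minimal iterative TCP echo server. This is an illustrative sketch, not code from the series: the names `make_listener`, `handle_client`, and `serve_iteratively` are hypothetical, and an echo service is assumed purely for demonstration.

```c
#include <arpa/inet.h>
#include <netinet/in.h>
#include <stdint.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

/* Create a listening TCP socket on the loopback interface.
 * Port 0 asks the kernel for an ephemeral port. Returns -1 on error. */
int make_listener(uint16_t port) {
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    if (fd < 0)
        return -1;
    int one = 1;
    setsockopt(fd, SOL_SOCKET, SO_REUSEADDR, &one, sizeof one);
    struct sockaddr_in addr = {0};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_LOOPBACK);
    addr.sin_port = htons(port);
    if (bind(fd, (struct sockaddr *)&addr, sizeof addr) < 0 ||
        listen(fd, 16) < 0) {
        close(fd);
        return -1;
    }
    return fd;
}

/* Echo bytes back to one client until it closes its end. */
void handle_client(int connfd) {
    char buf[4096];
    ssize_t n;
    while ((n = read(connfd, buf, sizeof buf)) > 0)
        write(connfd, buf, (size_t)n);
}

/* The iterative loop: accept one client, serve it fully, then repeat.
 * Any other clients that connect meanwhile sit in the kernel's
 * listen queue until this one is finished. */
void serve_iteratively(int listenfd) {
    for (;;) {
        int connfd = accept(listenfd, NULL, NULL);
        if (connfd < 0)
            continue;
        handle_client(connfd);
        close(connfd);
    }
}
```

Note that the single `accept()` call is the whole concurrency story here: the `listen()` backlog is the only thing keeping waiting clients from being refused outright.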

Continue reading “Linux Applications Performance: Part I: Iterative Servers”

Linux Applications Performance: Introduction

Articles in this series

  1. Part I: Iterative Servers
  2. Part II: Forking Servers
  3. Part III: Pre-forking Servers
  4. Part IV: Threaded Servers
  5. Part V: Pre-threaded Servers
  6. Part VI: poll-based Server
  7. Part VII: epoll-based Server

On HackerNews

There are several interesting takeaways from the HackerNews thread for this article series. Do check it out.

Web apps are a staple of both consumers and enterprises. Among the many protocols used to move bits and make sense of them, HTTP has overwhelming mind share. As you learn the nuances of web application development, you might pay very little attention to the operating system that ultimately runs your applications; the separation of Dev and Ops only made this worse. But with DevOps culture becoming commonplace and developers becoming responsible for running their apps in the cloud, a solid understanding of backend operating-system internals is a clear advantage. You don’t really have to bother with Linux and how your backend will scale if you are deploying it for personal use or for a handful of concurrent users. If you are trying to serve thousands or tens of thousands of concurrent users, however, a good understanding of how the operating system figures into your whole stack will be incredibly useful.

Continue reading “Linux Applications Performance: Introduction”