Is it considered normal to install Jenkins on the same server where the application is hosted?

I am looking for a best practice, a solution, or just a recommendation on how Jenkins automation is usually organized for small and medium-sized projects.

I have a small project that consists of two applications (a client and a server). I rent a server where I host both of them. I started looking into how I could automate deploying changes to that server, which led me to reading about CI/CD and the tools used to facilitate it. I have no tests to run. Mostly, I want to get rid of the following manual steps whenever I change one of the applications:

  1. connect to the server through ssh;
  2. cd into the application’s folder;
  3. stop the running process;
  4. git fetch && git merge;
  5. rebuild the application;
  6. run a new process.
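Concretely, I imagine collecting steps 2–6 into one script that a Jenkins job on the same server could run. In the sketch below, APP_DIR, the process name myapp-server, and ./build.sh are made-up placeholders, not details of my actual setup; with DRY_RUN=1 (the default here) each command is only printed, not executed:

```shell
#!/usr/bin/env bash
# Rough sketch of manual steps 2-6 as one script. APP_DIR, the process
# name "myapp-server", and ./build.sh are placeholders. With DRY_RUN=1
# (the default) each command is printed instead of executed.
set -u

APP_DIR="${APP_DIR:-/opt/myapp}"   # placeholder install path
DRY_RUN="${DRY_RUN:-1}"            # set DRY_RUN=0 to actually execute

run() {
  if [ "$DRY_RUN" = "1" ]; then
    echo "+ $*"
  else
    "$@"
  fi
}

deploy() {
  run cd "$APP_DIR"                      # step 2
  run pkill -f myapp-server              # step 3: stop the running process
  run git fetch                          # step 4
  run git merge --ff-only origin/main
  run ./build.sh                         # step 5: placeholder build command
  run nohup ./myapp-server               # step 6 (in reality backgrounded
                                         # or handled by a service manager)
}

deploy
```

A freestyle job could then call this with DRY_RUN=0 in an “Execute shell” build step.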

The question came up after I noticed that most of the articles I read about CI/CD and Jenkins demonstrated automation using Jenkins Pipelines, and most of those examples had a pipeline with at least three stages: build, test, and deploy. The deploy stage either built a Docker image and pushed it to a registry, or executed a bash script with an abstract name such as publish. The word deploy as a stage name, and especially the examples that publish a Docker image to a registry, make me think there are supposed to be at least two servers: one hosts Jenkins and its pipelines, and the other hosts the applications.
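As far as I can tell, in those two-server setups the deploy stage essentially reduces to Jenkins running the update on the application server over ssh. Something like the sketch below, where the host, user, and remote path are all made-up placeholders (the command is only echoed here, since there is no second server to connect to):

```shell
#!/usr/bin/env bash
# What a two-server "deploy" stage seems to boil down to: Jenkins sshes
# into the application server and runs the update there. DEPLOY_HOST and
# the remote path are placeholders.
set -u

DEPLOY_HOST="${DEPLOY_HOST:-deploy@app.example.com}"           # placeholder
REMOTE_CMD='cd /opt/myapp && git pull --ff-only && ./build.sh' # placeholder

CMD="ssh $DEPLOY_HOST \"$REMOTE_CMD\""

# A real deploy stage would execute $CMD; here it is only printed.
echo "$CMD"
```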

In my case, I am considering running a simple bash script in a pipeline or a freestyle job. Either way, I want Jenkins to run it on the same server as my applications. Is that considered common practice?

Source: Docker Questions