Best way to run crawlers?

Is it better to run dedicated crawlers for each individual site you wish to index? Or should one crawler be loaded up with a list of several sites (starting points)? Or somewhere in between?

I currently have everything on one server; I think that's the only option.

I believe the question was about the crawl start option of starting with several URLs?
If so: several URLs in the crawl start are treated like several independent crawl starts. They just get the same internal identifier, which is used to fetch the crawl options.
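In other words, one crawl start with several URLs behaves like several single-URL starts that happen to share one set of crawl options. A minimal sketch of that idea (the names `start_crawl`, `profile_id`, and the option keys are all hypothetical, not the actual internals):

```python
import uuid

def start_crawl(start_urls, options):
    # One internal profile id stores the shared crawl options
    # (depth, filters, etc.) for this crawl start.
    profile_id = uuid.uuid4().hex
    profiles = {profile_id: options}
    # Each start URL becomes its own independent crawl job;
    # the jobs only share the profile id used to look up options.
    jobs = [{"url": u, "profile": profile_id} for u in start_urls]
    return profiles, jobs

profiles, jobs = start_crawl(
    ["https://example.org", "https://example.com"],
    {"depth": 3},
)
# Both jobs reference the same profile, but each crawls independently.
assert jobs[0]["profile"] == jobs[1]["profile"]
assert profiles[jobs[0]["profile"]] == {"depth": 3}
```

So functionally there is little difference between one start with a URL list and several separate starts, as long as you want the same options for every site.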