There's no hard limit on the number of files you can download or put in the download queue. However, there is a soft limit imposed by common sense.
And by available memory, as you pointed out. There are definite issues with having a lot of files set active (more than about 600 causes problems on this system), which seems to come down to a mixture of available memory and the number of search requests being sent out.
OTOH, there are major issues with making a lot of files available, including very long hangs when issuing "reload shared", but that's a topic for another thread. (The size of known.met files is a particular pain point. Why are they five times larger than XML-style mldonkey files?)
FWIW, if you're targeting 1960s-era TV series with upwards of 35 episodes per season, it's relatively easy to exceed 1000 items in the queue. You're right that they're often all available from the same source, but there are edge cases where they aren't, or where sources only show up once every 6-9 months (or worse).
(This ancient box has 16GB of RAM, of which 12GB is eaten by ZFS ARC management.)
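As an aside, if the ARC is crowding out the client, its footprint can be capped. A minimal sketch, assuming OpenZFS on Linux; the 4 GiB value is purely illustrative, pick whatever leaves enough headroom for your workload:

```shell
# Cap the ZFS ARC at 4 GiB at runtime (value in bytes; illustrative only).
echo 4294967296 | sudo tee /sys/module/zfs/parameters/zfs_arc_max

# Make the cap persistent across reboots via a modprobe option.
echo "options zfs zfs_arc_max=4294967296" | sudo tee /etc/modprobe.d/zfs.conf
```

The ARC normally shrinks under memory pressure on its own, but capping it explicitly can help when a memory-hungry client and the ARC fight over the same RAM.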